Supercharge your skills with Microsoft CLX tracks for Azure

The Microsoft Azure Connected Learning Experience (CLX) program is expanding with three new tracks for Azure professionals. Strengthen your Azure networking, Microsoft Sentinel in Azure, and Windows Server migration skills with these personalized, self-paced courses, designed to help you learn on your own time as efficiently and effectively as possible. At the end of each course, you’ll walk away with the knowledge and skills to boost your cloud computing career.

What are the new tracks?

With data volumes growing every year, it’s never been more important to make sure that your Microsoft Azure systems are connected and secure—and that’s why our new CLX tracks are designed to strengthen your Azure Networking, Windows Server migration, and Microsoft Sentinel in Azure skills. From mitigating threats with Sentinel to managing your Windows Server workloads on Azure to uncovering new insights with a connected suite of Azure resources, our new CLX tracks can help you learn the fundamentals of cloud-native security and Azure connectivity so you can protect and connect your mission-critical systems.

AZ-700: Designing and Implementing Microsoft Azure Networking Solutions
Attendees: Azure network engineers.
Course content: Learn how to design and implement a secure network infrastructure in Azure and how to establish hybrid connectivity, routing, private access to Azure services, and monitoring in Azure. In this course, you’ll learn the ins and outs of Azure networking. Practice designing and implementing core networking infrastructure, application delivery services, and private access to Azure services, and design, implement, and manage connectivity services. You’ll also learn to secure network connectivity to Azure resources—and though this course is meant for Azure network engineers, anyone with an interest in Azure networking can enroll.

MS-Sentinel: Mitigate threats using Microsoft Sentinel in Azure (part of SC-200)
Note: This track is not a certification course, but instead is a part of the SC-200 course. Because of this, learners are not eligible to receive a Microsoft certification exam voucher for completing this track.
Attendees: Security Analyst, Threat Intelligence Analyst, Incident Responder, Security Engineer, Security Operations Center (SOC) Manager.
Course content: In this course, you’ll learn to protect your organization against threats with Microsoft Sentinel, a cloud-native Security Information and Event Management (SIEM) and Security Orchestration, Automation, and Response (SOAR) solution. You’ll practice leveraging Sentinel’s capabilities to identify and respond to threats in real time by designing and configuring a Sentinel workspace, managing Sentinel analytics rules and incidents, and much more. At the end of the course, you’ll have gained the skills you need to effectively use Sentinel to mitigate threats and protect your organization’s most mission-critical resources.

Migrate and manage Windows Server workloads on Azure
Attendees: System Administrator, Azure Cloud Administrator, Azure Cloud Consultant.
Course content: Migrating and managing Windows Server workloads on Azure involves transferring existing applications and infrastructure from on-premises to the Azure cloud platform. Azure offers various tools and services, such as Azure Site Recovery, Azure Virtual Machines, Azure Backup, and Azure Monitor, to support this migration and management process. This allows organizations to leverage the benefits of cloud computing while still using their familiar Windows Server workloads.

What is the CLX program?

The CLX program is a comprehensive, step-by-step learning program designed for IT professionals and other aspiring learners looking to master core concepts in cloud services. It combines self-paced interactive labs and skilling content with virtual sessions led by Microsoft tech experts to help learners like you strengthen their understanding of the latest cloud topics in a way that meets your needs and fits your schedule. The program’s unique design ensures you get exactly the skills you need as efficiently as possible, and it includes four steps:

Knowledge Assessment: The program kicks off with a 20-question Knowledge Assessment to gauge your skills. Based on your results, it will then present you with only the course content that’s relevant to your skills and knowledge gaps—making sure you use your time as efficiently as possible in a way that fits your experience.

Interactive labs: You’ll then dive into learning modules and hands-on, interactive online labs that mirror what you’ll experience in the professional world—helping you learn efficiently and effectively. The interactive labs are available on demand and can be used as many times as needed.

Virtual session: After finishing the interactive labs, you can choose to attend a virtual session that dives deeply into the course content. Led by Microsoft-certified trainers (MCTs), these group sessions feature live discussions, insights, and practical guidance, and you can ask follow-up questions to get real-time help alongside peers and pros. To make sure you get the skills you need wherever you’re located, the two-to-four-hour sessions are held regularly in three time zones: Australian Eastern Standard Time (AEST), Greenwich Mean Time (GMT), and Pacific Daylight Time (PDT).

Practice test: At the end of your CLX journey, you’ll take an 80-question practice test that helps you assess your learning and prepares you for your final Microsoft certification exam. This two-hour test is very similar to the final exam, helping you accurately test your understanding and target areas for improvement.

Once you’ve finished the program’s four steps, you’ll walk away with the knowledge and skills to help you excel in the world of Azure. You’ll also receive a 50% discount voucher for the Microsoft Azure Certification exam (this voucher is only available with certain courses), so you can prove your skills and advance your cloud computing career.

How can I sign up?

Visit the Microsoft CloudEvents Portal and watch our Microsoft Azure CLX introductory video to learn more and sign up for the CLX program. You can also read about the many other CLX courses designed to boost your Azure skills in our previous blog, and explore our CLX AI in Azure skilling content.

Develop Microsoft Azure skills with 30 Days to Learn It

In today’s rapidly evolving job market, having a competitive edge is essential. With technology advancing at an unprecedented pace, it’s becoming increasingly important to keep up with the latest trends and tools. The Microsoft Cloud Skills Challenge 30 Days to Learn It program is aimed at helping individuals develop proficiency in the most in-demand skills in the tech industry, such as learning the Microsoft Azure platform. The program offers various learning journeys designed for participants with different levels of experience and technical backgrounds.

The challenge is open to IT professionals and developers of all skill levels and is designed to provide a flexible and accessible way to learn new skills and advance their careers. To participate, individuals simply need to sign up for the challenge on the Microsoft Learn platform and begin completing the available learning modules.

Completing any challenge within 30 days can make individuals eligible for a 50 percent discount on a Microsoft Certification exam. These certifications can boost a participant’s career prospects and earning potential and are widely recognized in the tech industry.

Our results speak for themselves: Upon completion of the program, 77 percent of professionals get a promotion, land a new role, or enter technical programs after becoming certified. Additionally, 74 percent of surveyed professionals got more autonomy in their jobs, and 35 percent of candidates who completed a certification received a salary increase of 30 percent or more.

Microsoft Azure 30 Days to Learn It program

In this blog, we will focus on Microsoft Azure-based 30 Days to Learn It challenges, though a variety of learning paths exist, including Microsoft 365, Dynamics 365, and Microsoft Power Platform. Microsoft Azure is a cloud computing platform that offers a wide range of services, including compute, storage, networking, and AI. It is a popular choice for developers because it is scalable, reliable, and secure. Azure can help developers build and deploy applications faster and more easily, and it can help them save money on infrastructure costs.

Check out our full lineup of challenges to get started on your career advancement today:

Azure Network Engineer: Gain expertise in planning, implementing, and maintaining Azure networking solutions, including hybrid networking, connectivity, routing, security, and private access to Azure services. This learning journey is designed for participants who have experience with networking concepts and connectivity methods.
Azure Database Administrator: Begin learning how to manage the operational aspects of cloud-native and hybrid data platform solutions built with Microsoft SQL Server and Microsoft Azure Data Services. This learning journey is designed to equip participants with a variety of methods and tools to perform day-to-day operations at an introductory level.
Windows Server Hybrid Administrator: Gain expertise in configuring and managing Windows Server on-premises, hybrid, and infrastructure as a service (IaaS) platform workloads. This learning journey is designed for participants who administer core and advanced Windows Server workloads and services using on-premises, hybrid, and cloud technologies.
Azure Virtual Desktop Fundamentals: Gain the ability to plan the architecture for an Azure Virtual Desktop deployment, manage access and security, manage user environments and apps, and monitor and maintain an Azure Virtual Desktop environment.
Azure AI Fundamentals: Get a solid foundation in machine learning and AI concepts including computer vision, natural language processing, and conversational AI. This learning journey is designed for participants with both technical and non-technical backgrounds.
Azure Developer: Participate in all phases of cloud development to create end-to-end solutions. Design, build, test, and maintain cloud applications using compute, storage, management, and security services on Azure.
Azure Data Scientist: Design and implement a data science solution on Azure. Learn how to build machine learning models, no-code predictive models, and machine learning solutions, and run data science workloads in the cloud.
Azure Data Fundamentals: Gain a foundation in core database concepts in cloud environments and data services including relational data, non-relational data, big data, and analytics. This learning journey is designed for participants with both technical and non-technical backgrounds.
Azure Cosmos DB Developer: Start developing modern applications in the cloud with a fully managed, NoSQL database. This learning journey is designed for participants with experience designing, implementing, and monitoring cloud-native applications that store and manage data.
Designing Azure Infrastructure Solutions: Design cloud and hybrid solutions that run on Microsoft Azure, including compute, network, storage, monitoring, and security.
Java on Azure Developer: Start developing modern Java applications in the cloud with managed compute, databases, and DevOps services. This learning journey is designed for participants with familiarity developing and running Java applications and beginner-level experience with cloud infrastructure.
DevOps Engineer: Design and implement DevOps processes and practices. Develop an instrumentation strategy with logging, telemetry, and monitoring. Manage source control with GitHub to foster collaboration and automate build and deployment processes.
Azure Synapse Analytics: Take your data analytics skills to the next level with Azure Synapse Analytics. Learn how to integrate, transform, and consolidate data from various structured and unstructured data systems into structures that are suitable for building analytics solutions.

The Microsoft Cloud Skills Challenge 30 Days to Learn It is a free, self-paced learning program that can help developers learn Microsoft Azure, earn a Microsoft certification, get hands-on experience, and connect with other developers. The program covers a wide range of Azure topics, including Azure App Service, Azure Data Factory, and Azure Machine Learning. It also includes hands-on labs and a community forum, which can help developers learn the skills they need to build and deploy cloud-based applications, gain confidence, and build relationships.

Learn more at our 30 Days to Learn It website.

How Microsoft Cloud is embracing FinOps practitioners

In February 2023, I announced that Microsoft joined the FinOps Foundation as a premier member and outlined five areas we planned to explore and invest in:

Defining specifications and evolving best practices.

Aligning our collective guidance.

Improving our products and services.

Advancing training and certification programs.

Engaging with the community.

We’ve been busy over the past four months. With FinOps top of mind as several of us land in San Diego for the second annual FinOps X conference kicking off later today, I wanted to share some of that progress with you.

Helping FinOps practitioners learn and grow

While we’ve been dedicated to helping people manage and optimize their costs for years, this month is especially pivotal for FinOps practitioners managing costs in Microsoft Cloud.

First up is a new FinOps solutions page to help centralize the various resources available today and in the future. Whether you’re new to FinOps or you have years of experience, this page will connect you to Microsoft solutions that can empower your organization to efficiently adopt FinOps best practices.

One of the resources you’ll find on the FinOps solutions page is a new FinOps with Azure e-book. This 20-page e-book is a great resource for those new to FinOps in Azure. The e-book will guide you through the FinOps principles and highlight Microsoft solutions that can support your cloud journey and help your organization maximize cloud business value.

When you’re ready to dig into the next level of detail and start your FinOps journey, check out the new FinOps documentation. If you’re new, you’ll get an introduction to FinOps and the FinOps Framework with a guide for how to approach each iteration through the FinOps lifecycle. And whether you’re new or experienced, you’ll also find guides that help you understand and implement each capability in Azure.

You’ll also find newly updated Microsoft Cost Management documentation, which has been reorganized to help you navigate and find related guidance quicker and easier than ever. Whether you’re looking for reporting and analytics, monitoring, optimization, cost allocation, or automation and extensibility, everything you need to implement your FinOps capabilities in Microsoft Cost Management has been restructured for ease of use.

These same changes are also coming to the Azure Portal, where you’ll find a new Cost Management menu that organizes capabilities into these same groups, again making it easier than ever to navigate to the capabilities you need to drive your FinOps practice.

Enable the new streamlined menu yourself from Try preview in Cost Management, and let us know what you’d like to see next.

Helping FinOps practitioners get answers faster

Everyone loves new guidance that teaches you how to solve complex problems or streamline how you work, but why stop there? Just like documentation can guide your learning experience, our vision of Cost Management is to guide you through your data, helping you discover deeper insights into costs and deliver answers quicker than ever before. This starts with significant performance improvements within the Cost analysis experience. Many have already noticed considerably faster load times over the last few months. Expect to see even more improvements in the coming months.

Beyond performance, you may have also noticed some major improvements to the new Cost analysis experience. Everyone’s familiar with the customizable charts in Cost analysis, but many are still new to smart views in Cost analysis. Smart views add intelligent insights about your costs like anomaly detection, more flexible download options, and the ability to multi-task and explore multiple perspectives of your cost at the same time. You can take a quick peek into cost details by expanding any row—and now, for the first time, you can also drill down into cost details with one click.

Drill down in smart views improves the classic filtering in customizable views by allowing you to drill into the data you’re looking at without needing to manually look up values in a filter on the side of the page, which interrupts your flow. Simply click the item you’re interested in, and you’ll switch to a view that gives you the next level of detail. Or perhaps you’re interested in related costs. Select the context menu (three dots) next to the name and choose from the related attributes to find resources of the same type or in the same resource group, for instance. And if you need to manage the resource, simply select Go to > Resource from the menu. After you’ve drilled into your costs, you can select the Customize command at the top to remove filters or simply select the Back command on that view to undo the last change. Enable drill down yourself from Try preview in Cost Management, and let us know what you’d like to see next.

Looking forward, smart views will also provide quick access to your Cost Management AI assistant. The assistant will help you perform quick analyses, provide insights, and offer recommendations to better understand, analyze, manage, and forecast your cloud costs.

But whether you use smart views or customizable views, the new Cost analysis experience remembers what you used and offers quick access to views you used recently or pinned to the top of the list. Soon, you’ll also see support for customizable views directly within the tabbed experience, making it easier to switch between your customizable, saved, and smart views.

Helping FinOps practitioners scale their efforts

Many practitioners can meet their basic needs with the native Cost Management experiences in the Azure Portal, but there are times when you need more. Maybe you’re looking to automate onboarding and setup. Or perhaps you need more advanced reporting merged with business data. Or maybe you have a more complex cost allocation or chargeback strategy that requires integrating with other systems. Whatever your scenario is, the FinOps toolkit open-source project seeks to offer solutions that help you automate and extend native cloud capabilities to meet the common needs of the FinOps community. The toolkit is an exploratory community-driven project with contributors across the globe and includes learnings from Microsoft architects and engineers in the field to help you get off the ground quicker with:

Starter kits that help you get started with cost management and optimization.

Automation scripts to streamline cost configuration and management at scale.

Advanced solutions to facilitate building custom solutions.

As an example, many people who discover Cost Management scheduled alerts or anomaly detection are immediately interested in automating setup across all subscriptions. To streamline automation, we contributed code to the FinOps toolkit and published the Bicep modules for scheduled actions in the official Bicep Registry. Whether you’re creating a scheduled alert for a resource group or a subscription cost anomaly alert, adding a reference to the Bicep Registry module is quick and easy.
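
If you prefer to script the same setup directly, here is a rough, non-authoritative Python sketch that creates a scheduled action through the Cost Management REST API instead of the Bicep module; the scope, rule name, API version, view name, and request-body fields are illustrative assumptions to verify against current Azure documentation before use.

import requests
from azure.identity import DefaultAzureCredential

# Illustrative values only -- replace with your own scope, rule name, and view.
scope = "/subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/my-rg"
rule_name = "daily-rg-cost-alert"
api_version = "2022-10-01"  # assumption: confirm the current Cost Management API version

# Acquire an Azure Resource Manager token with the ambient credential.
token = DefaultAzureCredential().get_token("https://management.azure.com/.default").token

url = (
    f"https://management.azure.com{scope}/providers/Microsoft.CostManagement"
    f"/scheduledActions/{rule_name}?api-version={api_version}"
)

# Sketch of a daily email alert; field names follow the documented
# scheduledActions resource but should be verified before use.
body = {
    "kind": "Email",
    "properties": {
        "displayName": "Daily resource group cost",
        "status": "Enabled",
        "viewId": f"{scope}/providers/Microsoft.CostManagement/views/my-saved-view",
        "notification": {"to": ["finops@contoso.com"], "subject": "Daily cost report"},
        "schedule": {
            "frequency": "Daily",
            "startDate": "2023-07-01T00:00:00Z",
            "endDate": "2024-07-01T00:00:00Z",
        },
    },
}

response = requests.put(url, json=body, headers={"Authorization": f"Bearer {token}"})
response.raise_for_status()
print("Scheduled action created:", response.json()["name"])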

Looking beyond automation, another example is when organizations want to build custom reporting or optimization solutions. Custom solutions always start with data ingestion, but most organizations don’t realize the complexities introduced by nuances within different billing systems or by managing data at scale. The FinOps toolkit includes a data pipeline code sample that ingests cost data into a central data store and includes pre-built Power BI reports to get you started quickly. The solution was designed and implemented by Brett Wilson (Principal Cloud Solution Architect) and Anthony Romano (Senior Consultant) to enable faster, more reliable reporting against multiple billing accounts, subscriptions, and resource groups, and it normalizes cost data to the FinOps Open Cost and Usage Specification (FOCUS) schema.
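
To make the normalization step concrete, here is a minimal, hypothetical sketch of that kind of transformation using pandas. The input column names are typical of an Azure cost details export and the output names are FOCUS-style columns, but both sides of the mapping are illustrative assumptions; the toolkit’s actual pipeline and the FOCUS specification define the authoritative schemas.

import pandas as pd

# Illustrative mapping from cost export columns to FOCUS-style columns.
# Both sides are assumptions for demonstration; consult the FOCUS spec and
# your export schema for the real names.
COLUMN_MAP = {
    "Date": "ChargePeriodStart",
    "CostInBillingCurrency": "BilledCost",
    "BillingCurrencyCode": "BillingCurrency",
    "ResourceId": "ResourceId",
    "MeterCategory": "ServiceName",
    "SubscriptionId": "SubAccountId",
}

def normalize_cost_export(path: str) -> pd.DataFrame:
    """Read one cost export file and project it onto FOCUS-style columns."""
    raw = pd.read_csv(path)
    focus = raw[list(COLUMN_MAP)].rename(columns=COLUMN_MAP)
    focus["ChargePeriodStart"] = pd.to_datetime(focus["ChargePeriodStart"])
    focus["BilledCost"] = focus["BilledCost"].astype(float)
    return focus

# Combine exports from multiple billing accounts or subscriptions into one store.
frames = [normalize_cost_export(p) for p in ["export-sub1.csv", "export-sub2.csv"]]
combined = pd.concat(frames, ignore_index=True)
combined.to_parquet("costs-focus.parquet", index=False)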

If you’re not familiar with FOCUS, it’s an open specification for billing data that we’re leading with the FinOps Foundation. We’re partnering with many practitioners, vendors, and cloud providers to deliver a draft specification we can all adopt. You’ll hear more about this at FinOps X this week. There’s a lot to cover, so I’ll share more details in a future blog post.

Stay tuned for more updates about the FinOps toolkit. There are many solutions the community would like to share and we’re always eager to hear from you about what you’d like to see next.

Helping FinOps practitioners drive efficiency

We’re always looking for ways to help organizations optimize their cloud usage and rates to drive efficiencies that ultimately result in more business value. (This focus on business value might be my favorite aspect of FinOps.) Azure Advisor is the heart of optimization in Azure, so you’ll usually see these improvements show up there first, like the ability to customize the lookback period for right-sizing recommendations, which gives you more flexibility to improve the accuracy of cost recommendations, or the ability to automate cost savings with Resource Graph.

But also keep in mind that not all optimization and efficiency opportunities are ahead of us—some are behind us. As FinOps practitioners, you’re of course aware of commitment-based discounts, like reservations and savings plans. They’re perhaps the most discussed way to optimize your costs by committing to specific usage or spending thresholds. But one part of that story that sometimes goes untold is monitoring your reservations. You can now configure reservation utilization alerts in Cost Management to stay informed when your reservation utilization drops below an acceptable threshold. You can configure reservation utilization and anomaly alerts from the new alert rules page in Cost Management.

And last, but not least, I’d like to share a solution that saves you time as well as money! As part of the FinOps toolkit, you also have a new cost optimization workbook that centralizes some of the most used tools to help you drive your utilization and efficiency goals, like Advisor recommendations, Hybrid Benefit, and more. The cost optimization workbook was designed and implemented by Arthur Clares (Senior Cloud Solution Architect) and Seif Bassem (Senior Cloud Solution Architect) and is based on the Well-Architected Framework and feedback we’ve heard from working directly with organizations driving their own FinOps practices.

What’s next for FinOps?

I’ve just walked you through a sampling of the things we’ve been working on since joining the FinOps Foundation earlier this year. This is only a taste of what’s to come. Looking ahead, you can expect to see:

Even more FinOps learning resources, along with more comprehensive documentation to help you drive FinOps maturity.

Continued improvements to our portal experiences as we integrate more intelligent systems that surface insights more naturally and streamline your workflow.

Continued evolution and expansion of the FinOps toolkit to cover more scenarios, like FOCUS adoption, in collaboration with the open-source community.

New and updated tools and services that help you save money and drive efficiency with Microsoft Cloud.

And, if you’re at FinOps X this week, please catch us in our sessions where we’ll share tips on how to manage cost data at scale on Wednesday and how to implement FinOps capabilities in the Microsoft Cloud on Thursday. We’ll also be in the FOCUS and Technical Advisory Council sessions as well as at our booth! We look forward to catching you there or in the FinOps Slack community!

And if that wasn’t enough, here’s one last tip: stay informed about all FinOps updates with the new FinOps tag on the Azure blog!

All the best from your friendly FinOps ambassadors at Microsoft!

What’s new in Azure Data & AI: How customers realize tangible ROI with the industry-leading AI platform

Last month at Microsoft Build 2023, we unveiled new capabilities and a copilot framework to empower organizations to achieve more with generative AI using their own data or data from a third party they have access to. From Azure OpenAI Service to Microsoft Fabric, it’s clear this technology can accelerate innovation among builders of all skill levels, increasing the value and relevance of big data through the power of natural language.

Amidst all the announcements at Microsoft Build 2023, I also value the focus our Chief Technology Officer Kevin Scott placed on product fundamentals. Yes, generative AI is exciting and enables meaningful advances in application development, speed, and cost efficiency, but that “does not absolve any of us of the responsibility of thinking about what good product-making looks like.” Like every platform shift before, generative AI will impact how builders build. It should not change why we build: to solve real-world problems.

In the Wall Street Journal, I recently explored the context and impact of the monumental platform shift we are facing, followed by my thoughts on how leading companies are using generative AI to drive business value and how other organizations can assess and plan for AI success. These articles were the culmination of countless conversations with customers and partners that recognize AI readiness as a continuous journey. And since we’re in the early days of this new platform shift, learning from peers is essential as we embark on the journey together.

In this month’s blog, I’ll highlight recent product updates and spotlight great customers and partners who are putting Azure to work to solve real problems in innovative ways. I hope these examples inspire you to put Azure to work at your organization. If you are a Microsoft partner, please join us July 18 to 19, 2023 at Microsoft Inspire for in-depth learning on how you can drive customer success with the latest innovations from Azure.

Customers expanding productivity, accessibility, and innovation with Azure AI

A quick reminder: Microsoft runs on Azure AI. It’s a point of pride for our company and an incredible pressure test that translates to accelerated learning and industry-leading innovation on behalf of our partners and customers. Microsoft was recently recognized as a Leader for the 4th year in a row in the 2023 Gartner® Magic Quadrant™ for Cloud AI Developer Services, placing furthest for Completeness of Vision. This month, a new Total Economic Impact™ (TEI) study conducted by Forrester Consulting also revealed that Azure AI customers have collectively witnessed a 284 percent ROI over 3 years. We are committed to making Azure AI the trusted AI platform for every developer to invent with purpose, whether extending a Microsoft solution or building something net new from the ground up.

San Raffaele University and Research Hospital transforms clinical research and delivers precision medicine using AI

San Raffaele University and Research Hospital is one of the largest of its kind in Italy, as well as a pioneer of cutting-edge translational research and technology adoption at both a national and international level. During COVID-19, the organization saw an opportunity to start using AI to better triage incoming symptomatic patients and identify their future reaction to the virus. Fast forward to today: the pilot project has been turned into an intelligent data solution powered by AI, which San Raffaele is using to transform clinical research, improve the end-to-end patient journey inside the hospital, and accelerate its own move towards precision medicine.

Reddit improves accessibility and SEO through Azure Cognitive Service for vision image and caption generation

Reddit is a community of communities where millions of users across the globe find and share images and other content around their interests, hobbies, and passions. The company sought to enhance the search engine optimization (SEO) of its images in an effort to broaden the site’s accessibility, particularly for users who are blind or have low vision. Using AI, Reddit was able to quickly generate accurate alt text for its vast catalogue of images, applying the solution to existing images on the platform as well as the thousands of new images added daily.

ERM combats corporate ‘greenwashing’ with ESG assessment tool powered by Azure Machine Learning

In pursuit of helping companies around the world achieve net-zero carbon emissions, ERM, the largest global pure play sustainability consultancy, has built a standardized software-as-a-service (SaaS) tool that can rate companies based on their environmental, social, and governance (ESG) risks for private capital investors. ESG Fusion combines machine learning and subject matter expertise to provide peer reviewed ESG reports to ERM clients. Powered by Microsoft Azure Machine Learning, ESG Fusion can provide a comprehensive assessment of a company’s ESG risks and opportunities within two business days—a big step in promoting sustainable business practices around the globe.

Customers modernizing their data to fuel advanced analytics and intelligent applications

Powering organization-specific AI experiences requires a constant supply of clean data from a well-managed and highly integrated analytics system. One of our biggest announcements at Microsoft Build 2023 last month was Microsoft Fabric, an end-to-end, unified analytics platform that brings together all the data and analytics tools that organizations need to bring their data into the era of AI. With Fabric, customers don’t need to piece together different data services from multiple vendors. Instead, they can enjoy a highly integrated and easy-to-use product designed to simplify their analytics needs—complete with a single purchase model allowing customers to use their analytics budget for the components of the analytics value chain they are most focused on. Learn why customers like Ferguson, T-Mobile, and Aon are excited about Microsoft Fabric.

American Airlines gains speed and boosts reliability by flying real-time data to Azure SQL

American Airlines wanted to move its real-time customer data domain from an aging on-premises datacenter to the cloud. After evaluating several possible routes, the airline’s Customer Hub team chose Microsoft Azure SQL because it satisfied every one of their demanding requirements. The complex process unfolded in multiple phases to shift both data and apps to Azure, giving them greater flexibility, automated maintenance, and alignment with the airline’s transformation goals so that American Airlines can deploy solutions to customers faster.

Rockwell Automation transforms industrial business with Azure Cosmos DB

Rockwell Automation, a leader in the industrial automation space, embarked on a digital transformation journey to offer their customers a more seamless and productive SaaS platform. This WIRED article explores how Rockwell set out to deliver a cloud-based, collaborative data environment that is available anywhere, anytime, for both their employees and customers. They used Azure Cosmos DB and Azure Kubernetes Service to scale automatically and keep costs in line with growth.

Rx Health makes sense of data from millions of health wearables with Azure Cosmos DB

Managing terabytes of wearable-generated health data is a difficult task without the right tools. Azure Cosmos DB is helping developers of medical Internet of Things (IoT) applications bridge the gap between the structured world of health record systems and the more dynamic world of non-relational databases and IoT. For example, Rx Health, an automated care coordination company, uses sensor-based technologies and Azure Cosmos DB to power a heart failure program which monitors congestive heart failure patients at home and provides educational and motivational messages from medical professionals directly to their mobile devices.

Xbox combines AI and Azure Cosmos DB to provide better gaming experiences 

Whether it’s generating more realistic opponents or more helpful allies, AI is changing the way games are played. In a recent video by Ars Technica, Haiyan Zhang, General Manager of Gaming AI at Xbox, highlighted how AI and real-time player and game data is being used to balance and improve game play. For content safety, Xbox uses Watch For, a Microsoft media AI platform that analyzes images, videos, live streams, and other media content in real-time and is powered by Azure Cosmos DB. Using Azure Cosmos DB, Watch For processes hundreds of millions of requests and billions of classifications each month for safer game play.

Aware propels platform performance and enhanced customer experiences with Azure Cosmos DB

Aware is an industry leader in contextual intelligence, transforming digital voices at scale into actionable organizational insights that help business leaders address the growing unstructured dataset of enterprise collaboration tools. To manage the vast amounts of data its platform needs to ingest, the company turned to Microsoft Azure Cosmos DB to augment and support its traditional Azure Database for PostgreSQL architecture. The company now has fewer issues with large content data, better platform performance, reduced costs, and a more consistent experience for its customers.

Dentsu builds an Azure-supported analytics platform to modernize data across the organization

Dentsu is a marketing and advertising network dedicated to brand partnership that delivers meaningful progress and growth through unique and innovative media campaigns and bespoke data-driven solutions. The global organization spans more than 145 markets with 65,000 employees and serves 11,000 clients. To accelerate business insights across the organization, Dentsu invested in a modern data analytics effort that would take their business transformation to the next level. Now, Dentsu has a stronger data culture, supported by a reporting and analytics ecosystem that users can trust for business-critical decisions.

Swedbank migrates big data platform to Azure Cloud for enhanced security and scalability

Swedbank is a multinational bank with a 200-year history. Recently, the company took the bold decision to migrate its big data platform to the cloud in search of greater scalability and security. With Azure Databricks functioning at the core of this solution, the successful migration has brought the bank a wealth of benefits. From reduced time-to-market and cost savings to significant improvements in its ability to detect fraud—it has set the bank and its 7.7 million customers on course for a more secure future.

Recent product announcements from Azure Data and AI

In case you missed it: we had a lot of announcements at Microsoft Build 2023 in May! You can see the full rundown in our Microsoft Build 2023 Book of News. To close this month’s blog post, I’ll highlight some other recent announcements to support your development of even more intelligent applications.

Azure OpenAI Service on your data is now available in public preview. This feature allows you to harness the power of OpenAI models, such as ChatGPT and GPT-4, with your own data in the Azure AI Studio. This revolutionizes the way you interact with and analyze your data, providing greater accuracy, speed, and valuable insights.

Custom Text Analytics for health is now in public preview, available as part of the Azure Cognitive Service for Language. This feature extends the capabilities of Text Analytics for health, a pre-built natural language processing (NLP) solution for extracting and labelling medical information from unstructured medical text. Now, customers can build their own healthcare entity extraction model using data from specific medical domains or unique organizational data such as abbreviations.

Accelerate content discovery and insight extraction for video and audio files using ChatGPT and Azure OpenAI Service with Azure Video Indexer. This feature will allow you to enable deep search on video archives and can be especially useful for organizations that need to quickly review large volumes of video content for specific information. We listened to customer feedback and lowered the price on our most popular presets by 40 percent while also adding new advanced presets to improve and add more AI capabilities. Soon, you will be able to unlock content discovery and insights anywhere, including on-premises, with the Azure Arc-enabled extension.

Pronunciation Assessment is now generally available in 19 languages in Azure Cognitive Service for Speech. This update extends support from English to 18 additional languages and adds quality enhancements to existing features, including accuracy, fluency, and intonation assessment. The feature supports comprehensive language learning solutions for learners and educators worldwide, enabling learners to receive detailed feedback on their pronunciation skills and identify areas where they need improvement. Berlitz and HelloTalk are both utilizing pronunciation assessment to help facilitate language learning on a global scale.
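
To show how a developer might call the feature, here is a minimal sketch using the Speech SDK for Python (the azure-cognitiveservices-speech package); the subscription key, region, audio file, and reference text are placeholders, and the exact parameters are worth confirming against the current Speech service documentation.

import azure.cognitiveservices.speech as speechsdk

# Placeholder credentials and input -- substitute your own Speech resource and audio.
speech_config = speechsdk.SpeechConfig(subscription="<speech-key>", region="<region>")
audio_config = speechsdk.audio.AudioConfig(filename="learner_sample.wav")

# Score the learner's audio against the text they were asked to read.
pron_config = speechsdk.PronunciationAssessmentConfig(
    reference_text="Hello, welcome to the pronunciation assessment demo.",
    grading_system=speechsdk.PronunciationAssessmentGradingSystem.HundredMark,
    granularity=speechsdk.PronunciationAssessmentGranularity.Phoneme,
)

recognizer = speechsdk.SpeechRecognizer(
    speech_config=speech_config, audio_config=audio_config, language="en-US"
)
pron_config.apply_to(recognizer)

result = recognizer.recognize_once()
assessment = speechsdk.PronunciationAssessmentResult(result)
print("Accuracy:", assessment.accuracy_score)
print("Fluency:", assessment.fluency_score)
print("Overall pronunciation:", assessment.pronunciation_score)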

What’s next?

Thank you for reading and we’ll see you again next month. If you’re a partner, I hope you will join us July 18 to 19, 2023 at Microsoft Inspire for more announcements, case studies, and expert guidance tailored to help your business succeed with Microsoft.

Gartner Magic Quadrant for Cloud AI Developer Services, Jim Scheibmeir, Svetlana Sicular, Arun Batchu, Mike Fang, Van Baker, Frank O’Connor, 22 May 2023. Gartner is a registered trademark and service mark and Magic Quadrant is a registered trademark of Gartner, Inc. and/or its affiliates in the U.S. and internationally and are used herein with permission. All rights reserved.

Gartner does not endorse any vendor, product or service depicted in its research publications, and does not advise technology users to select only those vendors with the highest ratings or other designation. Gartner research publications consist of the opinions of Gartner’s research organization and should not be construed as statements of fact. Gartner disclaims all warranties, expressed or implied, with respect to this research, including any warranties of merchantability or fitness for a particular purpose.

Healthcare revolution with Microsoft Azure: An AI wellness check  

Generative AI has emerged as a game-changer in various industries, and healthcare is no exception. With its ability to generate new content, models, and insights, generative AI has the potential to revolutionize medical research, diagnosis, treatment, and patient care by allowing healthcare providers to increase productivity with less administrative burden while shifting their focus to patient care. 

Azure OpenAI Service and Epic’s EHR software 

In April 2023, one of the largest electronic health records providers, Epic, used Azure OpenAI Service to integrate large language model tools and AI into its electronic health record software. Epic currently has the largest share of acute care hospitals in the U.S. market: globally, about 2,130 hospitals use Epic’s medical records software, and more than 305 million patients have a current electronic record in Epic.

One of the integration’s tools, which helps automatically draft message responses, is already being implemented at UC San Diego Health, UW Health in Madison, Wisconsin, and Stanford Health Care.   

“The urgent and critical challenges facing healthcare systems and their providers demand a comprehensive approach combining Azure OpenAI Service with Epic’s industry-leading technology.”
Eric Boyd, Corporate Vice President, AI Platform, Microsoft.

Epic’s exploration of OpenAI’s GPT-4 with Azure OpenAI Service has shown the potential to increase the power and accessibility of self-service reporting. By doing so, it has made it easier for healthcare organizations and their providers to identify operational improvements, including ways to reduce costs and to find answers to questions both locally and within a broader context. And that’s just the beginning.

AI influencing the future of healthcare

Below we look at more ways that AI is expected to help the healthcare industry pave the way to wellness:

Accelerating drug discovery: One of the most time-consuming and costly aspects of healthcare is the discovery and development of new drugs. One way to speed up the drug-development process is through computational modeling, so that molecules can be prioritized in silico without being physically available, and only the molecules most likely to succeed are synthesized and measured. To enable such a speedup through computational modeling, a machine learning model must be able to precisely predict molecular properties and whether a proposed drug molecule will be able to affect the protein target associated with the disease. Machine learning is highly effective at recognizing patterns in images and text, where millions of lines of such data are available, thereby reducing the time it takes to produce drugs that can successfully combat illness. (A minimal illustrative sketch of this kind of property prediction follows this list.)

Enhancing medical imaging analysis: Medical imaging plays a crucial role in supporting the diagnosis, treatment, and monitoring of various diseases. Project InnerEye from Microsoft Research is developing machine learning and open-source software to empower healthcare organizations and innovators to develop their own solutions that can assist clinicians in planning radiotherapy treatments so that they can spend more time with their patients. Cambridge University Hospitals NHS Foundation Trust is one of the early adopters to use open-source technology from Project InnerEye, creating an Azure-based medical AI tool called OSAIRIS that reduces the amount of time cancer patients wait for radiotherapy treatment. Working alongside OSAIRIS, the specialist can plan radiotherapy treatments about two and a half times faster than working alone, ensuring that more patients who need treatment can get it sooner and thus improving the likelihood of better outcomes.

Supporting pathologists, and their patients, around the globe: PathPresenter, in collaboration with Microsoft, has focused on ensuring seamless interoperability between scanners, image management software, AI models, and hospital infrastructure, to reduce the reporting burden on pathologists and accelerate the adoption of digital workflows by pathologists and institutions worldwide for the benefit of patients and society. Microsoft Azure offers unique solutions in digital pathology that are scalable and can serve digital pathology images several gigabytes in size—even on poor internet connections encountered in remote regions. 

Equipping more clinicians with medical imaging AI: AI has the potential to revolutionize medical imaging by enabling faster and more accurate analysis of imaging data, leading to better patient outcomes. Together, Nuance, Microsoft, and NVIDIA are working to simplify the translation of imaging AI models into existing and trusted clinical applications that can deliver genuine benefits for everyday patient care without requiring providers to change their workflows or their underlying IT systems.

Improving COVID-19 clinical decision support: Providence, a healthcare system with 51 hospitals and more than 1,085 clinics, is headquartered in Renton, Washington, near the epicenter of the first major US COVID-19 outbreak. Providence used the existing Microsoft Azure Health Bot service and configured it to create an AI-based tool to triage patients and answer their questions specifically about COVID-19 symptoms, freeing providers to attend to the patients who needed it most. The Azure Health Bot with COVID-19 templates that Providence deployed has since been adopted by several thousand healthcare providers, the US Centers for Disease Control and Prevention (CDC), NGOs (non-governmental organizations), and international health authorities.

Scaling data storage: Health First was one of the first healthcare providers to deploy WhereScape with Azure Synapse Analytics. Implementing both familiar and new products helped the network to accelerate its operations. With faster turnaround times, Health First employees could focus on using data to improve patient care and operational decisions. Health First experienced more than a 90 percent improvement in workload processing times. With Azure Synapse Analytics and Power BI, the daily data refresh was about 75 percent faster, down to 3 hours from a 12-hour overall run time, which improved the company’s capability to provide actionable insights for both clinical and operational decisions.
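
Here is the minimal sketch referenced in the drug-discovery item above. It is purely illustrative rather than a Microsoft solution: it trains a simple regressor to predict a molecular property from precomputed descriptor features, with randomly generated stand-ins for the descriptor matrix and property values that a real cheminformatics pipeline would supply.

import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

# Placeholder data: each row is a molecule described by precomputed descriptors
# (e.g., molecular weight, logP, ring counts); the target is a measured property
# such as binding affinity. Real projects would load these from a cheminformatics
# pipeline rather than generate them randomly.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 8))                         # 500 molecules, 8 descriptors each
y = X @ rng.normal(size=8) + rng.normal(scale=0.1, size=500)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

# Rank unseen candidate molecules by predicted property so only the most
# promising ones are synthesized and measured.
preds = model.predict(X_test)
print("MAE on held-out molecules:", mean_absolute_error(y_test, preds))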

As the use of AI in healthcare continues to evolve, we anticipate a future where healthcare systems are increasingly able to handle more challenging cases and discover solutions to some of the most pressing healthcare issues facing individuals and communities worldwide today. Microsoft’s innovative initiatives in this space highlight the immense potential of AI. We’re looking forward to helping accelerate drug discoveries, enhance medical imaging analysis, improve clinical decision making, and scale to better support operational decisions. 

Our commitment to responsible AI

Microsoft has a layered approach for generative models, guided by Microsoft’s responsible AI principles. In Azure OpenAI Service, an integrated safety system provides protection from undesirable inputs and outputs and monitors for misuse. In addition, Microsoft provides guidance and best practices for customers to responsibly build applications using these models and expects customers to comply with the Azure OpenAI Code of Conduct. With GPT-4, new research advances from OpenAI have enabled an additional layer of protection. Guided by human feedback, safety is built directly into the GPT-4 model, which enables the model to be more effective at handling harmful inputs, thereby reducing the likelihood that the model will generate a harmful response. 

Get started with Azure OpenAI Service 

Apply for access to Azure OpenAI Service by completing this form. 

Learn about Azure OpenAI Service and the latest enhancements. 

Get started with GPT-4 in Azure OpenAI Service in Microsoft Learn. 

Read our Partner announcement blog, Empowering partners to develop AI-powered apps and experiences with ChatGPT in Azure OpenAI Service. 

Learn how to use the new Chat Completions API (preview) and model versions for ChatGPT and GPT-4 models in Azure OpenAI Service.  

Provide feedback: Let us know what you think of Azure and what you would like to see in the future.

Build your cloud computing and Azure skills with free courses by Microsoft Learn.


Explore the benefits of Azure OpenAI Service with Microsoft Learn

As the field of AI continues to grow, developers are constantly seeking new and innovative ways to integrate it into their work. With the launch of Azure OpenAI Service, developers now have even more tools at their disposal to take advantage of this powerful technology. Azure OpenAI Service can be used to create chatbots, generate text, translate languages, and write different kinds of creative content. As the platform continues to evolve, developers will be able to use it to build even more powerful and sophisticated applications.

In this article, we will explore the benefits of utilizing OpenAI and Azure OpenAI Service for developers, as well as the various learning paths Microsoft has established to get you started.

Introduction to OpenAI and Azure OpenAI Service

Azure OpenAI Service is a fully managed service that allows developers to easily integrate OpenAI models into their applications. With Azure OpenAI Service, developers can quickly and easily access a wide range of AI models, including natural language processing, computer vision, and more. Azure OpenAI Service provides a simple and easy-to-use API that makes it easy to get started with AI.
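
As a quick taste of that API, here is a minimal sketch that requests a chat completion from an Azure OpenAI deployment using the openai Python package (the 0.x-style Azure configuration current at the time of writing); the endpoint, key, deployment name, and API version shown are placeholders to replace with your own resource’s values.

import os
import openai

# Point the openai package at your Azure OpenAI resource (placeholder values).
openai.api_type = "azure"
openai.api_base = os.environ["AZURE_OPENAI_ENDPOINT"]   # e.g. https://<resource>.openai.azure.com/
openai.api_key = os.environ["AZURE_OPENAI_KEY"]
openai.api_version = "2023-05-15"                        # assumption: confirm the current API version

# Ask a deployed chat model for a completion. "my-gpt-35-turbo" is the name you
# gave your model deployment in Azure OpenAI Studio, not a fixed identifier.
response = openai.ChatCompletion.create(
    engine="my-gpt-35-turbo",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Summarize the benefits of Azure OpenAI Service in two sentences."},
    ],
    temperature=0.7,
)

print(response["choices"][0]["message"]["content"])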

For developers who may not have experience with Azure OpenAI Service, here are some ways it can help:

Simplified integration: A simple and easy-to-use API offers various endpoints that can be used for different tasks, such as text generation, summarization, sentiment analysis, language translation, and more.

Pre-trained models: Already fine-tuned on vast amounts of data, these pre-trained models make it easier for developers to leverage the power of AI without having to train their own models from scratch.

Customization: Developers can also fine-tune the included pre-trained models with their own data with minimal coding, providing an opportunity to create more personalized and specialized AI applications.

Documentation and resources: Azure OpenAI Service provides comprehensive documentation and resources that can help developers get started quickly, including tutorials, guides, and code samples that cover various use cases and scenarios.

Community support: With an active community willing to help and share their experiences via forums and support channels, developers can ask questions, seek guidance, and learn from others.

Scalability and reliability: Hosted on Microsoft Azure, the service provides robust scalability and reliability that developers can leverage to deploy their AI applications with confidence, without having to worry about managing the underlying infrastructure.

Responsible AI: Azure OpenAI Service promotes responsible AI by adhering to ethical principles, providing explainability tools, governance features, diversity and inclusion support, and collaboration opportunities. These measures help ensure that AI models are unbiased, explainable, trustworthy, and used in a responsible and compliant manner.

Learning paths for developers to adopt Azure OpenAI Service

With the rapid adoption and far-reaching possibilities of the recent AI boom, there isn’t a better time to start your skilling journey and stay ahead in this competitive field of technology. Getting started with Azure OpenAI Service is easy, thanks to the newest Microsoft learning path, which includes a range of resources, such as tutorials, documentation, and sample code, to help developers get up to speed with the platform.

Azure OpenAI Service Learning Path – Develop AI solutions with Azure OpenAI

Module: Get started with Azure OpenAI Service
Summary: This module provides engineers with the skills to begin building an Azure OpenAI Service solution.
Learning objectives: Create an Azure OpenAI Service resource and understand types of Azure OpenAI base models. Use the Azure OpenAI Studio, console, or REST API to deploy a base model and test it in the Studio’s playgrounds. Generate completions to prompts and begin to manage model parameters.

Module: Build natural language solutions with Azure OpenAI Service
Summary: This module provides engineers with the skills to begin building apps that integrate with the Azure OpenAI Service.
Learning objectives: Integrate Azure OpenAI into your application. Differentiate between different endpoints available to your application. Generate completions to prompts using the REST API and language specific SDKs.

Module: Apply prompt engineering with Azure OpenAI Service
Summary: Prompt engineering in Azure OpenAI is a technique that involves designing prompts for natural language processing models. This process improves accuracy and relevancy in responses, optimizing the performance of the model.
Learning objectives: Understand the concept of prompt engineering and its role in optimizing Azure OpenAI models’ performance. Know how to design and optimize prompts to better utilize AI models. Include clear instructions, request output composition, and use contextual content to improve the quality of the model’s responses.

Module: Generate images with Azure OpenAI Service
Summary: Azure OpenAI Service includes the DALL-E model, which you can use to generate original images based on natural language prompts.
Learning objectives: Describe the capabilities of DALL-E in the Azure OpenAI Service. Use the DALL-E playground in Azure OpenAI Studio. Use the Azure OpenAI REST interface to integrate DALL-E image generation into your apps.

Module: Generate code with Azure OpenAI Service
Summary: This module shows engineers how to use the Azure OpenAI Service to generate and improve code.
Learning objectives: Use natural language prompts to write code. Build unit tests and understand complex code with AI models. Generate comments and documentation for existing code.

The future of AI for developers

OpenAI and Azure OpenAI Service have the potential to revolutionize the developer experience, and with the various learning paths available developers can quickly and easily get started. With the constant development and updates being made to the platform, developers can continue to explore the capabilities of Azure OpenAI Service to see even more groundbreaking applications and innovations in the years to come.

Ready to get started with Azure OpenAI Service? Check out the official Microsoft Learn path to discover more about the platform and get hands-on experience with AI. You can also learn and develop essential AI skills with the Microsoft Learn AI Cloud Skills Challenge, which begins on July 17, 2023. Preview the topics and sign up now. With Azure OpenAI Service, the possibilities are endless!

Azure HBv4 and HX Series VMs for HPC now generally available

We are excited to announce Azure HBv4 and HX-series Virtual Machines (VMs) are now generally available. With the general availability, Microsoft is offering customers the first VMs featuring the latest 4th Gen AMD EPYC™ processors with AMD 3D V-Cache™ technology (codename ‘Genoa-X’), paired with 400 Gigabit NVIDIA Quantum-2 InfiniBand. Azure HBv4 and HX-series VMs offer leadership levels of performance, scaling efficiency, and cost-effectiveness for a variety of HPC workloads such as computational fluid dynamics (CFD), financial services calculations, finite element analysis (FEA), geoscience simulations, weather simulation, rendering, quantum chemistry, and silicon design.

Compared to Azure HBv3-series VMs using 3rd Gen AMD EPYC™ processors (codename ‘Milan-X’), already the highest performance VMs for HPC workloads on the public cloud, customers will see up to:

1.6 times higher performance for rendering

2.4 times higher performance for weather simulation

2.7 times higher performance for CFD

4.2 times higher performance for molecular dynamics

5.7 times higher performance for structural analysis

Visit our technical blog for more detailed performance and scalability information and see below for a summary of performance across a diverse selection of widely used high performance computing (HPC) workloads.

Figure 1: Performance comparison summary of Azure HBv4/HX-series VMs to HBv3-series across diverse engineering and scientific computing workloads.

Azure HBv4 and HX-series VMs are available today in the East US Azure region, and will soon come to the Korea Central, South Central US, Sweden Central, and Southeast Asia regions.

Faster, more cost-effective, and more power-efficient technologies for HPC

Azure HBv4 and HX-series VMs feature 4th Gen AMD EPYC™ processors with AMD 3D V-Cache™ because they deliver the fastest performance for a variety of memory-performance-bound HPC workloads. The 2.3 GB of L3 cache per VM can deliver up to 5.7 terabytes per second of bandwidth, amplifying the up to 780 gigabytes per second of bandwidth from main memory for a market-leading blended average of 1.2 terabytes per second of effective memory bandwidth across a broad range of customer workloads. For widely used memory bandwidth-bound workloads like OpenFOAM, HBv4-series VMs with 4th Gen AMD EPYC CPUs with 3D V-Cache technology are already yielding up to 1.49 times higher performance than standard 4th Gen EPYC processors in Azure’s internal testing.

Figure 2: Performance comparison of HBv4/HX during Preview (no 3D V-Cache) versus General Availability (includes 3D V-Cache) from 1-8 VMs on CFD workload OpenFOAM.

Azure HBv4 and HX-series VMs deliver these significant performance enhancements while lowering the cost and energy consumed per job in direct proportion to the workload speedup: a job that completes in 25 percent less time with AMD 3D V-Cache than on standard 4th Gen AMD EPYC CPUs costs the customer roughly 25 percent less in both spend and power per job.
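To see why cost and energy per job track runtime so directly, a small back-of-the-envelope calculation helps. The hourly price and runtimes below are made-up numbers for illustration only, not Azure pricing.

```python
# Illustrative arithmetic only: hourly price and runtimes are made-up numbers,
# not Azure pricing. Cost (and energy) per job scale with how long the VM runs.
def cost_per_job(price_per_hour: float, runtime_hours: float) -> float:
    return price_per_hour * runtime_hours

price = 10.0            # hypothetical VM price per hour
baseline_hours = 4.0    # hypothetical runtime on standard CPUs
faster_hours = 3.0      # same job finishing in 25% less wall-clock time

baseline_cost = cost_per_job(price, baseline_hours)   # 40.0
faster_cost = cost_per_job(price, faster_hours)        # 30.0
savings = 1 - faster_cost / baseline_cost              # 0.25 -> 25% lower cost per job
print(f"Cost per job drops from {baseline_cost:.0f} to {faster_cost:.0f} ({savings:.0%} lower)")
```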

Learn more about performance and scalability across a range of applications, models, and configurations, and visit Azure Docs for full technical specifications of HBv4 and HX series VMs.

Continuous improvement for even more HPC customers

Microsoft is delivering a new era of high-performance computing in the cloud, one defined by continuous improvements to the critical research and business workloads that matter most to our customers. Through our partnership with AMD, we’re making this vision a reality by raising the bar on the performance, scalability, and value we deliver with every release of Azure H-series family VMs for our HPC customers.

With the release of Azure HX-series VMs, a new class within the Azure H-series family, we are extending our commitment to customers running large-memory workloads like structural analysis and semiconductor design. The 1.4 terabytes of memory in every HX-series VM enables customers to keep more of a data-intensive workload in memory for significant speedups and cost reductions. For widely used automotive simulators like MSC Nastran, Azure’s internal testing shows the new HX-series VMs yield up to 5.7 times higher performance than HBv3-series VMs, and 9.2 times higher performance than HC-series VMs released four years ago, which feature technology still widely used in on-premises HPC environments today.

Figure 3: Performance uplift from 2019–2023 on Azure H-series family Virtual Machines for structural analysis application MSC NASTRAN.

Customer momentum

“Materials sciences researchers, including those working with Azure Quantum, stand to benefit greatly from the introduction of Azure HBv4 virtual machines featuring powerful new processors and networking technologies. The 4x performance increase on NAMD shows once again Microsoft’s commitment to continuously making the most advanced computing resources available through Azure cloud services.”—Nathan Baker, Senior Director, Partnerships for Chemistry and Materials, Azure Quantum.

“AMD EPYC™ processors, available through the Azure HB-series family of Virtual Machines via Ansys Cloud, have been instrumental in advancing our ability to deliver more CFD simulations in support of our Motorsports efforts in North America. We firmly believe that accelerating our simulation capabilities and the engineering insights gained from this work have been key to our recent string of successes in both IndyCar and IMSA prototypes. We are looking forward to evaluating the performance of Azure HBv4 and HX-series VMs powered by the 4th Generation of AMD EPYC™ processors with AMD 3D V-Cache™ technology in the very near future. We are confident that AMD and Azure will continue to deliver on their promise of sustainable performance to be realized in time-to and cost-of solution for our very complex CFD models with these processors.”—Kelvin Fu, Vice President at Honda Performance Development.

“With Azure high-performance computing, we can run more jobs and tasks in parallel.  We reduce the time to market by increasing the efficiency and the quality of the product that we release. This solution helps us to improve the product quality and reduce the risk of a delivery delay.”—Anna Palo, Digital Design Architect, STMicroelectronics.

Learn more

Azure Docs – HBv4 series Virtual Machines.

Azure Docs – HX series Virtual Machines.

Performance and Scalability of Azure HBv4 and HX VMs for HPC (Public Preview).

Find more Azure HPC resources.

Learn more about Azure HPC.

The post Azure HBv4 and HX Series VMs for HPC now generally available appeared first on Azure Blog.

Deploy a holistic view of your workload with Azure Native Dynatrace Service

Microsoft and Dynatrace announced the general availability of Azure Native Dynatrace Service in August 2022. The native integration enables organizations to leverage Dynatrace as a part of their overall Microsoft Azure solution. Users can onboard easily to start monitoring their workloads by deploying and managing a Dynatrace resource on Azure.  

Azure Native integration enables you to create a Dynatrace environment like you would create any other Azure resource. One of the key advantages of this integration is the ability to seamlessly ship logs and metrics to Dynatrace. By leveraging Dynatrace OneAgent, users can also gather deeper observability data from compute resources such as virtual machines and Azure App Services. This comprehensive data collection ensures that organizations have a holistic view of their Azure workloads and can proactively identify and resolve issues. 

Furthermore, the integration unifies billing for Azure services, including Dynatrace. Users receive a single Azure bill that encompasses all the services consumed on the platform, providing a unified and convenient billing experience. 

Since its release, Dynatrace Service has seen continuous enhancements. In the following sections, we will explore some of the newer capabilities that have been added to further empower organizations in their monitoring and observability efforts. 

Automatic shipping of Azure Monitor platform metrics 

One of the significant capabilities introduced at the general availability of Azure Native Dynatrace Service was the automatic forwarding of logs from Azure Monitor to Dynatrace. The log forwarding capability allows you to configure and send Azure Monitor logs to Dynatrace. Logs start to flow to your Dynatrace environment as soon as the Dynatrace resource on Azure is deployed. The Azure experience also lets you view a summary of all the resources being monitored in your subscription.

Building on this, we have now added another key improvement: the ability to automatically collect metrics from the Azure Monitor platform. This enhancement lets users view the metrics of various Azure services on the Dynatrace portal.

To enable metrics collection, customers can simply check a single checkbox on the Azure portal. This streamlined process makes it easy for organizations to start gathering valuable insights. For further customization, users have the option to specify tags to include or exclude specific resources for metric collection. This allows for a more targeted monitoring approach based on specific criteria.  
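Conceptually, those tag rules act as an include or exclude filter over your resource inventory. As a rough way to preview which resources a tag rule would pick up, the sketch below lists resources by tag with the azure-mgmt-resource SDK; the tag name and value are hypothetical, and the actual filtering is performed by the service itself, not by a script like this.

```python
# Sketch: preview which resources an include-by-tag rule would match.
# The tag name/value are hypothetical; the actual filtering is done by the
# Azure Native Dynatrace Service, not by this script.
from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient

subscription_id = "00000000-0000-0000-0000-000000000000"  # placeholder subscription ID
client = ResourceManagementClient(DefaultAzureCredential(), subscription_id)

# List only resources carrying the tag environment=production (hypothetical tag).
matched = client.resources.list(
    filter="tagName eq 'environment' and tagValue eq 'production'"
)
for resource in matched:
    print(resource.type, resource.name)
```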

The setup of credentials required for the interaction between Dynatrace and Azure is automated, eliminating the need for manual configuration. Once the metrics are collected, users can conveniently view and analyze them on the Dynatrace portal, providing a comprehensive and centralized platform for monitoring and observability. 

Together with logs and metrics monitoring capabilities, Azure Native Dynatrace Service provides holistic monitoring of your Azure workloads. 

Native integration availability in new Azure regions 

At general availability, Azure Native Dynatrace Service was offered in two regions: East US and West Europe. To cater to growing demand, the native integration is now also available in United Arab Emirates North, Canada Central, and West US, bringing the total number of supported regions to five. You select the region in the resource creation experience, and the corresponding Dynatrace environment is provisioned in the same Azure region, ensuring that your data remains within that region. This lets you leverage Dynatrace within your chosen Azure region while complying with your organization’s data residency regulations and preferences.

Monitor activity with Azure Active Directory logs

In the realm of cloud business, early detection of security threats is crucial to safeguarding business operations. Azure Active Directory (Azure AD) activity logs—encompassing audit, sign-in, and provisioning logs—offer organizations essential visibility into the activities taking place within their Azure AD tenant. By monitoring these logs, organizations can gain insights into user and application activities, including user sign-in patterns, application changes, and risk activity detection. This level of visibility empowers organizations to respond swiftly and effectively to potential threats, enabling proactive security measures and minimizing the impact of security incidents on their operations. 

With Azure Native Dynatrace Service, you can route your Azure AD logs to Dynatrace by setting Dynatrace as a destination in Azure AD diagnostic settings.  
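As an indication of the shape such a configuration can take, the sketch below issues a PUT against the Azure AD diagnostic settings management endpoint. Treat it as an assumption-laden outline rather than a definitive call: the api-version, log categories, and the marketplacePartnerId property (pointing at your Dynatrace resource ID) should all be verified against the current documentation before use.

```python
# Sketch only: the endpoint shape, api-version, categories, and the
# marketplacePartnerId property are assumptions to verify against current docs.
import requests
from azure.identity import DefaultAzureCredential

token = DefaultAzureCredential().get_token("https://management.azure.com/.default").token

setting_name = "send-aad-logs-to-dynatrace"  # arbitrary setting name
dynatrace_resource_id = (
    "/subscriptions/<sub>/resourceGroups/<rg>/providers/"
    "Dynatrace.Observability/monitors/<name>"  # placeholder Dynatrace resource ID
)

url = (
    "https://management.azure.com/providers/microsoft.aadiam/"
    f"diagnosticSettings/{setting_name}?api-version=2017-04-01"
)
body = {
    "properties": {
        "logs": [
            {"category": "AuditLogs", "enabled": True},
            {"category": "SignInLogs", "enabled": True},
        ],
        # Assumed property for partner destinations such as Dynatrace:
        "marketplacePartnerId": dynatrace_resource_id,
    }
}

resp = requests.put(url, headers={"Authorization": f"Bearer {token}"}, json=body, timeout=30)
resp.raise_for_status()
print(resp.status_code)
```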

Committed to collaboration and integration 

The Azure Native integration for Dynatrace has simplified the process of gaining deep insights into workloads. This integration empowers organizations to optimize their resources, enhance application performance, and deliver high availability to their users. Microsoft and Dynatrace remain committed to collaborating and improving the integration to provide a seamless experience for their joint customers. By working together, both companies strive to continually enhance the monitoring and observability capabilities within the Azure ecosystem. 

The product is constantly evolving to deepen the integration, with the aim of covering a wide range of Azure workloads and improving user convenience throughout the experience.

Next steps 

Learn more about how to create, deploy, and manage a Dynatrace resource on Azure:

Subscribe to Azure Native Dynatrace Service from the Azure Marketplace. 

Learn more about how to deploy Dynatrace monitoring resources on Azure with documentation on Dynatrace and Azure integration. 

To discover more about Dynatrace on Azure, visit the Dynatrace documentation. 

Watch Innovate Faster with Azure Native Dynatrace Service to know more about the integration.  

Share additional information about how you use resource and subscription logs to monitor and manage your cloud infrastructure and applications by responding to this survey. 

The post Deploy a holistic view of your workload with Azure Native Dynatrace Service appeared first on Azure Blog.

Removing barriers to autonomous vehicle adoption with Microsoft Azure

Autonomous vehicles, also known as self-driving cars, have the potential to revolutionize the transportation industry, with their impact anticipated across many other industries. Several stubborn obstacles, however, stand in the way of mass adoption.

In the more than 150 years since the automotive industry was founded, it has never experienced innovation and transformational change as rapid as it is experiencing today. Since the advent of the horseless carriage in the 1860s, vehicle manufacturers have continued to improve the quality, safety, speed, and comfort of the millions of automotive models sold around the world each year.

Today, however, all eyes are on autonomous vehicles as a cornerstone of future human mobility.

Exponential market growth expected

Over the past decade, the impact of emerging technologies such as AI, machine vision, and high-performance computing (HPC) has changed the face of the automotive industry. Today, nearly every car manufacturer in the world is exploring the potential and power of these technologies to usher in a new age of self-driving vehicles. Microsoft Azure HPC and Azure AI infrastructure are tools to help accomplish that.

Data suggests that the global autonomous vehicle market, with level two autonomous features present in cars, was worth USD76 billion in 2020 but is expected to grow exponentially over the coming years, reaching over USD2.1 trillion by 2030 as the level of autonomy in cars continues to increase.1

The platformization of autonomous taxis also holds enormous potential for the broader adoption and usage of autonomous vehicles. Companies like Tesla, Waymo, NVIDIA, and Zoox are all investing in the emerging category of driverless transportation that leverages powerful AI and HPC capabilities to transform the concept of human mobility. However, several challenges still need to be overcome for autonomous vehicles to reach their potential and become the de facto option for car buyers, passengers, and commuters.

Common challenges persist

One of the most important challenges with autonomous vehicles is ethics. If the vehicle determines what action to take during a trip, how does it decide what holds the most value during an emergency? To illustrate, if an autonomous vehicle is traveling down a road and two pedestrians suddenly run across the road from opposite directions, what are the ethics underpinning whether the vehicle swerves to collide with one pedestrian instead of another?

Another of the top challenges with autonomous vehicles is that the AI algorithms underpinning the technology are continuously learning and evolving. Autonomous vehicle AI software relies heavily on deep neural networks, with machine learning algorithms tracking on-road objects as well as road signs and traffic signals, allowing the vehicle to ‘see’ and respond to, for example, a red traffic light.

Where the tech still needs some refinement is with the more subtle cues that motorists are instinctually aware of. For example, a slightly raised hand by a pedestrian may indicate they are about to cross the road. A human will see and understand the cue far better than an AI algorithm does, at least for now.

Another challenge is whether there is sufficient technology and connectivity infrastructure for autonomous vehicles to offer the optimal benefit of their value proposition to passengers, especially in developing countries. With car journeys from A to B evolving into experiences, people will likely want to interact with their cars based on their personal technology preferences, linked to tools from leading technology providers. In addition, autonomous vehicles will also need to connect to the world around them to guarantee safety and comfort to their passengers.

As such, connectivity will be integral to the mass adoption of autonomous vehicles. The advent and growing adoption of 5G may improve connectivity and enable communication between autonomous vehicles, which could enhance their safety and functioning.

Road safety is not the only concern with autonomous vehicles. Autonomous vehicles will be designed to be hyper-connected, almost like an ultra-high-tech network of smartphones on wheels. However, an autonomous vehicle must be precisely that—standalone autonomous. If connectivity is lost, the autonomous vehicle must still be able to operate fully autonomously.

That being said, cyberattacks still pose a greater risk to autonomous vehicle motorists than to the legacy vehicles currently on the road. In the wake of a successful cyberattack, threat actors may gain access to sensitive personal information or even gain control over key vehicle systems. Manufacturers and software providers will need to take every step necessary to protect their vehicles and systems from compromise.

Lastly, there are also social and cultural barriers to the mainstreaming of autonomous vehicles, with many people across the globe still very uncomfortable with the idea of giving up control of their cars to a machine. Once consumers can experience autonomous drives and see how the technology continuously monitors a complete 360-degree view around the vehicle and does not get drowsy or distracted, confidence that autonomous vehicles are safe and secure will grow, and adoption rates will rise.

The future of travel is (nearly) upon us

As the world moves closer to a future where autonomous vehicles are a ubiquitous presence on our roads, the complex challenges that must be addressed to make this a safe and viable option become ever more apparent. The adoption of autonomous vehicles is not simply a matter of developing the technology, but also requires a complete overhaul of how we approach transportation systems and infrastructure.

Companies and researchers are investing heavily in solving the complex challenges posed by autonomous vehicle adoption. For example, one way researchers are addressing the ethical questions raised by autonomous vehicles making life-or-death decisions is by developing ethical frameworks that guide the decision-making processes of these vehicles.

These frameworks define the principles and values that should be considered when autonomous vehicles encounter ethical dilemmas, such as deciding between protecting the safety of passengers versus that of pedestrians. Such frameworks can help ensure that autonomous vehicles make ethical decisions that are consistent with societal values and moral principles.

Significant investments are also being made into updating existing infrastructure to accommodate autonomous vehicles. Roads, highways, and parking areas must be equipped with the necessary infrastructure to support autonomous vehicles, such as sensors, cameras, and communication systems.

Companies are also working collaboratively with regulators, researchers, and OEMs to develop policies that ensure that autonomous vehicles can operate safely alongside traditional vehicles. This includes considerations such as how traffic signals, road markings, and signage need to be adapted to support autonomous vehicles.

In 2021, for example, Microsoft teamed up with a market-leading self-driving car innovator to unlock the potential of cloud computing for autonomous vehicles, leveraging Microsoft Azure to commercialize autonomous vehicle solutions at scale.

Another global automotive group also recently announced a collaboration with Microsoft to build a dedicated cloud-based platform for its autonomous car systems that are currently in development. This ties in with their ambitious plans to invest more than USD32 billion in the digitalization of the car by 2025.

NVIDIA is also taking bold steps to fuel the growth of the autonomous vehicle market. The NVIDIA DRIVE platform is a full-stack AI compute solution for the automotive industry, scaling from advanced driver-assistance systems for passenger vehicles to fully autonomous robotaxis. The end-to-end solution spans from the cloud to the car, enabling AI training and simulation in the data center, in addition to running deep neural networks in the vehicle for safe and secure operations. The platform is being utilized by hundreds of companies in the industry, from leading automakers to new energy vehicle makers.

Key takeaways

There is little doubt that the future of human mobility is built upon the ground-breaking innovation and technological capabilities of autonomous vehicles. While some challenges still exist, the underlying technology continues to mature and improve, paving the way for an increase in the adoption of self-driving cars long term.

The technology may soon proliferate and displace other, less safe modes of transport, with huge potential upsides for many aspects of our daily lives: saving lives by reducing the number of accidents, decreasing commute times, optimizing traffic flow and patterns to lessen congestion, and extending the freedom of mobility to all.

As vehicle manufacturers and software firms continue to iterate on autonomous vehicle technology, educate the public on its benefits, and work with lawmakers to overcome regulatory hurdles, we may all soon enjoy a new world, one where technology gets us safely from one destination to another, leaving us free to simply enjoy the view.

Learn more

Get started with Azure HPC and Azure AI infrastructure today or request an Azure HPC demo.

Learn more about Azure AI infrastructure for manufacturing:

AI-first infrastructure for Smart Manufacturing.

How AI and the Cloud are transforming computational engineering in manufacturing and consumer packaged goods.

Powering AI innovations with Azure AI infrastructure.

1. https://www.alliedmarketresearch.com/autonomous-vehicle-market
The post Removing barriers to autonomous vehicle adoption with Microsoft Azure appeared first on Azure Blog.

Azure OpenAI Service: 10 ways generative AI is transforming businesses

Technology is advancing at an unprecedented pace, and businesses are seeking innovative ways to maintain a competitive edge. Nowhere is this truer than in the realm of generative AI. From generating realistic images and videos to enhancing customer experiences, generative AI has proven to be a versatile tool across various industries. In this article, we explore 10 ways businesses are utilizing this game-changing technology to transform their operations and drive growth.

Content creation and design: Effective content creation and design are crucial for attracting and engaging customers. Generative AI enables businesses to create visually appealing and impactful content quickly and efficiently, helping them stand out in a crowded marketplace. Generative AI has revolutionized content creation by generating high-quality images, videos, and graphics. From designing logos and product visuals to creating engaging social media content, businesses are using generative AI algorithms to automate the creative process—saving time and resources. The company Typeface ingests information about the brand, including style guidelines, images, and product details. Then, with just a few clicks, customers can generate an assortment of suggested images and text—pre-defined in templates for different use cases—that employees can select and customize for use in an online campaign, marketing email, blog post, or anywhere the company wants to use it.

Accelerated automation: Automating IT tasks improves employee experiences, enhances customer interactions, and drives more efficiency within a company’s developer community. Providing employees with reliable automated support leads to increased efficiency, improved work life, and reduced operational costs. AT&T is using Azure OpenAI Service to enable IT professionals to request resources like additional virtual machines; migrate legacy code into modern code; and empower employees to complete common human resources tasks, such as changing withholdings, adding a dependent to an insurance plan, or requisitioning a computer for a new hire.

Personalized marketing: Personalization increases the chances of customer engagement and conversion and can significantly improve marketing ROI. Generative AI enables businesses to deliver hyper-personalized marketing campaigns. By analyzing customer data, generative algorithms can create dynamic content tailored to an individual’s preferences—optimizing engagement and conversion rates. Through the Take Blip platform and Azure OpenAI Service, brands can have one-on-one conversations that include an infinite flow of interactions with each customer. Interactions are digitized: customers’ requests, intentions, and desires can be recorded and used to tune the platform, making future interactions much more productive.

Chatbots and virtual assistants: Chatbots and virtual assistants powered by generative AI provide instant and accurate responses to customer queries. These intelligent systems can understand and respond to customer queries, provide recommendations, and offer personalized support—enhancing customer service, reducing wait times, improving operational efficiency, and boosting customer satisfaction and loyalty. By using a common chatbot framework along with Azure Bot Service, Johnson & Johnson employees without technical training can now build their own bots to serve their teams and customers at a fraction of the time and cost it took to develop previous chatbot projects.

Product and service innovation: Staying innovative and meeting evolving customer demands is essential for business success. Local reporters used to be specialists—they focused their time on investigation and writing. Today, they need to be generalists who can create both written and video content and who know how to maximize viewership on Facebook, Instagram, TikTok, YouTube, and potentially many other distribution channels. Nota has used Microsoft Azure OpenAI Service to build two AI-assisted tools—SUM and VID. These tools do a lot of the heavy lifting needed to optimize stories for distribution and turn written pieces into engaging videos that can produce up to 10 times as much revenue as written pieces.

Language translation and natural language processing: In a globalized world, language barriers can hinder communication and business growth. Generative AI has improved language translation and natural language processing capabilities. Businesses can use generative models to accurately translate content in real time, enabling seamless communication across borders and bridging language barriers. Microsoft Azure AI services augment HelloTalk’s AI learning tools and technical capabilities, allowing users to connect with the world through language and culture exchange.

Fraud detection and cybersecurity: Businesses face constant threats from fraudsters and cyberattacks. By analyzing patterns and anomalies in large datasets, businesses can leverage generative models to detect and prevent fraud, safeguard sensitive information, and protect their digital assets. Using federated learning techniques along with Azure Machine Learning and Azure confidential computing, Swift and Microsoft are building an anomaly detection model for transactional data—all without copying or moving data from secure locations.

Predictive analytics and forecasting: Accurate predictions and forecasting are vital for effective decision-making and operational efficiency. Generative AI models excel in predictive analytics and forecasting. By analyzing historical data and identifying patterns, businesses can leverage generative algorithms to make accurate predictions and informed decisions, optimizing supply chain management, inventory forecasting, and demand planning. Azure IoT helps Husky meet their system performance needs and maintain service levels for their customers. It scales quickly as they onboard new Advantage+Elite customers and reduces the time and resources spent on infrastructure maintenance.

Creative writing and content generation: Content generation can be time-consuming and resource-intensive. Generative AI algorithms automate the content creation process, allowing businesses to generate articles, blog posts, and other written materials quickly. This technology assists content creators and ensures a consistent flow of fresh and engaging content for audiences. Businesses and content creators can use these models to generate articles, blog posts, advertising copy, and more—saving time for content creators and providing fresh content to engage audiences. With Azure OpenAI Service, CarMax is creating content for its website much more efficiently, freeing up its editorial staff to focus on producing strategic, longer-form pieces that require more insight. Letting Azure OpenAI Service take care of data-heavy summarization tasks gives them time to be more creative and feel more fulfilled.

Medical research and diagnosis: The healthcare industry can benefit from faster and more accurate diagnoses, improving patient outcomes. Researchers can utilize generative models to analyze medical images, detect abnormalities, and aid in the development of new treatments. Additionally, generative AI algorithms can assist in diagnosing diseases by analyzing patient symptoms and medical records, potentially leading to more accurate and timely diagnoses. At Cambridgeshire and Peterborough NHS Foundation Trust, a single patient’s case notes could have up to 2,000 documents. In the past, if you needed information that was stored 1,600 documents ago, you weren’t going to find it. Now, using Azure Cognitive Search, it takes as little as three seconds to search for a keyword across those 2,000 documents and find it (a minimal example of this kind of keyword query is sketched below).
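As referenced in the item above, a minimal keyword query of that kind might look like the following with the azure-search-documents SDK; the service endpoint, index name, and field names are hypothetical placeholders, not details from the Trust’s deployment.

```python
# Minimal sketch: keyword search across an indexed document collection with
# Azure Cognitive Search. Endpoint, index name, and field names are hypothetical.
import os
from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient

client = SearchClient(
    endpoint="https://YOUR-SEARCH-SERVICE.search.windows.net",  # placeholder endpoint
    index_name="patient-case-notes",                            # hypothetical index name
    credential=AzureKeyCredential(os.environ["SEARCH_API_KEY"]),
)

# Return the top matches for a keyword and show a couple of fields.
results = client.search(search_text="penicillin allergy", top=10)
for doc in results:
    # "documentId" and "title" are hypothetical field names from the index schema.
    print(doc.get("documentId"), doc.get("title"))
```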

Each of the 10 ways mentioned above addresses significant challenges and opportunities facing businesses today. Azure OpenAI Service empowers businesses to streamline processes, enhance customer experiences, drive innovation, and make data-driven decisions—resulting in improved efficiency, profitability, and competitiveness. In the case of generative AI, what’s good for business is also good for its customers. By leveraging the power of machine learning and generative algorithms, businesses can improve customer experiences while also gaining a competitive advantage in today’s rapidly evolving digital landscape.

Our commitment to responsible AI

Microsoft has a layered approach for generative models, guided by Microsoft’s responsible AI principles. In Azure OpenAI Service, an integrated safety system provides protection from undesirable inputs and outputs and monitors for misuse. In addition, Microsoft provides guidance and best practices for customers to responsibly build applications using these models and expects customers to comply with the Azure OpenAI code of conduct. With GPT-4, new research advances from OpenAI have enabled an additional layer of protection. Guided by human feedback, safety is built directly into the GPT-4 model, which enables the model to be more effective at handling harmful inputs, thereby reducing the likelihood that the model will generate a harmful response. 

Get started with Azure OpenAI Service 

Apply for access to Azure OpenAI Service by completing this form. 

Learn about Azure OpenAI Service and the latest enhancements. 

Get started with GPT-4 in Azure OpenAI Service in Microsoft Learn. 

Read our Partner announcement blog, Empowering partners to develop AI-powered apps and experiences with ChatGPT in Azure OpenAI Service. 

Learn how to use the new Chat Completions API (preview) and model versions for ChatGPT and GPT-4 models in Azure OpenAI Service. 

The post Azure OpenAI Service: 10 ways generative AI is transforming businesses appeared first on Azure Blog.