Notified team gets smart on MLOps through Advanced Solutions Lab for Machine Learning

Editor’s note: Notified, one of the world’s largest newswire distribution networks, launched a public relations workbench that uses artificial intelligence to help customers pinpoint relevant journalists and expand media coverage. Here’s how they worked with Google Cloud and the Advanced Solutions Lab to train their team on Machine Learning Operations (MLOps).

At Notified, we provide a global newswire service for customers to share their press releases and increase media exposure. Our customers can also search our database of journalists and influencers to discover writers who are likely to write relevant stories about their business. To enhance our offering, we wanted to use artificial intelligence (AI) and natural language processing (NLP) to uncover new journalists, articles, and topics—ultimately helping our customers widen their outreach.

While our team has expertise in data engineering, product development, and software engineering, this was the first time we had deployed an NLP API for use by other products. The deployment was new territory, so we needed a solid handle on MLOps to ensure a responsive experience for our customers. That meant nailing down the entire process: ingesting data, building machine learning (ML) pipelines, and finally deploying an API so our product team could connect their continuous integration/continuous delivery (CI/CD) pipelines.

First, I asked around to see how other companies had solved this MLOps learning gap. But even at digital-first organizations, the problem hadn’t been addressed in a unified fashion. They may have used tools to support their MLOps, but I couldn’t find a program that trained data scientists and data engineers on the deployment process.

Teaming up with Google Cloud to tailor an MLOps curriculum

Seeing that disconnect, I envisioned a one-week MLOps hackathon to ramp up my team. I reached out to Google Cloud to see if we could collaborate on an immersive MLOps training.
Knowing Google’s reputation as an AI pioneer, I was confident that ML engineers from the Advanced Solutions Lab (ASL) could coach my team to help us build amazing NLP APIs. ASL already had a fully built, deep-dive curriculum on MLOps, so we worked together to tailor our courses and feature a real-world business scenario that would give my team the insights they needed for their jobs. That final step of utilization, including deployment and monitoring, was crucial. I didn’t want to just build a predictive model that no one could use.

ASL really understood my vision for the hackathon and the outcomes I wanted for my team. They never said it couldn’t be done; instead, we collaborated on a way to build on the existing curriculum, add a pre-training component, and complete it with a hackathon. The process was smooth because ASL had the MLOps expertise I needed, they understood what I wanted, and they knew the constraints of the format. They were able to flag areas that were likely too intensive for a one-week course, and they quickly provided design modules we hadn’t thought to cover. They truly were part of our team. In the end—just four months after our initial conversation—we launched our five-week MLOps program. The end product went far beyond my initial hackathon vision to deliver exactly what I wanted, and more.

Starting off with the basics: Pre-work

There was so much we wanted to cover in this curriculum that it made sense to have a prerequisite learning plan ahead of our MLOps deep-dive training with the ASL team. Through a two-week module, we focused on the basics of data engineering pipelines and ramped up on Kubeflow—an ML toolkit for Kubernetes—as well as NLP and BigQuery, a highly scalable data warehouse on Google Cloud.

Getting back in the classroom: MLOps training

After the prerequisite learning was completed, we transitioned into five days of live, virtual training on advanced MLOps with the ASL team. It was a packed program, but the instructors were amazing.
For this component, we needed to center on real-world use cases that could connect back to our newswire service, making the learning outcomes actionable for our team. We also wanted to be extremely mindful of data governance and security, so we designed a customized lab based on public datasets.

Taking a breather and asking questions: Office hours

After nearly three weeks, our team members needed a few days off to absorb all the new information and process everything they had learned. There was a risk of going into the hackathon burnt out. Office hours solved that. We gave everyone three days to review what they had learned and get into the right headspace to ace the hackathon.

Diving in: Hackathon and deployment

Finally, the hackathon was a chance for our team to implement what they had learned, drill down on our use cases, and actually build a proof of concept—or, best case, a working model. Our data scientists built an entity extraction API and a topics API using Natural Language AI to target articles housed in our BigQuery environment. On the data engineering side, we built a pipeline by loading data into BigQuery. We also developed a dashboard that tracks pipeline performance metrics such as records processed and key attribute counts.

For our DevOps genius, Donovan Orn, the hackathon was where everything started to click. “After the intensive, instructor-led training, I understood the different stages of MLOps and continuous training, and was ready to start implementing,” Orn said. “The hackathon made a huge difference in my ability to implement MLOps and gave me the opportunity to build a proof of concept.
ASL was totally on point with their instruction and, since the training, my team has put a hackathon project into production.”

Informing OSU curriculum with a new approach to teaching MLOps

The program was such a success that I plan to use the same framework to shape the MLOps curriculum at Oklahoma State University (OSU), where I’m a corporate advisory board member. The format we developed with ASL will inform the way we teach MLOps to students so they can learn the interactions between data scientists and data engineers that many organizations rely on today. Our OSU students will practice MLOps through real-world scenarios so they can solve actual business problems. And the best part is that ASL will lead a tech talk on Vertex AI to help our students put it into practice.

Turning our hackathon exercise into a customer-ready service

In the end, both my team and Notified customers have benefited from this curriculum. Not only did the team improve their MLOps skills, but they also created two APIs that have already gone into production and significantly augmented the offering we deliver to customers. We’ve doubled the number of related articles we’re able to identify, and we’re discovering thousands of new journalists and influencers every month. For our customers, that means they can cast a much wider net to share their stories and grow their media coverage. Up next is an API that will pinpoint more reporters and influencers to add to our database of curated journalists.
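As a rough illustration of the kind of pipeline metrics the team’s dashboard tracked (records processed and key attribute counts), a minimal sketch might aggregate records as they land in the warehouse. All record and field names here are hypothetical, not Notified’s actual schema:

```python
from collections import Counter

def pipeline_metrics(records, key_attributes):
    """Aggregate simple pipeline health metrics from a batch of records.

    records: list of dicts representing rows loaded into the warehouse.
    key_attributes: attribute names whose non-empty counts we track.
    """
    metrics = {"records_processed": len(records)}
    counts = Counter()
    for record in records:
        for attr in key_attributes:
            if record.get(attr):  # count only present, non-empty values
                counts[attr] += 1
    metrics["key_attribute_counts"] = dict(counts)
    return metrics

# Hypothetical article records as they might arrive from a newswire pipeline.
batch = [
    {"article_id": 1, "journalist": "A. Writer", "topics": ["tech"]},
    {"article_id": 2, "journalist": "", "topics": ["finance"]},
    {"article_id": 3, "journalist": "B. Reporter", "topics": []},
]
print(pipeline_metrics(batch, ["journalist", "topics"]))
# → {'records_processed': 3, 'key_attribute_counts': {'journalist': 2, 'topics': 2}}
```

In production such counts would come from the BigQuery pipeline itself; the point is that a handful of simple aggregates is enough to surface missing or malformed key attributes early.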
Source: Google Cloud Platform

Get value from data quickly with Informatica Data Loader for BigQuery

If data is currency in today’s digital environment, then organizations should waste no time in making sure every business user has fast access to data-driven insights. Informatica and Google Cloud are working together to make it happen. We’re excited to share that Informatica will provide a free service on Google Cloud called Informatica Data Loader for Google BigQuery, which accelerates data uploads and keeps data flowing so that people can get to the insights and answers they need faster. The company made the announcement at Informatica World on May 24, 2022, describing Informatica Data Loader as a tool to mitigate lengthy data upload times and associated high costs — challenges that are only growing as organizations ingest more data from more sources.

Maintaining a healthy data pipeline from multiple platforms, applications, services, and other sources requires more work as the number of sources grows. But with Informatica Data Loader, companies can quickly ingest data for free from over 30 common sources into their Google BigQuery cloud data warehouse while Informatica technology automates pipeline ingestion on the back end. This shortens time-to-value for data projects from what could be weeks or months to just minutes, and it frees people up for more strategic data work.
The Informatica Data Loader empowers Google Cloud customers to:

Centralize disparate data sources on BigQuery for better visibility into data resources and faster delivery to whoever needs the data
Quickly load data into BigQuery in only three steps, with zero setup, zero code, and zero cost
Operationalize data pipelines with the power, performance, and scale of Informatica’s Intelligent Data Management Cloud (IDMC) at no cost
Reduce maintenance resource requirements by eliminating the need to fix broken pipelines and keep up with changing APIs
Allow non-technical users across the organization to easily access, manage, and analyze data

Informatica partnership streamlines data transformation

This isn’t the first time Google Cloud has partnered with Informatica to help customers get the most value from their data. Google Cloud-validated connectors from Informatica help customers streamline data transformations and quickly move data from any SaaS application, on-premises database, or big data source into Google BigQuery. Our partnership has helped hundreds of customers on Google Cloud.

“Data is fundamental to digital transformation, and we partner closely with Informatica to make it very easy for enterprises to bring their data from across platforms and environments into the cloud,” said Gerrit Kazmaier, VP and GM of Databases, Data Analytics, and Business Intelligence at Google Cloud. “The launch of Informatica Data Loader will further simplify the path for customers to bring data into BigQuery for analysis, and accelerate their data-driven business transformations.”

According to Informatica, Data Loader is the industry’s first zero-cost, zero-code, zero-DevOps, zero-infrastructure-required cloud data management SaaS offering for departmental users.
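To make the “connect, select, load” flow concrete, here is a toy sketch of the three steps a loader automates. Every class and name below is a hypothetical stand-in written for illustration; the real Informatica Data Loader is configured through a UI with zero code, and the real target is BigQuery rather than an in-memory dict:

```python
class InMemorySource:
    """Stand-in for a SaaS or database source (step 1: connect)."""
    def __init__(self, tables):
        self.tables = tables

    def read(self, table):
        return list(self.tables[table])

class InMemoryWarehouse:
    """Stand-in for a BigQuery dataset (the load target)."""
    def __init__(self):
        self.tables = {}

    def load(self, table, rows):
        self.tables.setdefault(table, []).extend(rows)

def run_load(source, warehouse, selected_tables):
    """Step 2: select objects; step 3: load them into the target."""
    for table in selected_tables:
        warehouse.load(table, source.read(table))

source = InMemorySource({"orders": [{"id": 1}, {"id": 2}], "users": [{"id": 7}]})
wh = InMemoryWarehouse()
run_load(source, wh, ["orders"])
print(wh.tables)  # only the selected table is loaded
```

The value proposition of the managed service is precisely that this plumbing, plus schema mapping and API churn, is handled for you.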
Google Cloud customers can access Informatica Data Loader directly from the Google BigQuery console and ingest data from dozens of common sources, including MongoDB, ServiceNow, Oracle, SQL Server, NetSuite, Microsoft SharePoint, and more. The Informatica IDMC solution is available in the Google Cloud Marketplace, but Informatica is making Informatica Data Loader available to all Google BigQuery customers, whether they use IDMC or not. Informatica Data Loader shares a common AI-powered metadata intelligence and automation layer with IDMC, but companies can subscribe to each use case individually.

“Expanding our strategic partnership with Google Cloud beyond enterprise cloud data management to offer free, fast, and frictionless data loading to all Google customers represents a new chapter in our partnership and brings the power of IDMC to everyone,” said Jitesh Ghai, Informatica’s Chief Product Officer. “With the launch of Informatica Data Loader for Google BigQuery, we are enabling every organization to put the power of their data in the hands of business users so they can move from data ingestion to insights at a speed never before possible.”

Learn more about the Informatica Data Loader for Google BigQuery here.
Source: Google Cloud Platform

Google Cloud Data Heroes Series: Meet Antonio, a Data Engineer from Lima, Peru

Google Cloud Data Heroes is a series where we share stories of the everyday heroes who use our data analytics tools to do incredible things. Like any good superhero tale, we explore our Google Cloud Data Heroes’ origin stories, how they moved from data chaos to a data-driven environment, what projects and challenges they are overcoming now, and how they give back to the community.

In this month’s edition, we’re pleased to introduce Antonio! Antonio is from Lima, Peru, and works as a full-time Lead Data Engineer at Intercorp Retail and a co-founder of Datapath. He’s also a part-time data teacher, data writer, and all-around data enthusiast. Outside of his allegiance to data, Antonio is a big fan of the Marvel world and will take any chance to read original comic books and collect Marvel souvenirs. He’s also an avid traveler and enjoys the experience of reliving family memories through travel. Antonio is proudly pictured here atop a mountain in Cayna, Peru, where all of his grandparents lived.

When were you introduced to Google Cloud and how did it impact your career?

In 2016, I applied for a Big Data diploma at the Universidad Complutense de Madrid, where I had my first experience with the cloud. That diploma opened my eyes to a new world of technology and allowed me to get my first job as a Data Engineer at Banco de Crédito del Perú (BCP), the largest bank and supplier of integrated financial services in Perú, and the first company in Peru to use Big Data technologies. At BCP, I developed pipelines using Apache Hadoop, Apache Spark, and Apache Hive on an on-premises platform. In 2018, while I was teaching Big Data classes at the Universidad Nacional de Ingeniería, I realized that topics like deploying a cluster on a traditional PC were difficult for my students to learn without their own hands-on experience.
At the time, only Google Cloud offered free credits, which was fantastic for my students because they could start learning and using cloud tools without worrying about costs. In 2019, I wanted a change in my career and left on-prem technologies to specialize in cloud technologies. After many hours of study and practice, I earned the Associate Cloud Engineer certification at almost the same time I applied for a Data Engineer position at Intercorp, where I would need to use GCP data products. This new job pushed me to build my knowledge and skills on GCP and matched what I was looking for. Months later, I obtained the Professional Data Engineer certification. That certification, combined with good performance at work, allowed me to get a promotion to Data Architect in 2021. In 2022, I started in the role of Lead Data Engineer.

How have you given back to your community with your Google Cloud learnings?

To give back to the community, once a year I organize a day-long conference called Data Day at Universidad Nacional Mayor de San Marcos, where I talk about data trends, give advice to college students, and call for more people to pursue careers in cloud. I encourage anyone willing to learn, and I have received positive comments from people in India and Latin America. Another way I give back is by writing articles about my work experiences and publishing them on sites like Towards Data Science, the Airflow Community, and the Google Cloud Community Blog.

Can you highlight one of your favorite projects you’ve done with GCP’s data products?

At Intercorp Retail, the digital marketing team wanted to increase online sales by giving recommendations to users. This required the Data & Analytics team to build a solution to publish product recommendations related to an item a customer is viewing on a web page. To achieve this, we built an architecture that looks like the following diagram. We had several challenges. The first was finding a backend that supports millions of requests per month.
After some research, we decided to go with Cloud Run because of its ease of development and deployment. The second decision was to define a database for the backend. Since we needed a database that responds in milliseconds, we chose Firestore. Finally, we needed to record all the requests made to our API to identify any errors or bad responses. In this scenario, Pub/Sub and Dataflow allowed us to do it in a simple way without worrying about scaling. After two months, we were ready to see it on a real website (see below). For future technical improvements, we’re considering using Apigee as our API proxy to gather all the requests and route them to the correct endpoint, and Cloud Build as an alternative for our deployment process.

What’s next for you and what do you hope people will take away from your data hero story?

Thanks to the savings I’ve accumulated while working over the past five years, I recently bought a house in Alabama. For me, this was a big challenge because I have only lived and worked outside of the United States. In the future, I hope to combine my data knowledge with the real estate world and build a startup that facilitates the home-buying process for Latin American people. I’ll also focus on gaining more hands-on experience with data products, and on giving back to my community through articles and, soon, videos. I dream of one day presenting a successful case study of my work at a big conference like Google Cloud Next.

If you are reading this and you are interested in the world of data and cloud, you just need an internet connection and some invested effort to kickstart your career. Even if you are starting from scratch and are from a developing country like me, believe that it is possible to be successful. Enjoy the journey and you’ll meet fantastic people along the way. Keep learning, just as you have to keep exercising to stay in shape.
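The request path in the recommendation architecture described above (a Cloud Run service doing a fast Firestore lookup, with every request published for downstream error analysis via Pub/Sub and Dataflow) can be sketched with plain-Python stand-ins for the managed services. All names and data here are hypothetical, not Intercorp’s actual code:

```python
import json
import time

# Stand-ins for the managed services: a dict plays the role of Firestore
# (millisecond key lookups), and a list plays the role of a Pub/Sub topic
# that Dataflow would later drain to spot errors and bad responses.
FIRESTORE = {"sku-123": ["sku-456", "sku-789"]}
PUBSUB_TOPIC = []

def recommend(item_id):
    """Handle one request to the recommendations API."""
    recs = FIRESTORE.get(item_id)
    status = 200 if recs is not None else 404
    # Publish the request record so downstream tooling can audit every call.
    PUBSUB_TOPIC.append(json.dumps(
        {"item": item_id, "status": status, "ts": time.time()}
    ))
    return {"status": status, "recommendations": recs or []}

print(recommend("sku-123"))  # known item: returns related SKUs
print(recommend("sku-999"))  # unknown item: a 404 is logged for review
```

The design choice mirrors the article: the hot path touches only the key-value store, while logging is a fire-and-forget publish, so observability never slows the customer-facing response.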
Finally, if there is anything I can help you with, just send me a message and I would be happy to give you advice.

Begin your own Data Hero journey

Ready to embark on your Google Cloud data adventure? Begin your own hero’s journey with GCP’s recommended learning path, where you can achieve badges and certifications along the way. Join the Cloud Innovators program today to stay up to date on more data practitioner tips, tricks, and events. If you think you have a good Data Hero story worth sharing, please let us know! We’d love to feature you in our series as well.
Source: Google Cloud Platform

Introducing the Microsoft Intelligent Data Platform

We are moving to a world where every application needs to be intelligent and adaptive to real-time model learning. As businesses build modern data capabilities, they must make decisions at the speed of human thought. Developers are challenged by this, given the huge silos that exist between databases and analytics products, and the complexity of a fragmented data estate can hamper agility and innovation. Data engineers, data scientists, and business analysts struggle with the complexity of making data integration, data warehousing, machine learning operations (MLOps), and business intelligence (BI) work together. What is needed is a consistent data ecosystem.

To help address the fragmentation that exists today between databases, analytics, and governance, and to enable organizations to unlock new capabilities, we shared several exciting announcements today at Microsoft Build. These announcements demonstrate our continued innovation and investment in the data products our customers have come to know and trust. They will enable our customers to achieve sustained agility: the ability to pivot and adapt in real time, add layers of intelligence to their applications, unlock fast and predictive insights, and govern their data—wherever it resides.

Accelerate innovation

Today, we unveiled the new Microsoft Intelligent Data Platform, the leading cloud data platform that fully integrates databases, analytics, and governance. This seamless data platform empowers organizations to invest more time creating value rather than integrating and managing their data estate.

Furthering our mission of integration, Azure Synapse Link for SQL removes data movement barriers, providing a seamless data pipeline to Azure Synapse Analytics, and enables near-real-time analytics for SQL Server 2022 and Azure SQL Database. Once Azure Synapse Link transfers data to Azure Synapse Analytics, data can be used for advanced analytics with no performance impact on transactional workloads. Over 10 million Azure SQL databases globally can now leverage this capability.

Achieve agility

Through continued investments in databases and analytics, we are empowering customers to achieve agility in new ways.

Now in preview, SQL Server 2022 is our most Azure-enabled release yet, with continued innovation across performance, security, and availability. By connecting SQL Server to Azure through seamless disaster recovery to Azure SQL Managed Instance, SQL Server 2022 provides true resilience. The latest Azure Arc innovation is here with the Business Critical tier of Azure Arc-enabled SQL Managed Instance now generally available—helping customers run the most demanding mission-critical workloads in hybrid and multicloud environments. We continue to invest in bringing the best developer experience to any cloud, with Azure SQL Database releasing new features that help simplify and expedite application development and reduce time to market for developers. In addition, the ledger feature in Azure SQL Database, now generally available, eliminates the additional cost and complexity of decentralized blockchain technology while providing the benefits of blockchain in a fully managed and familiar SQL environment.

New innovations in cloud-native NoSQL and open-source databases give developers the freedom to build on their terms. Azure Cosmos DB has an enhanced 30-day trial experience, now generally available, and has introduced new burst capacity and elasticity features in preview. Application traffic spikes don’t have to mean spikes in costs: the new burst capacity and elasticity features ensure applications deliver high performance during peak times while remaining cost-effective. The Azure Database for MySQL Flexible Server Memory-Optimized service tier is now the improved "Business Critical" tier for high-performance transactional or analytical applications, delivering a 1.5x performance improvement over Single Server along with faster failover to standby.

When it comes to analytics, Azure Synapse is simply unmatched with more meaningful integrations that enable existing Synapse customers to get even more value from their Microsoft 365 data. Microsoft Graph Data Connect empowers customers to securely export their Microsoft 365 data estate, and it’s only available on Microsoft Azure. This enables every customer to unlock new actionable business insights with the employee and customer collaboration data that comes from Microsoft 365.

Customers like KPMG use Microsoft Intelligent Data Platform integrations to power KPMG Digital Gateway, bringing together a wealth of tools to help customers tackle regulatory change, turn data into value, and streamline compliance and planning while enabling effective collaboration across tax, legal, and finance departments. KPMG Digital Gateway puts its investments in machine learning, data analytics, powerful visualizations, and AI technologies all in one place. Client data is provided once and leveraged across applications, saving time and money.

To continue creating simple frictionless experiences, we announced the preview of datamart in Power BI, a new Power BI Premium self-service capability that enables users to uncover actionable insights through their own data sets. This out-of-the-box feature empowers developers and business analysts to build a datamart that can be centrally governed and managed for workloads up to half a terabyte—accelerating time to insight while alleviating demands on IT. This new feature brings the power of data warehousing and puts it in the hands of individual Power BI developers and analysts, helping you uncover more insights and drive digital transformation at every level of a business.

Build on a trusted platform

Meeting data privacy and governance standards cannot be an afterthought. When governance is not deeply integrated where data lives, it is nearly impossible to meet regulatory requirements. 

Data governance has become top of mind for almost every organization as data moves fluidly across hybrid and multicloud environments, which makes it increasingly important to map the lineage of data. Dynamic Lineage for Azure SQL Database in Microsoft Purview is currently in preview; it further enriches the Microsoft Purview Data Map with details from actual runs of SQL stored procedures in Azure SQL Database, helping customers govern their data across hybrid and multicloud environments. We are also excited to announce that Microsoft Purview Data Estate Insights will be generally available in the coming months.

As we look beyond the horizon, machine learning and AI capabilities will be pivotal in harnessing the power of data in new ways. Azure Machine Learning now offers a Responsible AI dashboard in preview, making it easier for customers to debug machine learning models and make informed, data-driven decisions. The dashboard brings together capabilities such as data explorer, model interpretability, error analysis, counterfactual analysis, and causal inference analysis in a single view. In addition, Azure Machine Learning now offers a Responsible AI scorecard in preview to summarize model performance and insights so that all stakeholders can easily participate in compliance reviews.

The opportunity to accelerate innovation in your business and achieve agility across all your data is substantial. Now is the time to realize its limitless potential. We look forward to seeing what you can do with it.

Learn more about Azure at Microsoft Build 

Deepen your skills with hand-picked learning modules, documentation, and community content.  
Watch all the sessions you loved, or catch up on any you missed after the event with Build On-Demand. 
Continue learning! Join Microsoft experts through live technical content and events to keep your learning going post-Build.
Read how to build productively, collaborate securely, and scale innovation—no matter where in the world you are—with a comprehensive set of Microsoft developer tools and platforms.  
Read the latest on how Azure powers your app innovation and modernization with the choice of control and productivity you need to deploy apps at scale. 
Read how you can innovate faster and achieve greater agility with the Microsoft Intelligent Data Platform and turn your data into decisions.

Azure. Invent with purpose.
Source: Azure

Code, test, and ship your next app quickly and securely with Microsoft developer tools

Welcome to Microsoft Build, the event that’s all about celebrating the developer community! The work you do has the power to transform entire industries and keep critical businesses and services running through innovative solutions and applications. I couldn’t be more honored to champion this entire dev community as you create the future.

Microsoft was founded as a developer company, and almost 50 years later nothing has changed in that regard. From our earliest products to the powerful developer tools available now through Visual Studio, GitHub, and Azure, we keep development teams top of mind. Today, our full Microsoft Cloud stack provides an incredible platform for developers to build apps and solutions. I like to say: we’re the platform and you bring the innovation.

I take great inspiration from the innovative things you’re already building with our platform and tools. Development teams at companies big and small are choosing Microsoft to code and modernize. Gjensidige, the largest insurance company in Norway, is just one great example—they chose Azure and the combined capabilities of GitHub to modernize applications at enterprise scale. And I’m always amazed by the creativity and ingenuity that student developers are bringing to the platform.

The developer experience today

Whenever I talk to customers and my colleagues who code, a few themes emerge about what development teams need to be successful and how it’s evolved—especially in the last couple of years.

Security shifts left. The world’s reliance on technology continues to grow and so does the importance of security. With cyberattacks on the rise, we want to equip developers with the tools to shift security left, into code, so problems and concerns can be identified and fixed before a security breach even happens.

Collaboration, anywhere. In the age of hybrid work, it matters less where you sit and work. It could be in a home or traditional office setting or even in your local coffee shop. With dispersed development teams using different languages, devices, and networks, developers need to interact and connect with their teams—and have confidence they can deliver their best work from anywhere and that it will be secure.

Agility and productivity. With a hybrid work environment, teams need to stay agile and onboard new team members quickly and efficiently. Ideally, developers can focus on code instead of spending time setting up dev environments and managing infrastructure.

Today, I’m proud to share some news and updates designed to address these needs and improve the overall developer experience even further with our beloved tools and the Microsoft Cloud platform—all designed to help you quickly code and ship from anywhere with confidence.

Modern solutions and apps for hybrid teams

Through GitHub, we’re already delivering the developer environment of the future. GitHub Copilot is an AI pair programmer that empowers you to write code faster and with less work. Another way the development experience is changing is that work is moving to the cloud. GitHub Codespaces is a cloud development environment that works great for web apps, cloud-native applications, APIs, or backend development. But what if you’re working on a different workload, like desktop, mobile, embedded, or game development? Or if you’re using a version control system other than GitHub?

Enter Microsoft Dev Box. Microsoft Dev Box is a cloud solution that provides developers with self-service access to high-performance workstations preconfigured and ready-to-code for specific projects. Developers can get started coding quickly, without worrying about security, compliance, or cost control. It is tailored to meet the needs of today’s developers and integrated with Windows 365 so IT administrators can manage Dev Boxes and Cloud PCs together in Microsoft Intune and Microsoft Endpoint Manager. Microsoft Dev Box will soon be in preview.

With Microsoft Dev Box, leads can quickly create projects, configure images, and assign team members so they can get straight to code in seconds, from anywhere. No matter where in the world you’re working, the onboarding process for a new team has never been easier.

Start secure, stay secure

How we all work has changed forever over the last several years. Security is essential in a world of increasing cyberattacks and global uncertainty, with distributed teams sitting in different locations using different platforms, languages, and devices.

As I mentioned, I’ve heard from many of my developer colleagues about the need to identify and fix problems before they result in a security breach. The push protection feature in GitHub Advanced Security proactively protects against secret leaks. For another layer of security and added peace of mind, today we’re announcing the general availability of GitHub OpenID Connect (OIDC) with Azure AD workload identity federation to minimize the need for storing and accessing secrets. With this integration, developers can manage all cloud resource access securely in Azure.
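As a sketch of what OIDC-based federation looks like in practice, a GitHub Actions workflow can log in to Azure without a long-lived stored credential by granting the job an ID token and referencing a federated Azure AD app registration. The identifiers below are placeholders for your own app registration, and the exact workflow shape is illustrative rather than an official template:

```yaml
# Hypothetical workflow: deploy with a federated identity, no stored password.
name: deploy
on: [push]

permissions:
  id-token: write   # allow the job to request an OIDC token from GitHub
  contents: read

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: azure/login@v1
        with:
          # IDs of an Azure AD app configured with workload identity
          # federation for this repository (placeholders, not real values).
          client-id: ${{ secrets.AZURE_CLIENT_ID }}
          tenant-id: ${{ secrets.AZURE_TENANT_ID }}
          subscription-id: ${{ secrets.AZURE_SUBSCRIPTION_ID }}
      - run: az group list --output table
```

Because the token GitHub mints is short-lived and scoped to this repository and workflow, there is no reusable secret to leak even if the values above are exposed in logs.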

Increase your app resilience

As you build apps, you need to know the app can handle traffic and scale before it launches. We recently launched a preview of Azure Load Testing, an Azure service to help teams test and meet scale and performance goals with confidence. And since launch, we’ve seen customers of all sizes using the service to simulate high-scale load for apps running anywhere and catch performance bottlenecks early. We’ve heard you loud and clear about some additional features you’d like to see—testing private endpoints, using custom plugins, and more Azure regions. These features and more will be coming soon so be sure to visit Azure Updates frequently.
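Catching a performance bottleneck in a load test usually comes down to latency percentiles: p95 and p99 expose tail behavior that an average hides. This standalone sketch shows the nearest-rank percentile arithmetic on a set of response-time samples; it is purely illustrative and not part of the Azure Load Testing API.

```python
# Compute latency percentiles (p50/p95/p99) from response-time samples,
# the kind of summary a load-testing service reports. Nearest-rank method.
import math


def percentile(samples_ms: list[float], p: float) -> float:
    """Nearest-rank percentile of a list of latency samples (in ms)."""
    if not samples_ms:
        raise ValueError("no samples")
    ordered = sorted(samples_ms)
    rank = max(1, math.ceil(p / 100 * len(ordered)))  # 1-based nearest rank
    return ordered[rank - 1]


# Ten sampled response times in ms, including two slow outliers:
samples = [12, 15, 11, 250, 14, 13, 16, 12, 18, 900]
p50, p95, p99 = (percentile(samples, p) for p in (50, 95, 99))
# The median looks healthy (14 ms) while p95/p99 reveal the 900 ms tail.
```

The gap between p50 and p99 here is exactly the kind of signal that flags a bottleneck before launch.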

Start with Microsoft for an ideal developer experience

When you’re ready to take your next idea from code to the cloud, we want you to start with us. The session Rapidly code, test, and ship from secure cloud development environments is where you can learn more about these announcements and how we’re delivering a great developer experience to you, no matter where you’re working from or what you’re using to build. I also encourage you to view the Scaling cloud-native apps and accelerating app modernization session to learn about what’s new to help you build cloud-native apps.

There’s no shortage of great content throughout the Microsoft Build experience this week: new innovations, technical demos, learning opportunities, and plenty of fun with our After Hours sessions. I encourage you to take it all in and keep sharing with us so we can deliver an even better experience for you.

On behalf of the Azure team here at Microsoft, thanks for inspiring us with what you build.

Learn more about Azure at Microsoft Build

Deepen your skills with hand-picked learning modules, documentation, and community content.
Watch all the sessions you loved, or catch up on any you missed after the event with Build On-Demand.
Continue learning! Join Microsoft experts through live technical content and events to keep your learning going post-Microsoft Build.
Read the latest on how Azure powers your app innovation and modernization with the choice of control and productivity you need to deploy apps at scale. 
Read how you can innovate faster and achieve greater agility with the Microsoft Intelligent Data Platform and turn your data into decisions.

Azure. Invent with purpose.
Source: Azure

Scale your cloud-native apps and accelerate app modernization with Azure, the best cloud for your apps

Developers are essential to the world we live in today, and the work you do is critical to powering organizations in every industry. Every developer and development team brings new ideas and innovation. Our ambition with the Microsoft Cloud and Azure is to be the platform for all of this innovation, empowering the entire community as they build what comes next.

Microsoft was founded as a developer tools company, and developers remain at the very center of our mission. Today, we have the most used and beloved developer tools with Visual Studio, .NET, and GitHub. We offer a trusted and comprehensive platform to build amazing apps and solutions that help enable people and organizations across the planet to achieve more.

Over 95 percent of the world’s largest companies today are choosing Microsoft Azure to run their business, in addition to thousands of smaller and mid-size innovative organizations. The NBA uses Azure and AI capabilities to turn billions of in-game data points into customizable content for its fans. Stonehenge Technology Labs has increased developer velocity through its fast-growing commerce enhancement software, STOPWATCH, using Azure, Live Share, and Visual Studio.

With the Microsoft Cloud and Azure, we meet you where you are and make it easy for you to start your cloud-native journey—from anywhere. That means developers can use their favorite languages, open-source frameworks, and tools to code and deploy to the cloud and the edge, collaborating in a secure way and integrating different components in no time with low-code solutions.

Supporting all of this, here are some of the latest developments we’ll talk about at Microsoft Build this week. You can also view the Scaling cloud-native apps and accelerating app modernization session to learn more about these announcements. 

Build modern, cloud-native apps productively with serverless technologies and the best Kubernetes experience for developers

As new apps are built, you’ll want them to be cloud-native, since they’re designed to take full advantage of everything the cloud offers. Using cloud-native design patterns helps you achieve the agility, efficiency, and speed of innovation you need to deliver for your business. Meanwhile, the experience bar is rising: end users expect more from apps than ever. Product launches, peak shopping seasons, and sporting events are just a few examples of highly dynamic usage demands that modern apps must be prepared to handle.

Architectures and technologies like containers, serverless, microservices, APIs, and DevOps everywhere make this possible and offer the shortest path to cloud value. With Azure, GitHub, and the Microsoft Cloud, we’re working to make it easier for you to take advantage of all of these capabilities.

Azure Container Apps offers an ideal platform for application developers who want to run microservices in serverless containers without managing infrastructure. Today, Azure Container Apps is generally available and ready for you to use. It’s built on the strong open-source foundation of the Kubernetes ecosystem, which is core for cloud-native applications.

Azure Kubernetes Service (AKS) was built to be a destination for all developers and to provide the best managed Kubernetes experience, whether you’re trying it for the first time or using it every day. It delivers elastic provisioning of capacity without the need to manage the underlying compute infrastructure, and it’s the fastest way to spin up managed Kubernetes clusters and configure a seamless DevSecOps workflow with CI/CD integration.

A great example of a customer taking advantage of AKS today is Adobe. Adobe evolved to cloud-native practices a few years ago and adopted a microservices architecture. They chose AKS because of its scalable, flexible, and multi-cloud capabilities, and it brought faster development, from onboarding to production, all while providing automated guardrails with DevSecOps practices.

Today, we have some great updates to enhance the developer and operator experience on AKS even further, making it faster and easier than ever before so you can spend more time writing code. We’re launching the Draft extension and CLI, the preview of a new integrated AKS web application routing add-on, and a KEDA (Kubernetes Event-driven Autoscaling) extension.
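KEDA scales workloads by feeding event-source metrics (such as queue depth) into the standard Kubernetes Horizontal Pod Autoscaler, whose documented core formula is desired = ceil(currentReplicas * currentMetric / targetMetric). Below is a minimal sketch of that calculation under stated assumptions; it is not KEDA’s actual implementation.

```python
# The Kubernetes HPA's documented scaling rule, clamped to a replica range.
import math


def desired_replicas(
    current_replicas: int,
    current_metric: float,
    target_metric: float,
    min_replicas: int = 1,
    max_replicas: int = 10,
) -> int:
    """desired = ceil(current_replicas * current_metric / target_metric),
    clamped between min_replicas and max_replicas."""
    desired = math.ceil(current_replicas * current_metric / target_metric)
    return max(min_replicas, min(max_replicas, desired))


# Example: 3 pods each seeing 100 queued messages against a target of
# 50 per pod scale out to ceil(3 * 100 / 50) = 6 pods.
replicas = desired_replicas(3, 100, 50)
```

The same rule scales back down as the metric drains, which is what gives event-driven workloads their elasticity.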

The power and scalability of a cloud-native platform

What makes the Microsoft Cloud particularly rich as a development platform and ecosystem is the services it delivers and the underlying cloud infrastructure that allows you to focus on writing and shipping code. You can build upon and leverage a complete cloud-native platform, from containers to cloud-native databases and AI services.

Azure Cosmos DB is a fully managed, serverless developer database, and the only database service in the market to offer service level agreements (SLAs) guaranteeing single-digit-millisecond latency and 99.999 percent availability. These guarantees hold globally, at any scale, even through traffic bursts.
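To put the 99.999 percent figure in concrete terms, the downtime budget implied by an availability SLA is simply (1 - availability) times the minutes in the period. A quick worked example:

```python
# Convert an availability percentage into a monthly downtime budget.
def downtime_minutes(availability_pct: float, days: int = 30) -> float:
    """Minutes of allowed downtime over a period of `days` days."""
    total_minutes = days * 24 * 60
    return (1 - availability_pct / 100) * total_minutes


# Five nines over a 30-day month leave roughly 0.43 minutes (~26 seconds)
# of allowed downtime, versus about 43 minutes at three nines.
budget = downtime_minutes(99.999)
```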

Today, we’re improving Azure Cosmos DB elasticity with new burst capacity and an increase in serverless capacity to 1 TB, while you pay only for the storage and throughput you use. Now in preview, these capabilities are ideal for workloads with intermittent and unpredictable traffic and let developers build scalable, cost-effective cloud-native applications.
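The intuition behind burst capacity is that throughput left unused during quiet periods can absorb short spikes later. A token bucket is a common, simplified way to model that idea; the sketch below is illustrative only and does not reflect Cosmos DB’s actual request-unit accounting rules.

```python
# Simplified token-bucket model of "bank idle capacity, spend it in bursts".
class TokenBucket:
    def __init__(self, rate_per_sec: float, burst_capacity: float):
        self.rate = rate_per_sec        # steady-state refill (provisioned rate)
        self.capacity = burst_capacity  # ceiling on banked capacity
        self.tokens = burst_capacity    # start with a full bank

    def tick(self, seconds: float = 1.0) -> None:
        """Bank unused capacity over time, up to the burst ceiling."""
        self.tokens = min(self.capacity, self.tokens + self.rate * seconds)

    def try_consume(self, cost: float) -> bool:
        """Serve a request if banked capacity covers it, else throttle."""
        if self.tokens >= cost:
            self.tokens -= cost
            return True
        return False
```

A spike larger than the steady-state rate succeeds as long as enough capacity was banked; once the bank is drained, requests are throttled until refills catch up.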

We see customers innovating at a faster pace with cloud-native technologies. Azure Arc brings Azure security and cloud-native services to hybrid and multicloud environments, enabling you to secure and govern infrastructure and apps anywhere.

One example of a customer turning to Azure Arc is Canada’s largest bank, Royal Bank of Canada (RBC). Because Azure Arc is Kubernetes-based, it enables the company to leverage existing infrastructure investments and skill sets to manage and automate database deployments. Azure Arc-enabled data services allowed RBC to accelerate its time to market and product development, bringing more time and focus to innovation and integration of its products and capabilities.

We continue to innovate and add new capabilities to Azure Arc to enable hybrid and multicloud scenarios. Today, we’re excited to announce several new Azure Arc capabilities including the landing zone accelerator for Azure Arc-enabled Kubernetes, offering customers greater agility for cloud-native apps and tools to simplify hybrid and multicloud deployments—all while strengthening security and compliance. The landing zone accelerator provides best practices, guidance, and automated reference implementations for a fast and easy deployment.

Azure Managed Grafana is part of our approach to provide customers with all the tools they need to manage, monitor, and secure their hybrid and multicloud investments. We recently launched this integration so you can easily deploy Grafana dashboards, complete with Azure’s built-in high availability and security.

I’m excited to share that the Business Critical tier of Azure Arc-enabled SQL Managed Instance is now generally available to meet the most demanding critical business continuity requirements. This allows developers to build scalable, cost-effective cloud-native apps and add the same top-rated security and automated update capabilities they’ve trusted for decades.

Modernize Java applications

Java continues to be one of the most important programming languages, and we’re committed to helping Java developers run their Spring applications more easily in the cloud. As part of a long-time collaboration with Pivotal, now VMware, Azure Spring Cloud was created as a fully managed service for Spring Boot applications to solve the challenges of running Spring at scale. Azure Spring Cloud is a fully-featured platform for all types of Spring applications; to better reflect this, the service is now called Azure Spring Apps.

Azure Spring Apps Enterprise will be generally available in June, bringing fully managed VMware Tanzu components running on Azure and advanced Spring Runtime support. Customers like FedEx are already leveraging this collaboration on Azure Spring Apps to deliver an impactful solution for their end customers, helping predict estimated delivery times for millions of packages globally.

Build with Microsoft Cloud

Developing with the Microsoft Cloud puts the latest technologies in your hands and empowers you with both control and productivity. It offers a trusted and comprehensive platform so you can build great apps and solutions.

Microsoft Build is all about celebrating the work you do and helping you build what comes next. Be sure to view the session Scaling cloud-native apps and accelerating app modernization to learn more about these announcements. I also encourage you to view the Rapidly code, test, and ship from secure cloud development environments session for more depth on Microsoft’s developer tools. There’s an exciting week planned, so join in throughout the entire digital event for more announcements, customer stories, breakout sessions, learning opportunities, and technical demos. Enjoy the event experience. I can’t wait to see what you build.

Learn more about Azure at Microsoft Build

Deepen your skills with hand-picked learning modules, documentation, and community content.
Watch all the sessions you loved, or catch up on any you missed after the event with Build On-Demand.
Continue learning! Join Microsoft experts through live technical content and events to keep your learning going post-Build.
Read how to build productively, collaborate securely, and scale innovation—no matter where in the world with a comprehensive set of Microsoft developer tools and platform.
Read how you can innovate faster and achieve greater agility with the Microsoft Intelligent Data Platform and turn your data into decisions.

Azure. Invent with purpose.
Source: Azure