Learn how to tackle supply chain disruptions with SAP IBP and Google Cloud

Responding to multiple, simultaneous disruptive forces has become a daily routine for most demand planners. To forecast demand effectively, they need to predict the unpredictable while accounting for diverse and sometimes competing factors, including:

Labor and materials shortages
Global health crises
Shifting cross-border restrictions
Unprecedented weather impacts
A deepening focus on sustainability
Rising inflation

Innovators are looking to improve demand forecast accuracy by incorporating advanced AI and data analytics capabilities, which also speed up demand planning. According to a McKinsey survey of dozens of supply chain executives, 90% expect to overhaul planning IT within the next five years, and 80% expect to use, or already use, AI and machine learning in planning.

Google Cloud and SAP have partnered to help customers navigate these challenges and supply chain disruptions, starting with the upstream demand planning process and focusing on improving forecast accuracy and speed through integrated, engineered solutions. The partnership enables demand planners who use SAP IBP for Supply Chain in conjunction with Google Cloud services to access a growing repository of third-party contextual data for their forecasting, as well as an AI-driven methodology that streamlines workflows and improves forecast accuracy. Let’s take a closer look at these capabilities.

Unify data from SAP software with unique Google data signals

When it comes to demand forecasting and planning, the more high-quality, relevant contextual data you use, the better: it helps you understand the factors influencing your product sales, so you can sense trends, react to disruptions, and capitalize on market opportunities more quickly and accurately.

The expanded Google Cloud and SAP partnership helps customers who use SAP® Integrated Business Planning for Supply Chain (SAP IBP for Supply Chain) bring the public and commercial data sets that Google Cloud offers into their own instances of SAP IBP and include them in their demand planning models. So, in addition to the sales history, promotions, stakeholder inputs, and customer data typically found in SAP IBP, a demand planner can incorporate advertising performance, online search, consumer trends, community health data, and many more data signals from Google Cloud when working through demand scenarios.

More data enables more robust and accurate planning, so Google continues to build an ecosystem of data providers and grow the number of data sets available on Google Cloud. Current providers include the U.S. Census Bureau, the National Oceanic and Atmospheric Administration, and Google Earth, and partnerships are underway with Crux, Climate Engine, Craft, and Dun & Bradstreet to help companies identify and mitigate risk and build resilient supply chains.

Augmenting demand planning with additional external causal-factor data is a starting point for more accurate forecasting. For example, knowing which regional events may be happening, or which weather patterns may affect sales of your products, allows you to react faster to these changes by making sure adequate supply is provided. The result is a more accurate overall plan that reduces resource waste and out-of-stock events.
Planners can respond with more accurate and granular daily predictions about sales, pricing, sourcing, production, inventory, logistics, marketing, advertising, and more based on the expanded data.

Get more accurate forecasts with Google AI inside

Extending the already expansive algorithm selection available in SAP IBP, the 2205 release allows SAP IBP customers to access Google Cloud’s supply chain forecasting engine, built on Vertex AI (Google Cloud’s AI-as-a-platform offering), from within SAP IBP as part of their forecasting process. The benefit of using an AI-driven engine for demand forecasting is that it meaningfully improves forecast accuracy. Most demand forecasting today relies on manually configured, rules-based models; an AI-driven model, by contrast, keeps getting better at predicting demand as it works.

Take the fastest path from data to value with streamlined workflows

Vertex AI can include relevant contextual data sets for demand planning, and the results can be surfaced in SAP IBP for planners to incorporate when building their workflows. In addition to producing more accurate forecasts, planners can work faster and more efficiently as they build potential scenarios, meaning they can run more simulations than they do today and model a wider range of disruptions.

Customers of SAP IBP don’t have to do any of the heavy lifting. They simply share their data from SAP IBP with Google, then use the process workflow capabilities to set up automated workflows that use the combined data. Google makes the data available so that planners can use it as they set up their workflows in Vertex AI.

Users of the Google Supply Chain Twin and SAP IBP can combine the rich planning data from IBP with additional SAP data and other Google data sources for better supply chain visibility. The Google Supply Chain Twin is a real-time digital representation of your supply chain based on sales history, open customer orders, past and future promotions, pricing and competitor insights, consumer history signals, external data signals, and Google data.

Leverage Google data signals with SAP IBP for more accurate forecasts

It’s not difficult to access these new capabilities, and the benefits are more accurate near-term forecasts and more return on your investments in SAP IBP and Google Cloud. If you happen to be at the Gartner Supply Chain Symposium, June 6-8 in Orlando, Florida, stop by our booth to say hello. Or get started now.

Related article: How Google Cloud and SAP solve big problems for big companies
Source: Google Cloud Platform

Start skilling on Azure with these helpful guides

We are excited to introduce Azure Skills Navigator, a new learning resource designed especially for those who are new to Azure and want to learn more. Azure Skills Navigator is our very own ramp-up guide, intended to help you develop a strong foundation in cloud technologies as you begin to explore Azure.

These downloadable Azure Skills Navigator guides offer a variety of resources to help you build your skills and knowledge of Azure. Each guide features carefully selected digital training, learning courses, videos, documents, certifications, and more. We understand how important it is in today’s market to stay ahead of the tech curve, and demand for professionals skilled in cloud technologies is high. The Azure Skills Navigator guides give you a hand-picked selection of resources for building a solid foundation in Microsoft Azure, so you can start building and exploring today. After you’ve mastered the content, we will help you navigate our intermediate- and advanced-level content.

We have guides tailored for a number of roles: System Administrators, Solution Architects, Developers, Data Engineers, and Data Scientists. Given the high demand for these guides, we will be launching more for a number of new roles in the coming months. These role-based guides map out an itinerary for deepening your knowledge of Azure, helping you build a strong foundation in cloud computing in a way that is tailored and personalized for you. You can travel at your own pace, then continue your Azure exploration with ongoing learning resources, from blog updates and videos to events where you can connect with technical communities. These guides are just the beginning; Microsoft Learn will be your trusted partner as you progress through your learning journey, and there are numerous options for continuing your training and certification beyond these guides as well.

Explore the guides by role below to get started:

Azure Skills Navigator for System Administrators: A guide for deepening your knowledge of fundamental concepts of cloud computing and Azure core infrastructure services, management, monitoring, security, and compliance.
Azure Skills Navigator for Solution Architects: A guide for deepening your knowledge of fundamental concepts of Microsoft Azure, core solutions, solution design principles, including security and compliance, and deployment tools and methods to help bring your solution architectures to life.
Azure Skills Navigator for Developers: A guide to building your skills in architecting and deploying apps in the cloud, and in maintaining and instrumenting those apps once deployed. The guide provides an overview of key concepts across Java, .NET, Node.js, and Python, crucial topics for establishing a strong foundation on Microsoft Azure.
Azure AI Learning Journey for Developers: A guide to achieving artificial intelligence expertise on Azure AI, creating the next generation of applications, and preparing for Azure AI Fundamental certification.
Azure Data Engineer Learning Journey for Data Engineers: A guide to achieving expertise in data engineering; explore how Azure Synapse enables you to leverage all your data to unlock powerful insights.
Azure Data Scientist Learning Journey for Data Scientists: A guide to achieving Machine Learning expertise on Azure; learn how to collaborate and build models faster with the latest machine learning tools and frameworks.

Learn more

On-demand Intro to Tech session at Microsoft Build: The New Developer’s Guide to the Cloud hosted by Christoffer Noring, Nitya Narasimhan, and Someleze Diko.
GitHub repo containing all the resources and space for you to share suggestions for improvement.
Blog announcement for Azure Infrastructure guides.
Blog announcement for developers on Azure.
Blog announcement for Azure Data and AI.

Source: Azure

From Edge to Mainstream: Scaling to 100K+ IoT Devices

Developers are builders at heart. Many have also ventured into the IoT — an evolving space ruled by microcontrollers, sensors, and various other software-driven microelectronics. Accordingly, the Raspberry Pi has become a favorite “command center” for developers running simple applications. We’ll discuss how these applications have steadily grown more dynamic, and how containerized deployments have solved multi-industry complexities. You’ll also learn what exciting possibilities await for Pi-driven IoT projects.
 
Image courtesy of Harrison Broadbent, via Unsplash.
 
Unlocking Sophisticated Deployments
Raspberry Pi devices can support powerful data-driven workloads, like tracking global vaccination rates or monitoring weather changes, among others. And while running tasks on a few devices is manageable, complexity grows with scale; orchestrating workloads across numerous devices was once a formidable challenge.
Marc Pous recently showcased how containers can enable deployments of 100,000+ IoT devices. He also demonstrated that the Raspberry Pi can readily replicate and distribute containers. Consequently, it’s possible — with a little help from your desktop — to pull an image and push containerized services to your IoT fleet.
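To make that flow concrete, here is a minimal sketch under stated assumptions: the image name example-user/sensor-app and its registry are hypothetical, and docker buildx is used to cross-build an arm64 image for the Pi from a desktop machine (Marc’s own workflow may differ in the details).

# On your desktop: build an arm64 image for Raspberry Pi and push it to a registry.
# "example-user/sensor-app" is a placeholder image name -- substitute your own.
docker buildx build --platform linux/arm64 -t example-user/sensor-app:1.0 --push .

# On each Raspberry Pi (or via your fleet tooling): pull and run the containerized service.
docker pull example-user/sensor-app:1.0
docker run -d --restart unless-stopped --name sensor-app example-user/sensor-app:1.0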
Luckily, you can also secure a base Raspberry Pi 4 Model B for about $30 USD — though current shortages might make it tricky. Nonetheless, the financial barrier to entry is low, and developers like Marc have now unraveled the technical mystery. Technology that was once complicated has become much more accessible to developers.
Marc’s example leveraged the Kerberos Agent (install it yourself here) — a free and open-source project that transforms IoT devices into security cameras. Typically, each Kerberos Agent requires its own separate host. Marc’s streamlined deployment method, alternatively, leveraged multiple Docker containers linked to a single Docker host. Kerberos outlines this process within its documentation.
You can use Docker Desktop to efficiently manage these containers. Installing Docker Desktop equips your machine with core dependencies and enables Docker CLI commands. By entering docker run --name camera -p 80:80 -p 8889:8889 -d kerberos/kerberos, Marc pulled the official Kerberos image from Docker Hub and quickly spun up his container.
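Scaling that to several agents on one host is then mostly a matter of giving each container its own name and host ports. The commands below are a rough sketch, not Marc’s exact setup: the container names and host-side ports are illustrative, and the Kerberos documentation covers the full multi-agent configuration.

# Run two Kerberos Agent containers side by side on a single Docker host.
# Names and host ports are placeholders -- adjust them to your environment.
docker run --name camera-front -p 8081:80 -p 8890:8889 -d kerberos/kerberos
docker run --name camera-back -p 8082:80 -p 8891:8889 -d kerberos/kerberos

# Confirm both agents are running.
docker ps --filter name=camera-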
Did you know? Docker Desktop 4.9 recently added powerful new Container interface enhancements. Be sure to check it out!
From there, he described how to harmoniously use Balena Cloud, Docker Compose, and more to scale your IoT deployment. You can do this from any machine. This solution also works with numerous device types, brands, and deployments. While that use case is interesting, how does it fit into the bigger picture?
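As a rough illustration of what that looks like from the command line, the snippet below assumes the balena CLI is installed and that a fleet named my-iot-fleet (a placeholder) already exists in Balena Cloud; balena push typically builds the services defined in your project’s docker-compose.yml and rolls the release out to every device in the fleet, whether that is ten devices or 100,000.

# Authenticate against Balena Cloud, then push the current project to a fleet.
# "my-iot-fleet" is a placeholder fleet name.
balena login
balena push my-iot-fleet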
Expanding to Other Real-World Applications
IoT applications span countless industries. Manufacturing, agriculture, energy, logistics, healthcare, urban planning, and transportation (among others) rely on IoT devices daily. Each device can generate significant amounts of real-time data. Developers can help companies capture it and uncover insights or form new strategies. Meanwhile, hobbyists can continue to explore exciting new use cases without needing elaborate setups or resources.
Users have purchased over 40 million Raspberry Pi units since launch. Just imagine if even a fraction of those users focused on IoT development. We could see countless, horizontally-scalable solutions emerge.
Over 14.4 billion active IoT endpoints may exist by the end of 2022. Additionally, some expect IoT adoption to surpass 75 billion by 2025. If you’re a developer in this space, you have to feel really good about the possibilities — especially as chipset supplies recover. The capacity needed to scale is, and will be, there.
Investments into “Pi-oT”
Developers from the world’s largest organizations have already embraced Raspberry Pi computing. Back in 2019, Oracle unveiled the world’s largest Pi supercomputer, made using 1,060 clustered Pi boards. NASA’s Jet Propulsion Laboratory (JPL) has also used these boards during Mars missions. If developers entrust this hardware with such critical workloads, then we’ll likely see expanded usage across varied, large-scale deployments.
Massive vendors like AWS have also embraced containerized IoT. In December 2019, AWS IoT Greengrass v1.10 added support for Docker container images. This lets you use Docker to scale to edge devices. AWS still actively maintains Greengrass (now v2+), and has documented how to use IoT-centric Docker containers.
Solving the Puzzle
Overall, IoT uptake is rising quickly. Containers give developers a platform-agnostic way to expand IoT deployments, minus the obstacles from previous years. Marc’s earlier example proved something important: that IoT development and positive developer experiences can go hand-in-hand.
The right tools and workflows are crucial for helping you manage IoT complexity at scale. You can then focus on innovating, instead of grappling with every piece of your deployment. Want to dig deeper? Discover how to harness Docker alongside Microsoft Azure IoT Edge, or learn about containerized deployment via balenaOS.
Source: https://blog.docker.com/feed/

Amazon Kendra releases GitHub SaaS connector

Amazon Kendra is an intelligent search service powered by machine learning that lets organizations provide their customers and employees with the information they need, when they need it. AWS customers can now use Amazon Kendra’s GitHub SaaS connector to index and search documents from the GitHub Enterprise Server data source.
Source: aws.amazon.com

AWS Backup Audit Manager adds support for Amazon S3 and AWS Storage Gateway

AWS Backup Audit Manager now lets you audit and report on the compliance of your data protection policies for Amazon S3 and AWS Storage Gateway. With AWS Backup Audit Manager, you can continuously evaluate the backup activity of your Amazon S3 and AWS Storage Gateway resources. You can also generate audit reports to demonstrate compliance with best practices or regulatory requirements.
Source: aws.amazon.com