Responsible AI investments and safeguards for facial recognition

A core priority for the Cognitive Services team is to ensure its AI technology, including facial recognition, is developed and used responsibly. While we have adopted six essential principles to guide our work in AI more broadly, we recognized early on that the unique risks and opportunities posed by facial recognition technology necessitate its own set of guiding principles.

To strengthen our commitment to these principles and set up a stronger foundation for the future, Microsoft is announcing meaningful updates to its Responsible AI Standard, the internal playbook that guides our AI product development and deployment. As part of aligning our products to this new Standard, we have updated our approach to facial recognition, including adding a new Limited Access policy, removing AI classifiers of sensitive attributes, and bolstering our investments in fairness and transparency.

Safeguards for responsible use

We continue to provide consistent and clear guidance on the responsible deployment of facial recognition technology and advocate for laws to regulate it, but there is still more we must do.

Effective today, new customers need to apply for access to use facial recognition operations in Azure Face API, Computer Vision, and Video Indexer. Existing customers have one year to apply and receive approval for continued access to the facial recognition services based on their provided use cases. By introducing Limited Access, we add an additional layer of scrutiny to the use and deployment of facial recognition to ensure use of these services aligns with Microsoft’s Responsible AI Standard and contributes to high-value end-user and societal benefit. This includes introducing use case and customer eligibility requirements to gain access to these services. Read about example use cases, and use cases to avoid, here.

Starting June 30, 2023, existing customers will no longer be able to access facial recognition capabilities if their facial recognition application has not been approved. Submit an application form for facial and celebrity recognition operations in Face API, Computer Vision, and Azure Video Indexer here, and our team will be in touch via email.

Facial detection capabilities (including detecting blur, exposure, glasses, head pose, landmarks, noise, occlusion, and facial bounding box) will remain generally available and do not require an application.

In another change, we will retire facial analysis capabilities that purport to infer emotional states and identity attributes such as gender, age, smile, facial hair, hair, and makeup. We collaborated with internal and external researchers to understand the limitations and potential benefits of this technology and navigate the tradeoffs. In the case of emotion classification specifically, these efforts raised important questions about privacy, the lack of consensus on a definition of “emotions,” and the inability to generalize the linkage between facial expression and emotional state across use cases, regions, and demographics. API access to capabilities that predict sensitive attributes also opens up a wide range of ways they can be misused—including subjecting people to stereotyping, discrimination, or unfair denial of services.

To mitigate these risks, we have opted to not support a general-purpose system in the Face API that purports to infer emotional states, gender, age, smile, facial hair, hair, and makeup. Detection of these attributes will no longer be available to new customers beginning June 21, 2022, and existing customers have until June 30, 2023, to discontinue use of these attributes before they are retired.

While API access to these attributes will no longer be available to customers for general-purpose use, Microsoft recognizes these capabilities can be valuable when used for a set of controlled accessibility scenarios. Microsoft remains committed to supporting technology for people with disabilities and will continue to use these capabilities in support of this goal by integrating them into applications such as Seeing AI.

Responsible development: improving performance for inclusive AI

In line with Microsoft’s AI principle of fairness and the supporting goals and requirements outlined in the Responsible AI Standard, we are bolstering our investments in fairness and transparency. We are undertaking responsible data collection to identify and mitigate disparities in the performance of the technology across demographic groups, and we are assessing how to present this information in a way that is insightful and actionable for our customers.

Given the potential socio-technical risks posed by facial recognition technology, we are looking both within and beyond Microsoft to include the expertise of statisticians, AI/ML fairness experts, and human-computer interaction experts in this effort. We have also consulted with anthropologists to help us deepen our understanding of human facial morphology and ensure that our data collection is reflective of the diversity our customers encounter in their applications.

While this work is underway, and in addition to the safeguards described above, we are providing guidance and tools to empower our customers to deploy this technology responsibly. Microsoft is providing customers with new tools and resources to help evaluate how well the models are performing against their own data and to use the technology to understand limitations in their own deployments. Azure Cognitive Services customers can now take advantage of the open-source Fairlearn package and Microsoft’s Fairness Dashboard to measure the fairness of Microsoft’s facial verification algorithms on their own data—allowing them to identify and address potential fairness issues that could affect different demographic groups before they deploy their technology. We encourage you to contact us with any questions about how to conduct a fairness evaluation with your own data.
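To make that evaluation concrete, here is a minimal sketch of the disaggregated-metrics pattern Fairlearn supports, assuming you already have labeled verification results of your own. The match labels, predictions, and group names below are hypothetical placeholders, not output from the Face service.

```python
# Minimal sketch: compare verification accuracy across demographic groups with Fairlearn.
# The labels, predictions, and group names are hypothetical; in practice they come from
# your own labeled verification data.
from fairlearn.metrics import MetricFrame
from sklearn.metrics import accuracy_score

# 1 = the pair was a true match, 0 = it was not (ground truth from your data).
y_true = [1, 1, 0, 1, 0, 1, 0, 0]
# 1 = the verification system reported a match at your chosen confidence threshold.
y_pred = [1, 0, 0, 1, 0, 1, 1, 0]
# Annotated demographic group for each pair (placeholder labels).
groups = ["A", "A", "A", "B", "B", "B", "C", "C"]

mf = MetricFrame(
    metrics=accuracy_score,
    y_true=y_true,
    y_pred=y_pred,
    sensitive_features=groups,
)

print("Overall accuracy:", mf.overall)
print("Accuracy by group:\n", mf.by_group)
print("Largest gap between groups:", mf.difference())
```

This is the same disaggregated view the Fairness Dashboard presents visually; a large gap between groups is a cue to collect more representative data or revisit thresholds before deployment.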

We have also updated the transparency documentation with guidance to help our customers improve the accuracy and fairness of their systems by incorporating meaningful human review to detect and resolve cases of misidentification or other failures, by providing support to people who believe their results were incorrect, and by identifying and addressing fluctuations in accuracy due to variation in operational conditions.

In working with customers using our Face service, we also realized that some errors originally attributed to fairness issues were in fact caused by poor image quality. If the image someone submits is too dark or blurry, the model may not be able to match it correctly. We acknowledge that poor image quality can be unfairly concentrated among demographic groups.

That is why Microsoft is offering customers a new Recognition Quality API that flags problems with lighting, blur, occlusions, or head angle in images submitted for facial verification. Microsoft also offers a reference app that provides real-time suggestions to help users capture higher-quality images that are more likely to yield accurate results.

To leverage the image quality attribute, users need to call the Face Detect API. See the Face QuickStart to test out the API.
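As a rough, unofficial illustration of that call (the Face QuickStart remains the authoritative reference), the sketch below requests the recognition-quality attribute from the Face detect REST endpoint. The endpoint, key, and image URL are placeholders, and the attribute names should be confirmed against the current API documentation.

```python
# Hedged sketch: request the recognition-quality attribute from the Face detect endpoint.
# ENDPOINT, KEY, and the image URL are placeholders for your own resource values.
import requests

ENDPOINT = "https://<your-resource-name>.cognitiveservices.azure.com"
KEY = "<your-face-api-key>"

params = {
    "detectionModel": "detection_03",
    "recognitionModel": "recognition_04",
    "returnFaceAttributes": "qualityForRecognition",  # reported as low / medium / high
    "returnFaceId": "false",  # face IDs are not needed for a quality check
}
headers = {
    "Ocp-Apim-Subscription-Key": KEY,
    "Content-Type": "application/json",
}
body = {"url": "https://example.com/photo-to-verify.jpg"}

resp = requests.post(f"{ENDPOINT}/face/v1.0/detect", params=params, headers=headers, json=body)
resp.raise_for_status()

for face in resp.json():
    quality = face["faceAttributes"]["qualityForRecognition"]
    print(face["faceRectangle"], "quality:", quality)
    if quality.lower() == "low":
        print("Ask the user to retake the photo: better lighting, less blur, face the camera.")
```

A low quality value is the signal to prompt the user for a better capture, which is the behavior the reference app builds in with real-time suggestions.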

Looking to the future

We are excited about the future of Azure AI and what responsibly developed technologies can do for the world. We thank our customers and partners for adopting responsible AI practices and being on the journey with us as we adapt our approach to new responsible AI standards and practices. As we launch the new Limited Access policy for our facial recognition service, in addition to new computer vision features, your feedback will further advance our understanding, practices, and technology for responsible AI.

Learn more at the Limited Access FAQ.
Source: Azure

See how 3 industry-leading companies are driving innovation in a new episode of Inside Azure for IT

I had the awesome opportunity to talk with a few people innovating with some of the most exciting next-generation tech in our latest episode of the Inside Azure for IT fireside chat series. Many of us, myself included, spend a lot of time focused on challenges that need to be addressed today, in this minute, leaving less time for creativity and longer-range planning. The same is true for many organizations. When businesses face downtime, traditional hardware restrictions, or the need to adapt quickly to change, productivity can suffer and innovation can stall.

What we hear from IT leaders is that digital transformation becomes a reality when they can go from doing their job despite technology limitations to innovating and delivering on priorities because of the technology they’re using—specifically global, cloud-based infrastructure.

In this episode, you’ll get a behind-the-scenes look at how three companies are using cutting-edge technologies like high-performance computing, quantum computing, and AI to solve complex challenges, power innovation, and generate new kinds of business impact.

Driving innovation across industries with Azure

The episode is divided into three separate segments so you can watch them individually on-demand, at your convenience.

Part 1: Jeremy Smith and Karla Young on how Jellyfish Pictures virtualized their entire animation and visual effects studio with Azure

In this segment, you’ll hear from Jeremy Smith, CTO, and Karla Young, Head of PR, Marketing, and Communications at Jellyfish Pictures about how they create the amazing visuals we see in movies like How to Train Your Dragon: Homecoming, or some of the recent Star Wars films—both big favorites for my family! Using Azure high-performance computing to accelerate image rendering, they can spin up tens of thousands of cores at a moment’s notice and manage all that rich content securely in a single place, without replication.
Watch now: Virtualizing animation with Azure high-performance computing.

Part 2: Anita Ramanan and Viktor Veis on using quantum computing to address a complex scheduling challenge for NASA’s Jet Propulsion Laboratory

In the second segment, I’m joined by members of the Azure Quantum team—Anita Ramanan, Technical Program Manager Lead for Optimization in Azure Quantum, and Viktor Veis, Azure Quantum Group Software Engineering Manager—to talk about a project they worked on with NASA’s Jet Propulsion Laboratory. They share how they used quantum-inspired algorithms to create schedules for spacecraft communications in minutes rather than hours—and how Azure Quantum can address similar challenges in almost every industry, from manufacturing to healthcare.
Watch now: A quantum-inspired approach to scheduling communications in space.

Part 3: Alex Oelling on how Volocopter is powering an urban air mobility ecosystem of self-flying air taxis and drone services with Azure infrastructure and AI

In the third segment, I chat with Alex Oelling, Chief Digital Officer at Volocopter about how they are bringing urban air travel to life in major cities. A true pioneer in providing air taxi and drone services in urban environments, Volocopter is building a cloud-based solution to work with smart cities and existing mobility operations using Azure infrastructure and AI.
Watch now: Pioneering urban air travel in major cities with Azure infrastructure and AI.

When we launched Inside Azure for IT last July, our goal was to create a place where cloud professionals could come to learn Azure best practices and insights that would help them transform their IT operations. Whether you’ve tuned in for our live ask-the-experts sessions, watched deep-dive skilling videos, or joined us for fireside chats—we want to say "thank you" for engaging with us and bringing us your hardest questions.

Stay current with Inside Azure for IT

Beyond this latest episode, there are many more technical and cloud-skilling resources available through Inside Azure for IT. Learn more about empowering an adaptive IT environment with best practices and resources designed to enable productivity, digital transformation, and innovation. Take advantage of technical training videos and learn about implementing these scenarios.

Watch the free Azure Hybrid, Multicloud, and Edge Day event on-demand.
Watch past episodes of the Inside Azure for IT fireside chats.
Watch part 1: Virtualizing animation with Azure high-performance computing.
Watch part 2: A quantum-inspired approach to scheduling communications in space.
Watch part 3: Pioneering urban air travel in major cities with Azure infrastructure and AI.

Source: Azure

Securing the Software Supply Chain: Atomist Joins Docker

I’m excited to share some big news: Atomist is joining Docker. I know our team will thrive in its new home, and look forward to taking the great stuff we’ve built to a much larger audience.

I’ve devoted most of my career to trying to improve developer productivity and the development experience. Thus it’s particularly pleasing to me that Atomist is becoming part of a company that understands and values developers and has transformed developer experience for the better over the last 10 years.

Docker’s impact on how we work has been profound and varied. Just a few of the ways I use it nearly every day: quickly spinning up and trying out a complex stack on my laptop without having to dread uninstallation; creating and destroying a database instance in seconds during CI to check the validity of a schema; confidently deploying my own code and third party products to production. Docker is both integral to development and a vital part of deployment. This is rare and makes it core to how we work.

What does this acquisition mean for users and customers?

First, Atomist’s technology can help Docker provide additional value throughout the delivery lifecycle. Docker will integrate Atomist’s rich understanding of the secure software supply chain into its products. To start with, this will surface in sophisticated reporting and remediation of container vulnerabilities. But that is just the start. As deployed software becomes more and more complex, it’s vital to understand what’s in production deployments and how it evolves over time. Container images are core to this, and Atomist’s ability to make sense of the supply chain, both at any point in time and as it changes, becomes ever more important. Security is just one application for this insight, although arguably the single most critical.

Second, Docker will leverage Atomist’s sophisticated integration platform. Docker (the company) understands that the modern development and delivery environment is heterogeneous. No single vendor can supply best-of-breed solutions for every stage, and it’s not in customers’ interests for them to do so. Atomist will help Docker customers understand what’s happening through the delivery flow, while preserving their ability to choose the products that best meet their needs.

Finally, Atomist’s automation technology will help Docker improve the development experience in a variety of ways, driven by user input.

We’re proud to have built powerful, unique capabilities at Atomist. And we’re ready to take them to a much larger audience as part of Docker. This is an important point in a longer voyage, with the best yet to come. Want to be the first to experience the new features resulting from this combination? You can sign up for the latest updates by visiting this page.
Source: https://blog.docker.com/feed/

Advertisement: The future belongs to green IT

IT produces more emissions than air travel. Workshops and seminars from the Golem Karrierewelt on green IT and sustainability offer perspectives on change. (Golem Karrierewelt, server applications)
Source: Golem

Commerzbank has Reimagined the Customer Experience with Google Contact Center AI

Digital channels and on-demand banking have led customers to expect instant, helpful access to managing their finances, with minimal friction. Google Cloud built Contact Center AI (CCAI) and Dialogflow CX to help banks and other enterprises deliver these services, replacing phone trees and sometimes confusing digital menus with intelligent chatbots that let customers interact conversationally, just as they would with human agents. Leaders at Germany-based Commerzbank, which operates in over 50 countries, saw potential for these technologies to enhance customer experiences, providing more curated and helpful interactions that would build trust in and satisfaction with their brand. Commerzbank’s implementation speaks to how conversational artificial intelligence (AI) services can help businesses better serve customers, and in this article, we’ll explore their story and what their example means for your business.

Commerzbank: Disrupting Customer Interactions with Google’s Contact Center AI and Dialogflow CX

Tokyo, 7:00 AM. Vanessa is on a business trip in Japan, closing a new deal for her company, one of Commerzbank’s more than 30,000 corporate customers throughout Germany. She has been preparing for weeks, and is going through her points a final time in a downtown coffee shop. Glancing at her watch, she realizes she must leave immediately to get to the meeting. Intending to pay, she realizes the chip in her credit card is not functioning. Due to the time difference with Germany, Vanessa is now concerned she will not be able to contact someone from customer support. She opens the Commerzbank mobile app and contacts the customer center through chat. The access point she needs is available, but how can it help her most efficiently?

Building excellent conversational experiences

Customers like Vanessa need an answer right away. With that in mind, Commerzbank aims to provide customers with integrated support through chatbots that deliver efficiency, high quality, and consistent information. This is where the Google Cloud virtual agent platform Dialogflow CX comes into play, giving the team a rich set of features for building conversation dialogue through accurate intent recognition, a robust visual flow builder, and automated testing, all while significantly improving time to market. In just nine weeks, the Commerzbank team set up an agile proof-of-value project by developing a chatbot solution designed to deliver a reliable conversation experience. The Commerz Direktservices Chatbot Agent can now identify the touchpoint the customer is using (app or web), detect more than 100 common FAQs, and answer them properly. The Chatbot Agent also identifies leads and sales prospects, enabling it to support open questions about products and services and perform a graceful handover to a human agent, enriched with value parameters. Commerz Direktservices has also broadened the Chatbot’s ability to handle different customer types (keyword-based versus context-based) by constructing an intelligent dialog architecture that lets the Chatbot Agent flow elegantly through prompts and intent questioning. Commerzbank has integrated Google Dialogflow CX with the Genesys platform, making full use of the existing contact center infrastructure and orchestrating incoming interactions more efficiently.
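To give a flavor of what a single conversational turn against a Dialogflow CX agent looks like in code, here is a hedged, illustrative Python sketch using the Dialogflow CX client library. The project, location, agent, and session identifiers are placeholders, and Commerzbank’s production integration, which runs through the Genesys platform, is considerably more involved.

```python
# Illustrative sketch: one text turn against a Dialogflow CX agent.
# Project, location, agent, and session IDs below are placeholders.
from google.cloud import dialogflowcx_v3

PROJECT = "my-gcp-project"
LOCATION = "europe-west1"          # regional agents use a regional API endpoint
AGENT = "my-agent-id"
SESSION = "customer-chat-session-123"

client = dialogflowcx_v3.SessionsClient(
    client_options={"api_endpoint": f"{LOCATION}-dialogflow.googleapis.com"}
)
session_path = client.session_path(PROJECT, LOCATION, AGENT, SESSION)

query_input = dialogflowcx_v3.QueryInput(
    text=dialogflowcx_v3.TextInput(text="The chip in my credit card stopped working"),
    language_code="en",
)
response = client.detect_intent(
    request=dialogflowcx_v3.DetectIntentRequest(session=session_path, query_input=query_input)
)

result = response.query_result
print("Matched intent:", result.intent.display_name)
for message in result.response_messages:
    if message.text.text:
        print("Agent:", " ".join(message.text.text))
```

The matched intent and response messages are what a contact-center integration would route back to the customer, or hand over to a human agent along with the parameters collected so far.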
A very versatile architecture bridges the potential of Google Cloud with a variety of on-premises applications and components, while also providing system resiliency and supporting data security compliance. The support of the entire Google team has been invaluable in accelerating the bank’s journey to the cloud. Commerzbank is seeing a number of benefits as it expands its AI platform, including:

Enhanced ability to deliver innovation
Improved operational efficiencies
Better customer experience through reduced wait times and self-serve capabilities, leading to reduced churn
Greater productivity for Commerzbank employees, who are able to support customer queries with enriched Google CCAI data
The creation of an integrated cross-channel strategy

Going beyond support into an active conversational experience

Now, Commerzbank wants to move beyond great customer support and continue to increase the value it adds for the customer. Customers like Vanessa are looking for their bank to go the extra mile by optimizing their finances, providing personalized financial products and solutions, and offering more control over their investment portfolio, among other needs. With this in mind, Commerzbank aims to keep moving away from a scenario where chatbots are only passive entities waiting to be triggered, toward a more innovative one in which they become an active enabler of enhanced customer interactions across the customer value chain. Commerzbank is already mapping active dialog paths to:

Make tailored product suggestions to prospects, giving them the possibility to acquire a product that suits their particular needs
Identify customer requirements for financing or investment, inviting them to get advice and benefit from existing opportunities
Generate prospects based on business potential, providing human agents with a framework to prioritize their interactions

Commerzbank leaders anticipate the impact of this solution will be significant. It will let the company serve as the first advisory touchpoint for financial needs and perform a fast conversation handover to specialists as soon as the customer requires it. As a result, leaders expect to exponentially increase conversion rates via more fruitful customer journeys.

Helping Vanessa with a delightful customer experience

Going back to Vanessa’s example: how can Commerzbank help her efficiently? When she contacts support through chat, the chatbot welcomes her and offers help with any question she may have. Vanessa explains the situation, and the digital agent explains that delivering a replacement card would take several days and that the most practical solution would be to activate a virtual debit card, for example with Google Pay on her phone. Vanessa gladly accepts this solution, prompting the chatbot to deliver a short explanation of how to carry out the process, along with two links: one for downloading the Google Pay app from the Google Play Store and another for digital self-service in the Commerzbank app, which she can use to link the Commerzbank app and Google Pay. After just five minutes, Vanessa is able to pay comfortably using her phone and get to her meeting on time. This engagement is how Commerzbank wants to deliver digital customer experiences that delight its customers, allowing them to perform their daily banking activities faster, better, and more easily.
To learn more about how Google Cloud AI solutions can help your company, visit the product page or check out this report that explores the total economic impact of Google Cloud CCAI.
Source: Google Cloud Platform

Azure IoT increases enterprise-level intelligent edge and cloud capabilities

For Microsoft Azure IoT, our approach is connecting devices at the edge to the cloud seamlessly and securely to help customers achieve desired business outcomes. At this year’s Embedded World 2022, we’ll share how our Azure IoT solutions are delivering enhanced device security, seamless cloud integration, and device certification.

One of the key ways we’re delivering cost-efficient and energy-efficient solutions to IoT customers at Embedded World is with new Arm64 support. Partners such as NXP, with i.MX 8M SoC processors, are bringing full Windows IoT Enterprise capabilities in a small footprint ideal for compact and fanless designs.

Arm64 for low-cost, low-power benefits without compromise

Following our preview of the NXP i.MX 8M BSP release on Windows IoT Enterprise earlier this year, we are extending Arm64 support on the NXP i.MX 8M for Windows 10 IoT Enterprise.

Windows on Arm was launched in 2017 to provide better battery life, always-online internet connectivity, and quick boot-up via a Microsoft OS experience running on hardware powered by Arm processors. As enterprise-level IoT deployment has evolved, today’s edge devices have greater demands for compute-intensive applications, such as rich graphics and grid computing.

That’s why we’re now bringing full Windows application compatibility to IoT to deliver low-power and low-cost benefits of Arm64 through a multi-year collaboration between Microsoft and NXP, an Industrial IoT provider. Customers can get started by downloading the i.MX 8M Public Preview BSP and user guide. Additional partners announcing support for Windows IoT on Arm64 with their devices include Reycom and Avnet.

Security at the edge

Cyberattacks on IoT devices and other connected technology can put businesses at risk. An attack can result in stolen IP or other highly valuable data, compromised regulatory status or certification, costly downtime, as well as complex financial and legal ramifications. The following security announcement is one more way Microsoft is helping ensure security is built into the foundation of IoT solutions from the start.

Edge Secured-core

Edge Secured-core is a trusted certification program helping customers select hardware that meets a higher security standard. Edge Secured-core, including Edge Secured-core for Windows IoT, brings this certification into the IoT Edge ecosystem, making it easier for companies to identify edge hardware that meets this higher bar in protecting data.

MCU Security Platform

Microsoft also has partnered with STMicroelectronics to jointly develop a security platform for MCUs enabling ST’s ultra-low-power STM32U5 microcontrollers (MCUs) to connect securely to Azure IoT cloud services. The STM32U5 with Trusted Firmware for Cortex-M (TF-M) has been independently certified to PSA Level 3 and SESIP Level 3, and the STSAFE secure element has been certified to Common Criteria EAL 5+.

The security platform is built on Microsoft’s production-ready Azure real-time operating system (RTOS) which has received EAL4+ Common Criteria security certification and PSA Level 1 certification. The offering leverages best-in-class security with Microsoft Defender for IoT, Device Update for IoT Hub, and Device Provisioning Services with X.509 Certificate management.
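On the STM32U5 itself, this provisioning flow runs in C on Azure RTOS, but the cloud-side mechanism it relies on, Device Provisioning Service with X.509 certificates, can be illustrated with the Azure IoT device SDK. The following is a hedged Python sketch with placeholder ID scope, registration ID, and certificate paths, not the ST reference implementation.

```python
# Hedged sketch: register a device with Azure Device Provisioning Service (DPS)
# using an X.509 client certificate. ID scope, registration ID, and certificate
# paths are placeholders; the STM32U5 reference code performs the equivalent
# steps in C on Azure RTOS.
from azure.iot.device import ProvisioningDeviceClient, X509

PROVISIONING_HOST = "global.azure-devices-provisioning.net"
ID_SCOPE = "<your-dps-id-scope>"
REGISTRATION_ID = "<device-registration-id>"   # must match the certificate common name

x509 = X509(
    cert_file="device-cert.pem",
    key_file="device-key.pem",
)

client = ProvisioningDeviceClient.create_from_x509_certificate(
    provisioning_host=PROVISIONING_HOST,
    registration_id=REGISTRATION_ID,
    id_scope=ID_SCOPE,
    x509=x509,
)

result = client.register()
if result.status == "assigned":
    print("Assigned to hub:", result.registration_state.assigned_hub)
    print("Device ID:", result.registration_state.device_id)
else:
    print("Provisioning failed with status:", result.status)
```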

Enhanced Azure RTOS

As software solutions become more complex, a robust RTOS becomes more important for seamless development. Microsoft announced three enhancements for Azure RTOS at Embedded World 2022.

Embedded Wireless Framework

The Embedded Wireless Framework defines a common set of APIs for wireless interfaces used in IoT. The API covers multiple wireless network protocols, including Wi-Fi and cellular, each with its own proprietary drivers. The framework also allows users to reuse application code across different IoT devices.

Visual Studio Code for Embedded

Visual Studio and VS Code have recently added embedded capabilities to C++ scenarios, opening a previously untapped market of developers for those products. Developers can use Visual Studio and VS Code for embedded development with Azure RTOS, FreeRTOS, and Zephyr. Industry partnerships will continue to extend capabilities.

Connecting IoT devices to Azure with LwM2M

Microsoft has collaborated with several partners to enable bridging the LwM2M protocol to Azure IoT cloud services, offering greater flexibility for device builders designing for low-power and low-bandwidth optimized applications over low-power wide-area (LPWA) technologies such as NB-IoT. Device certification enforces security standards.

Azure Sphere and Rust for continual innovation

Azure Sphere previously enabled programming exclusively in C. However, Rust has become one of the most popular embedded developer languages due to the safety and development ease it provides. Rust decreases time to market and lowers risks associated with security vulnerabilities in customer application code. Azure Sphere is now previewing support for Rust, ensuring a safe IoT device from the silicon through the application and to the cloud. Developers interested in joining the preview or getting updates can contact Azure Sphere at Microsoft.

Expanding enterprise-level intelligent edge capabilities

Enhanced device security, seamless cloud integration, and device certification support the Microsoft approach of making intelligent edge devices connect seamlessly and securely to the intelligent cloud. Visit the Microsoft Azure IoT booth at Embedded World 2022 to learn more about these latest announcements.
Source: Azure