Interface: PCIe Gen6 doubles the data rate again

While AMD's first consumer platform with PCIe Gen4 is about to arrive and CPUs with PCIe Gen5 are already expected in 2021, PCIe Gen6 is in the works: the interface is set to reach 64 GT/s per lane, which corresponds to 128 GByte/s in an x16 slot as used for graphics cards. (PC hardware, AMD Zen)
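As a rough sanity check of those numbers (ignoring encoding and protocol overhead): 64 GT/s corresponds to roughly 64 Gbit/s, or about 8 GByte/s per lane and direction, and 16 lanes × 8 GByte/s = 128 GByte/s.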
Quelle: Golem

First Microsoft cloud regions in Middle East now available

This blog post was co-authored by Paul Lorimer, Distinguished Engineer, Office 365.

Azure and Office 365 generally available today, Dynamics 365 and Power Platform available by end of 2019

Today, Microsoft Azure and Microsoft Office 365 are taking a major step together to help support the digital transformation of our customers. Both Azure and Office 365 are now generally available from our first cloud datacenter regions in the Middle East, located in the United Arab Emirates (UAE). Dynamics 365 and Power Platform, offering the next generation of intelligent business applications and tools, are anticipated to be available from the cloud regions in UAE by the end of 2019.

The opening of the new cloud regions in Abu Dhabi and Dubai marks the first time Microsoft will deliver cloud services directly from datacenter locations in UAE and expands upon Microsoft’s existing investments in the Gulf and the wider Middle East region. By delivering the complete Microsoft cloud – Azure, Office 365, and Dynamics 365 – from datacenters in a given geography, we offer scalable, highly available, and resilient cloud services for organizations while helping them meet their data residency, security, and compliance needs.

Our new cloud regions adhere to Microsoft’s trusted cloud principles and join one of the largest and most secure cloud infrastructures in the world, already serving more than a billion customers and 20 million businesses. Microsoft has deep expertise in data protection, security, and privacy, including the broadest set of compliance certifications in the industry, and we are the first cloud service provider in UAE to achieve the Dubai Electronic Security Center certification for our cloud services. Our continued focus on our trusted cloud principles and leadership in compliance means customers in the region can accelerate their digital transformation with confidence and with the foundation to achieve compliance for their own applications.

Local datacenter infrastructure stimulates economic development for customers and partners alike, enabling companies, governments, and regulated industries to realize the benefits of the cloud for innovation, as well as bolstering the technology ecosystem that supports that innovation. We anticipate the cloud services delivered from UAE to have a positive impact on job creation, entrepreneurship, and economic growth across the region. The International Data Corporation (IDC) predicts that cloud services could bring more than half a million jobs to the Middle East, including the potential of more than 55,000 new jobs in UAE, between 2017 and 2022.

Microsoft also continues to help bridge the skills gap within the IT community and to enhance technical acumen for cloud services. Cloud Society, a Middle East and Africa-focused program building upon Microsoft Learn, has trained over 150,000 IT professionals in MEA. The community will further benefit from the increased availability and performance of cloud services delivered from UAE to help realize the enterprise benefits of the cloud, upskill in migration, and more effectively manage their cloud infrastructure.

You can learn more by following these links: Microsoft Middle East and Africa News Center, Microsoft Azure United Arab Emirates, Microsoft Office 365, Microsoft Dynamics 365, and Microsoft Power Platform.
Quelle: Azure

Build, Share and Run Multi-Service Applications with Docker Enterprise 3.0

Modern applications can come in many flavors, consisting of different technology stacks and architectures, from n-tier to microservices and everything in between. Regardless of the application architecture, the focus is shifting from individual containers to a new unit of measurement which defines a set of containers working together – the Docker Application. We first introduced Docker Application packages a few months ago. In this blog post, we look at what’s driving the need for these higher-level objects and how Docker Enterprise 3.0 begins to shift the focus to applications.
Scaling for Multiple Services and Microservices
Since our founding in 2013, Docker – and the ecosystem that has thrived around it – has been built around the core workflow of a Dockerfile that creates a container image that in turn becomes a running container. Docker containers helped to drive the growth and popularity of microservices architectures by allowing independent parts of an application to be turned on and off rapidly and scaled independently and efficiently. The challenge is that as microservices adoption grows, a single application is no longer confined to a handful of machines but is made up of dozens of containers that can be divided among different development teams. Organizations are no longer managing a few containers, but thousands of them. A new canonical object around applications is needed to help companies scale operations and provide clear working models for how multiple teams collaborate on microservices.
At the same time, organizations are seeing different configuration formats emerge including Helm charts, Kubernetes YAML and Docker Compose files. It is common for organizations to have a mix of these as technology evolves, so not only are applications becoming more segmented, they are embracing multiple configuration formats.
Docker Applications are a way to build, share and run multi-service applications across multiple configuration formats. They let you bundle application descriptions, components and parameters together into a single atomic unit (either a file or a directory) – building, in essence, a “container of containers”.

The application description provides a manifest of the application metadata, including the name, version and a description.
The application component consists of one or more service configuration files and can be a mix of Docker Compose, Kubernetes YAML and Helm chart files.
Finally, the application parameters define the application settings and make it possible to take the same application package to different infrastructure environments with adjustable fields; a minimal package layout is sketched after this list.
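As a rough illustration of how these three pieces fit together, here is a minimal layout in the style of the open source docker-app tooling; the application name and the exact file names are assumptions and may differ between releases:

    myapp.dockerapp/
    ├── metadata.yml         # application description: name, version, description
    ├── docker-compose.yml   # application components: service definitions (Compose here; Kubernetes YAML or Helm charts are also supported formats)
    └── parameters.yml       # application parameters: default values that can be overridden per target environment

At deployment time, the same package can be rendered for different environments by overriding values from parameters.yml.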

Docker Applications are an implementation of the Cloud Native Application Bundle (CNAB) specification – an open source standard originally co-developed by Docker, Microsoft, HashiCorp, Bitnami, and Codefresh, with more companies on board today.
Docker Applications in Docker Enterprise 3.0
In Docker Enterprise 3.0, we begin to lay the groundwork for Docker Applications Services. You will be able to begin testing the ‘docker app’ CLI plugin with Docker Desktop Enterprise, which provides a way to define applications. These are then pushed to either Docker Hub or Docker Trusted Registry for secure collaboration and integration into the CI/CD toolchain. With the latter, you can also perform a binary-level scan of the package against the NIST CVE database. Finally, the parameterized environment variables make it easy for operators to deploy these multi-service applications to different environments, making it possible to adjust things like the ports used during deployment.

With Docker Enterprise 3.0, organizations can continue to operate individual containers, but also have the ability to shift the conversation to Docker Applications to scale more effectively.

To learn more about Docker Enterprise 3.0:

Watch the on-demand webinar: What’s New in Docker Enterprise 3.0
The Docker Enterprise 3.0 public beta is just about to conclude, but you can learn more about Docker Application at https://github.com/docker/app.
Learn more about Docker Enterprise

Quelle: https://blog.docker.com/feed/

Safaricom: Harnessing the power of APIs to transform lives in Africa

Editor’s note: Today we hear from Calestor Kizito Magero, Safaricom’s API product development manager for M-PESA, the company’s mobile payment platform. Learn how Safaricom, the largest telecommunications provider in Kenya, uses Apigee to simplify how it integrates its mobile services with partners.

Safaricom holds the distinction of being the largest telecommunications services provider in Kenya, but we’re aiming for an even loftier goal: empowering Kenyans with tools for economic growth. From venture capital investments in local startups to our commitment to United Nations (UN) sustainability goals, we prioritize the mission of transforming lives in our country.

A key part of this mission is M-PESA, our mobile payment solution. M-PESA enables money transfers and lending, and empowers Kenyans to manage their finances by transforming their mobile phones into a personal bank branch. Partners can integrate with the service via APIs exposed through the Apigee API management platform.

Integrating with partners wasn’t always as fast and efficient as it is now, though. Our previous channel had proven tedious, expensive, and time-consuming, and we dealt with a lot of customer complaints and dissatisfaction. We had to create separate network connections for each partner to maintain security for our customers. We couldn’t develop APIs on the gateway layer, meaning that development had to be done on our core services. To do testing, we had to send requests manually to developers, which wasn’t feasible once we reached a scale of more than 100 integrations. We knew that the continued success of M-PESA hinged on finding a faster, easier, and more secure way to expose our APIs and get them integrated with partners’ offerings.

A key reason we chose Google Cloud’s Apigee API Platform was the ability it provides to securely expose any API, whether external or internal. We also appreciated the platform’s configurability. With Apigee, it became easier to develop and deploy APIs from start to finish in just a few hours, along with the necessary error handling and logs. Off-the-shelf tools like Apigee Trace and the platform’s proxy-building capability make API management very easy, and we value its ability to scale with us as the number of APIs we offer grows. Our implementation partner Abacus Consulting played a key role in evaluating Apigee and helping Safaricom implement the platform.

Deploying Apigee has enabled our partners to easily integrate our M-PESA mobile payment solution. This has opened up our ecosystem to 4,500 partners and counting, ranging from startups to large enterprises. We now feel confident in being able to privately and more securely expose our APIs, which now take as little as a week to develop and publish. We have also added a valuable commercial aspect to our digital strategy thanks to the monetization feature in Apigee, which is contributing 11% of our B2B and B2C revenue at the beginning of 2019. We have a very vibrant developer community across the country, and with the growth and adoption of M-PESA our customers need to easily integrate and automate payment processes.

The API management platform has opened up an easy way for developers to create and embed payments into their solutions, with the sandbox offering a test area where they can experiment with different ways of handling payments. This has created a buzz in the developer community, with many collaborative knowledge-sharing groups forming spontaneously. So far, we have on-boarded over 15,000 developers in our sandbox environment. We are exposing more than 80 APIs now and have over 15,000 apps currently in production or in the sandbox.

Success stories are bubbling up from our ecosystem about how easy it has been to integrate with M-PESA and automate payments since the Apigee deployment. Self-onboarding means that customers do not have to depend on Safaricom support engineers to get access to our APIs and documentation. We have also been able to implement self-testing and customer go-live very easily and more securely.

With Apigee’s help, Safaricom has gained a significant competitive edge in Africa by becoming the first mobile network operator in the region to expose APIs. We are planning to leverage Apigee capabilities to further enhance our offering by developing new API-based offerings in the Internet of Things (IoT) space. We are at the beginning of our API journey, and we are excited by the potential that APIs unlock to help us on our mission to transform lives through mobile communications.

To learn more about API management on Google Cloud, visit our Apigee page.
Quelle: Google Cloud Platform

Using Azure Search custom skills to create personalized job recommendations

This blog post was co-authored by Kabir Khan, Software Engineer II, Learning Engineering Research and Development.

The Microsoft Worldwide Learning Innovation lab is an idea incubation lab within Microsoft that focuses on developing personalized learning and career experiences. One of the recent experiences the lab developed offers skills-based personalized job recommendations. Research shows that searching for a job is one of the most stressful times in someone’s life. Everyone remembers at some point looking for their next career move and how stressful it was to find a job that aligned with their various skills.

Harnessing Azure Search custom skills together with our library of technical capabilities, we were able to build a feature that offers personalized job recommendations based on capabilities identified in resumes. The feature parses a resume to identify technical skills (highlighted and checkmarked in the figure below). It then ranks jobs based on the skills most relevant to the capabilities in the resume. The UI also lets the user view the gaps in their skills (the non-highlighted skills in the figure below) for jobs they’re interested in and work toward building those skills.

Figure one: Worldwide Learning Personalized Jobs Search Demo UI

In this example, our user is interested in transitioning from a Software Engineering role to Program Management. As the image above shows, the top jobs for our user are in Program Management, but they are ranked based on the user’s unique capabilities in areas like AI, Machine Learning and Cloud Computing, so the top-ranked job is on the Bing Search and AI team, which deals with all three.

How we used Azure Search

Figure two: Worldwide Learning Personalized Jobs Search Architecture

The above architecture diagram shows the data flow for our application. We started with around 2000 job openings pulled directly from the Microsoft Careers website as an example. We then indexed these jobs, adding a custom Azure Search cognitive skill to extract capabilities from the descriptions of each job. This allows a user to search for a job based on a capability like “Machine Learning”. Then, when a user uploads a resume, we upload it to Azure Blob storage and run an Azure Search indexer. Leveraging a mix of cognitive skills provided by Azure and our custom skill to extract capabilities, we end up with a good representation of the user’s capabilities.
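As a hedged sketch of what registering such a custom skill can look like, the snippet below defines a skillset containing a single custom Web API skill through Azure Search's REST API. The service name, skillset name, field names, and the skill endpoint URL are illustrative assumptions, not the team's actual configuration:

    # Sketch: register a skillset whose custom Web API skill extracts
    # capabilities from each job description.
    import requests

    SERVICE = "https://<your-search-service>.search.windows.net"
    API_VERSION = "2019-05-06"
    HEADERS = {"Content-Type": "application/json", "api-key": "<admin-key>"}

    skillset = {
        "name": "jobs-skillset",
        "description": "Extract capabilities from job descriptions",
        "skills": [
            {
                "@odata.type": "#Microsoft.Skills.Custom.WebApiSkill",
                "name": "capability-extractor",
                "context": "/document",
                # The endpoint must implement the Azure Search custom skill
                # request/response contract (a JSON body with a "values" array).
                "uri": "https://<your-skill-endpoint>/api/extract-capabilities",
                "inputs": [{"name": "text", "source": "/document/description"}],
                "outputs": [{"name": "capabilities", "targetName": "capabilities"}],
            }
        ],
    }

    resp = requests.put(
        f"{SERVICE}/skillsets/{skillset['name']}?api-version={API_VERSION}",
        headers=HEADERS,
        json=skillset,
    )
    resp.raise_for_status()

An indexer can then map the enriched /document/capabilities output into a capabilities field on the jobs index, and the same kind of skill can be reused when the resume indexer runs over the document uploaded to Blob storage.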

To personalize the job search, we leverage the tag boosting scoring profile built into Azure Search. Tag boosting ranks search results by combining the user’s search query with the number of “tags” (in this case, capabilities) that match values in the target index. So, in our example, we pass the user’s capabilities along with their search query and get back the jobs that best match our user’s unique set of capabilities.
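A minimal sketch of such a personalized query against the Azure Search REST API might look like the following; the index name, scoring profile name, parameter name, and field names are assumptions for illustration:

    # Sketch: run a tag-boosted search, passing the capabilities extracted from
    # the user's resume as a scoring parameter.
    import requests

    SERVICE = "https://<your-search-service>.search.windows.net"
    API_VERSION = "2019-05-06"
    HEADERS = {"Content-Type": "application/json", "api-key": "<query-key>"}

    # Capabilities previously extracted from the uploaded resume.
    resume_capabilities = ["machine learning", "ai", "cloud computing"]

    body = {
        "search": "program management",
        # Assumed scoring profile on the jobs index with a "tag" function bound
        # to a parameter named "capabilities"; format is "<name>-<value1>,<value2>,...".
        "scoringProfile": "personalizedJobs",
        "scoringParameters": ["capabilities-" + ",".join(resume_capabilities)],
        "top": 10,
    }

    resp = requests.post(
        f"{SERVICE}/indexes/jobs/docs/search?api-version={API_VERSION}",
        headers=HEADERS,
        json=body,
    )
    resp.raise_for_status()
    for doc in resp.json()["value"]:
        print(doc["@search.score"], doc.get("title"))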

With Azure Search custom skills, our team was able to build this proof of concept for personalized job search, a capability in high demand among job seekers and recruiters. You can follow the same process to achieve the same goal for your own careers site. We open sourced the Skills Extractor Library that we used in this example and made it available in a container.

Please be aware that before running this sample, you must have the following:

Install the Azure CLI. This article requires Azure CLI version 2.0 or later. Run az --version to find the version you have.
You can also use the Azure Cloud Shell.

To learn more about this feature, you can view the live demo (starts at timecode 00:50:00) and read more in our GitHub repository.

Feedback and support

We’re eager to improve, so please take a couple of minutes to answer some questions about your experience using this survey. For support requests, please contact us at WWL_Skills_Service@microsoft.com.
Quelle: Azure