Microsoft’s newest sustainable datacenter region coming to Arizona in 2021

 

On our journey to become carbon negative by 2030, Microsoft is continually innovating and advancing the efficiency and sustainability of our cloud infrastructure, with a commitment to use 100 percent renewable energy in all of our datacenters and facilities by 2025. Today, we are taking a significant step toward that goal, revealing plans for our newest sustainable datacenter region in Arizona, which will become our West US 3 region.

 

Companies are not only digitally transforming their operations and products to become more sustainable; they're also choosing partners with shared goals and values. In developing the new West US 3 region, we have water conservation and replenishment firmly in mind. Today, Microsoft announced an ambitious commitment to be water positive for our direct operations by 2030. We're tackling our water consumption in two ways: reducing our consumption and replenishing water in the regions where we operate. Since announcing our plans last year to invest in solar energy in Arizona to build more sustainable datacenters, we have been working with the communities of El Mirage and Goodyear on water conservation, education and sustainability projects to support local priorities and needs.

 

 

Sustainable design delivering the full Microsoft cloud for global scale, security and reliability

Our datacenter design and operations will contribute to the sustainability of our Arizona facilities. In Arizona, we’re pursuing Leadership in Energy and Environmental Design (LEED) Gold certification, which will help conserve additional resources including energy and water, generate less waste and support human health. We’re also committed to zero waste-certified operations for this new region, which means a minimum of 90 percent of waste will be diverted away from landfills through reduction, reuse and recycling efforts.

The new datacenter region will deliver enterprise-grade cloud services, all built on a foundation of trust:

Microsoft Azure, an ever-expanding set of cloud services that offers computing, networking, databases, analytics, AI and IoT services.
Microsoft 365, the world’s productivity cloud that delivers best-of-breed productivity apps integrated through cloud services and delivered as part of an open platform for business processes.
Dynamics 365 and Power Platform, the next generation of intelligent business applications that enable organizations to grow, evolve and transform to meet the needs of customers and capture new opportunities.
Compliance, security and privacy: Microsoft offers more than 90 certifications and spends $1 billion every year on cybersecurity to address security at every layer of the cloud.

To support customer needs for high-availability and resiliency in their applications, the new region will also include Availability Zones, which are unique physical locations of datacenters with independent power, network, and cooling for additional tolerance to datacenter failures.

Our construction partner Nox Innovations is helping build these sustainable datacenters with the help of Microsoft HoloLens 2, Microsoft Dynamics 365 Remote Assist and Microsoft mixed reality partner solution VisualLive, using them to visualize building information modeling (BIM) data in the form of holograms and overlay the 3D assets in the context of the physical environment. VisualLive's solution is powered by Azure Spatial Anchors, a new Azure mixed reality service that maps, persists and restores 3D experiences in the real world. The hands-free and remote work environment enabled by HoloLens 2 and cloud services supports virtual collaboration that has led to greater efficiency, safety and accuracy.

Delivering renewable solar energy and replenishing water in Arizona

Our commitment in Arizona includes sustainable datacenter design and operations as well as several local initiatives to support water conservation. First, Microsoft is collaborating with First Solar, an Arizona-headquartered global leader in solar energy, on their Sun Streams 2 photovoltaic (PV) solar power plant, which will offset the day-one energy usage of the new campus, available in 2021, with solar energy once the facility is operational. Clean solar PV energy displaces the water needed in the traditional electricity generation process. First Solar's lowest-carbon solar PV technology does not require water to generate electricity and is ideally suited to meet the growing energy and water needs of arid, water-limited regions. By displacing conventional grid electricity in Arizona, First Solar's Sun Streams 2 Project is expected to save 356 million liters of water annually.

Microsoft’s Arizona datacenters will use zero water for cooling for more than half the year, leveraging a method called adiabatic cooling, which uses outside air instead of water for cooling when temperatures are below 85 degrees Fahrenheit. When temperatures are above 85 degrees, an evaporative cooling system is used, similar to the “swamp coolers” found in residential homes. This system is highly efficient, using less electricity and a fraction of the water used by other water-based cooling systems, such as cooling towers.

For the last year, we have also been investing in water conservation to have a longer-lasting impact on replenishing water in Arizona to sustain water levels in Lake Mead, with the goal of supporting the state to meet its Drought Contingency Plan Commitments. Microsoft’s investment in this project has also generated a one-to-one cash match from the Water Funder Initiative that will support the state’s efforts and further expand project impact. The project will benefit the Colorado River Indian Tribes, ultimately resulting in more water in Lake Mead and more efficient water infrastructure.

Lastly, Microsoft and Gila River Water Storage, LLC are recharging and replenishing groundwater levels in the Phoenix Active Management Area with long term storage credits dedicated to the cities of Goodyear and El Mirage to balance a portion of Microsoft’s future water use, contributing an estimated additional 610,000 cubic meters. Microsoft is also collaborating with The Nature Conservancy to support water conservation in the Verde River Basin, installing a new pipe in the leakiest part of the Eureka Ditch to increase resilience for local farmers.

Supporting local growth, opportunities in Arizona

Through our Datacenter Community Development initiative, we are actively engaged in El Mirage, Goodyear, and across Arizona to advance community priorities in education, workforce development and community connection. These investments in local projects total more than $800,000, alongside employee volunteer time and community partnerships, to clean up the Gila River, provide WiFi connectivity for 1,000 students across the Navajo Nation and support the expansion of Mathematics, Engineering, Science Achievement (MESA) to serve more than 1,500 middle school and high school students across Arizona. In addition, Microsoft is collaborating with two Maricopa Community Colleges, Estrella Mountain Community College in Avondale and Glendale Community College in Glendale, to develop workforce training that prepares workers for jobs in the IT sector, including work in Microsoft datacenters.

The new datacenter region and related work are expected to create over 100 permanent jobs across a variety of functions, including mechanical engineers, electrical engineers and datacenter technicians, when the facilities are fully operational, as well as more than 1,000 construction jobs over the initial building phases. Once the datacenters are operating, they're expected to have an annual economic impact of approximately $20 million across communities in Arizona.
Source: Azure

Best practices for using Docker Hub for CI/CD

According to the 2020 JetBrains developer survey, 44% of developers are now using some form of continuous integration and deployment with Docker containers. We know a ton of developers have this set up using Docker Hub as their container registry for part of their workflow, so we decided to dig out the best practices for doing this and provide some guidance on how to get started. To support this, we will be publishing a series of blog posts over the next few weeks to answer the common questions we see with the top CI providers.

We have also heard feedback that, given the changes Docker introduced relating to network egress and the number of pulls for free users, there are questions around the best way to use Docker Hub as part of CI/CD workflows without hitting these limits. This blog post covers best practices that improve your experience, encourage sensible consumption of Docker Hub to mitigate the risk of hitting these limits, and explain how to increase the limits depending on your use case.

To get started, one of the most important things when working with Docker, and really any CI/CD, is to work out when you need to test with the CI and when you can do this locally. At Docker we think about how developers work in terms of their inner loop (code, build, run, test) and their outer loop (push change, CI build, CI test, deployment).

Before you think about optimizing your CI/CD, it is always important to think about your inner loop and how it relates to the outer loop (the CI). We know that most people aren't a fan of 'debugging via the CI', so it is always better if your inner loop and outer loop are as similar as possible. To this end it can be a good idea to run unit tests as part of your docker build command by adding a target for them in your Dockerfile. That way, as you are making changes and re-building locally, you can run the same unit tests you would run in the CI on your local machine with a simple command. Chris wrote a blog post earlier in the year about Go development with Docker; it is a great example of how you can use tests in your Docker project and re-use them in the CI. This creates a shorter feedback loop on issues and reduces the number of pulls and builds your CI needs to do.
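As a rough sketch of what that could look like, here is a minimal multi-stage Dockerfile with a dedicated test target for a hypothetical Go service; the module layout, image tags and stage names are illustrative, not taken from the post:

    # syntax=docker/dockerfile:1
    # Build stage: compile the application.
    FROM golang:1.15 AS build
    WORKDIR /src
    COPY go.mod go.sum ./
    RUN go mod download
    COPY . .
    # CGO disabled so the static binary also runs on the small release image below.
    RUN CGO_ENABLED=0 go build -o /out/app .

    # Test stage: reuses the layers above, so local runs and CI runs share one definition.
    FROM build AS test
    RUN go test -v ./...

    # Release stage: the only image you actually ship.
    FROM alpine:3.12 AS release
    COPY --from=build /out/app /usr/local/bin/app
    ENTRYPOINT ["/usr/local/bin/app"]

Locally you would then run something like DOCKER_BUILDKIT=1 docker build --target test . to execute the same tests the CI runs, and docker build --target release -t your-image . for the shippable image.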

Once you get into your actual outer loop and Docker Hub, there are a few things you can do to get the most out of your CI and deliver the fastest Docker experience.

First and foremost, stay secure! When you are setting up your CI, make sure you are using a Docker Hub access token rather than your password; you can create new access tokens from the security page on Docker Hub.
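For illustration, a typical non-interactive login step looks like the following; the secret names are placeholders for whatever your CI platform exposes:

    # Log in with an access token from the CI secrets store instead of a password.
    # DOCKERHUB_USERNAME and DOCKERHUB_TOKEN are placeholder variable names.
    echo "$DOCKERHUB_TOKEN" | docker login --username "$DOCKERHUB_USERNAME" --password-stdin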

Once you have this and have added it to whatever secrets store is available on your platform, you will want to look at when you push and pull in your CI/CD, and from where, depending on the change you are making. The first thing you can do here to reduce build time and reduce your number of calls is to make use of the build cache to reuse layers you have already pulled. This can be done on many platforms by using Buildx/BuildKit's caching functionality along with whatever cache your platform provides.
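One possible shape for this, assuming BuildKit/Buildx is available on your CI runner (the image and cache tag names below are made up for illustration):

    # Create (once) a builder that supports exporting cache to a registry.
    docker buildx create --use

    # Build, reusing cached layers from the previous run and publishing the updated
    # cache alongside the image so the next CI run can pull it instead of rebuilding.
    docker buildx build \
      --cache-from type=registry,ref=myorg/app:buildcache \
      --cache-to type=registry,ref=myorg/app:buildcache,mode=max \
      --tag myorg/app:latest \
      --push .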

The other change you may want to make is to have only your release images go to Docker Hub. This would mean setting up your pipeline to push PR images to a more local image store, where they can be quickly pulled and tested, rather than promoting them all the way to production.
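Sketched as a generic shell step, that split could look like the following; the branch variable, internal registry host and image name are placeholders for whatever your pipeline uses:

    # Push release builds to Docker Hub; keep PR builds in an internal registry.
    # BRANCH, PR_NUMBER and registry.internal.example.com are illustrative placeholders.
    IMAGE=myorg/app
    if [ "$BRANCH" = "main" ]; then
      docker tag "$IMAGE:latest" "docker.io/$IMAGE:latest"
      docker push "docker.io/$IMAGE:latest"
    else
      docker tag "$IMAGE:latest" "registry.internal.example.com/$IMAGE:pr-$PR_NUMBER"
      docker push "registry.internal.example.com/$IMAGE:pr-$PR_NUMBER"
    fi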

We know there are a lot more tips and tricks for using Docker in CI, but looking specifically at the recent Hub rate changes, we think these are the top things you can do.

If you are still finding you have issues with pull limits once you are authenticated, you can consider upgrading to either a Pro or a Team account. This will give you unlimited authenticated pulls from Docker Hub, along with unlimited private repos and unlimited image retention. In the near future this will also include image scanning (powered by Snyk) on push of new images to Docker Hub.

Look out for the next blog post in the series about how to put some of these practices into place with GitHub Actions, and feel free to give us ideas of which CI providers you would like to see us cover by dropping us a message on Twitter @Docker.
Source: https://blog.docker.com/feed/