Q&A: Steven Gonzalez on Indigenous futurist science fiction

Steven Gonzalez is a PhD candidate in the MIT Doctoral Program in History, Anthropology, Science, Technology, and Society (HASTS), where he researches the environmental impacts of cloud computing and data centers in the United States, Iceland, and Puerto Rico. He is also an author. Writing under the name E.G. Condé, he recently published his first book, “Sordidez.” It’s described as an “Indigenous futurist science fiction novella set in Puerto Rico and the Yucatán.” Set in the near future, it follows the survivors of civil war and climate disaster, led by protagonist Vero Diaz, as they reclaim their Indigenous heritage and heal their lands.

In this Q&A, Gonzalez describes the book’s themes, its inspirations, and its connection to research, people, and classes at MIT.

Q: Where did the inspiration for this story come from?

A: I actually began my time at MIT in September of 2017, when Hurricane María struck. It was a really difficult time for me at the Institute, starting a PhD program. And it’s MIT, so there’s a lot of pressure. I was still navigating the new institutional space and trying to understand my place in it. But I had a lot of people at the Institute who were extremely supportive during that time. I had family members in Puerto Rico who were stranded as a result of the hurricane, who I didn’t hear from for a very long time, who I feared dead. It was a very chaotic, confusing, and emotionally turbulent time for me, and it was incredibly difficult to try to be present in a PhD program that first semester. Karen Gardner, our administrator, was incredibly supportive through that. So were the folks at the MIT Association of Puerto Ricans, who hosted fundraisers and linked students with counseling resources. But the trauma of the hurricane stayed with me, especially the images of the aftermath in the town where my grandmother’s house was, where I spent summers as a child. To me, it was the greenest place I had ever known, and in those images it looked like somebody had torched the entire landscape. It was traumatizing to see that. But it seeded an idea: is there a way to burn without fire? There’s climate change, but there’s also climate terror. That’s one of the premises the book explores: geoengineering, but also its flip side, because the flip side of geoengineering and terraforming is, of course, climate terror. And in a way, we could frame what’s been happening with the fossil fuel industry as a form of climate terror as well. So for me, these dual tracks of thought began right when I started at MIT.

Q: What do you see as the core themes of your novella?

A: One major theme is rebuilding. As I said, this story was very influenced by the trauma of Hurricane María and the incredibly inspiring accounts from family members, from people in Puerto Rico that I know, of regular people stepping up when the government — both federal and local — essentially abandoned them. There were so many failures of governance. But people stepped up and did what they could to help each other. Neighbors banded together to clear trees from roads. They pooled resources to run generators so that everyone on the same street could have food that day. They shared medical supplies like insulin and other things that were scarce. This was incredibly inspiring for me. And a huge theme of the book is rebuilding in the aftermath of a fictive hurricane, which I call Teddy, named after President Theodore Roosevelt, under whom Puerto Rico’s journey as a U.S. commonwealth, or colony, began.

Healing is also a huge theme, in the sense that the story is also somewhat critical of Puerto Rican culture. It’s refracted through my own experience as a queer person navigating Puerto Rico, a very religious and traditional place, and a very complex place at that. The main character, Vero, is a trans man. As a result of his gender transition he has felt a lot of alienation; a lot of people don’t accept him or his identity, even though he’s incredibly helpful in the rebuilding effort, to the point where he is, in some ways, a leader, if not the leader. And the story becomes, in a way, about healing from the trauma of rejection too. That goes for Vero, but also for other characters who have gone through traumas that I think are very much shared across Latin America, like the experience of assimilation. Latin America is a very complex place. Spanish is our lingua franca, but there are many Indigenous languages that have not been valued, and people who speak them have been actively punished. There’s this deep trauma of losing language. In the case of Puerto Rico, the Indigenous language of the Taínos was destroyed by colonialism. The story is about rebuilding that language, about healing and “becoming.” In some ways, it’s about re-indigenization. And the last part, alongside healing and reconstruction, is transformation and metamorphosis: becoming Taíno. What does that mean? What does it mean to be an Indigenous Caribbean person in the future? That’s one of the central themes of the story.

Q: How does the novella intersect with the work you’re doing as a PhD candidate in HASTS?

A: My research on cloud computing is very much about climate change. It’s pitched within the context of climate change and understanding how our digital ecosystem contributes not only to global warming, but to things like desertification. As a social scientist, that’s what I study. My studies of infrastructure are also directly referenced in the book in a lot of ways. For instance, the now-collapsed Arecibo Ionosphere Observatory, where some of my pandemic fieldwork occurred, is a setting in the book. And also, I am an anthropologist, and I am Puerto Rican. I draw both from my personal experience and my anthropological lens to make a story that I think is very multicultural and multilingual. It’s set in Puerto Rico, but the other half is set in the Yucatán Peninsula, in what we might call the former Maya world. And there are a lot of intersections between the two settings, which go back to the deeper Indigenous history. Some people are calling this Indigenous futurism because it references the Taínos, who are the Indigenous people of Puerto Rico, but also the Mayas, the many different Maya groups found throughout the Yucatán Peninsula as well as present-day Guatemala and Honduras. The story is about exchange between these two worlds. For someone trained as an anthropologist, that’s a really difficult thing to pull off, and I think my training has really helped me achieve it.

Q: Are there any examples of ways being among the MIT community while writing this book influenced and, in some ways, made this project possible?

A: I relied on many of my colleagues for support. There’s some sign language in the book. In Puerto Rico, there’s a big tradition of sign language; there’s a version of American Sign Language called LSPR that’s found only in Puerto Rico. That’s something I’ve been aware of ever since I was a kid, but I’m not fluent in sign language, and I’m not deeply familiar with deaf communities and their culture. I got a lot of help from Timothy Loh, who’s in the HASTS program and who steered me toward sensitivity readers in the deaf community in his networks. My advisor, Stefan Helmreich, is very much a science fiction person in a lot of ways. His research is on ocean waves and the history and anthropology of biology. He’s done ethnography in deep-sea submersibles. He’s always thinking through a science-fictional lens, and he allowed me, for one of my qualifying exam lists, to mesh science fiction with social theory. That was another way I felt very supported by the Institute. In my coursework, I also took a few science fiction courses in other departments. I worked with Shariann Lewitt, who actually read the first version of the story. I workshopped it in her 21W.759 (Writing Science Fiction) class and got some really amazing feedback that led to what is now a publication and a dream fulfilled in so many ways. She took me under her wing and really believed in this book.
Source: Massachusetts Institute of Technology

Optimize the cost of .NET and Java application migration to Azure cloud

In today’s uncertain economic environment, cost is top of mind for every organization. With challenging global economic conditions, high inflation rates, and difficult job markets, many businesses are tightening their spending. Yet companies continue to prioritize substantial budget allocations for digital transformation, especially for the agility, performance, and security gained by migrating applications to the cloud. The reason is simple: investments in the cloud translate into positive impacts on business revenue and significant cost savings.

But how do businesses turn this opportunity into reality? In this article, we’ll look at several levers that Azure provides to help organizations maximize the cost benefits of migrating .NET and Java apps to the cloud. One thing to note about cost optimization is that it’s not only about the price. There are significant financial benefits to be gained when you leverage the right technical resources, have access to best practices drawn from real-world experience with thousands of customers, and have the flexibility of the right pricing option for any scenario. Together, these factors can result in a compelling total cost of ownership (TCO).

Let’s look at some of these benefits for Azure App Service customers below: 

Azure landing zone accelerators

Enterprise web app patterns

Powerful Azure Migrate automation tooling

Offers to offset the initial cost of migration

Cost-effective range of pricing plans

Faster time to value with expert guidance through landing zone accelerators  

For cloud migration projects, getting it right quickly from the start sets the foundation for business success and savings. Azure landing zone accelerators are prescriptive solution architectures and guidance that aid IT pros in preparing for migration and deployment of on-premises apps to the cloud.  

Provided at no additional cost and capturing the expert guidance from migrations completed with thousands of customers, landing zone accelerators are a compelling Azure differentiator that helps organizations focus on delivering value rather than spending cycles doing the heavy lifting of migration on their own. Based on well-architected principles and industry best practices for securing and scaling application and platform resources, these resources create tangible cost savings by reducing the time and effort needed to complete app migration projects.

Learn more about other landing zone accelerator workloads, and watch the Azure App Service landing zone accelerator demo. 

Enhance developer skilling with the reliable enterprise web app pattern

The reliable web app (RWA) pattern is another free resource from Azure, specifically designed to empower developers to confidently plan and execute the migration process. It is targeted both at cloud experts and at developers who may be more familiar with on-premises tools and solutions and are taking their first steps in the cloud. Built on the Azure Well-Architected Framework, this set of best practices helps developers successfully migrate web applications to the cloud and establishes a foundation for future innovation on Azure. We are pleased to announce that a reliable web app pattern for Java is now available, in addition to the .NET pattern announced at Build.

The reliable web app pattern provides guidance on the performance, security, operations, and reliability of web applications, with minimal changes required during the migration process. It smooths the learning curve and greatly shortens the migration project, sparing organizations the cost of maintaining on-premises infrastructure longer than necessary. The Azure Architecture Center provides comprehensive guidance, open-source reference implementation code, and CI/CD pipelines on GitHub. Check out the free, on-demand Microsoft Build 2023 session to learn more.

Accelerate the end-to-end migration journey with free automated tooling  

Costs of tooling and automation are often underestimated during migration projects. Azure Migrate is a free Microsoft tool for migrating and modernizing in Azure. It provides discovery, assessment, business case analysis, planning, migration, and modernization capabilities for various workloads on premises—all while allowing developers to run and monitor the proceedings from a single secure portal. Watch this short demo of the business case feature, and find Azure Migrate in the portal to get started.

Azure Migrate, Azure Advisor, and Azure Cost Management and Billing are components of this migration journey that provide guidance, insights, and the ability to right-size Azure resources for optimal cost-efficiency. 

Offset the initial cost of migration projects with Azure offerings

To alleviate risk and help jumpstart migration with confidence, Azure Migrate and Modernize partner offers are available to customers. The program not only helps build a sustainable plan to accelerate the cloud journey with the right mix of best practices, resources, and extensive guidance at every stage, but may also include agile funding to offset the initial costs.

With Azure Migrate and Modernize, moving to the cloud is efficient and cost-optimized with free tools like Azure Migrate and Azure Cost Management. Additionally, it supports environmentally sustainable outcomes and drives operational efficiencies, while reducing migration costs through tailored offers and incentives based on your specific needs and journey. Work with your Microsoft partner to take advantage of these offers in your enterprise app migration.  

Benefit from a wide range of flexible and cost-effective plans

Azure App Service is one of the oldest and most popular destinations for .NET and Java app migrations, with over two and a half million web apps and growing fast. It offers a wide range of flexible pricing options to save on compute costs. Azure Savings Plan for Compute is ideal if the flexibility to run dynamic workloads across a variety of Azure services is crucial. Reserved instances are another popular option, providing substantial cost savings for workloads with predictable resource needs. There are various pricing plans and tiers to suit every budget and need — from a new entry-level Premium v3 plan called P0v3, to large-scale plans that support up to 256 GB of memory. For hobbyists and learners, Azure App Service has one of the most compelling free tiers and continues to attract new developers every day.

Check out the Azure App Service pricing page and pricing calculator to learn more.  

Learn more

Interested in learning more? Dive deeper into the cost optimization strategies and see how other organizations have optimized their cost of migration with the following papers: 

Save up to 54 percent versus on-premises and up to 35 percent versus Amazon Web Services by migrating to Azure.

Forrester study finds 228 percent ROI when modernizing applications on Azure PaaS.

Plan to manage costs for App Service.

Read our customer stories, including from the NBA, a leading sports association in the United States, and Nexi, a leading European payment technology company.

Follow Azure App Service on Twitter.
The post Optimize the cost of .NET and Java application migration to Azure cloud appeared first on Azure Blog.
Source: Azure

Scale generative AI with new Azure AI infrastructure advancements and availability

Generative AI is a powerful and transformational technology that has the potential to advance a wide range of industries from manufacturing to retail, and financial services to healthcare. Our early investments in hardware and AI infrastructure are helping customers to realize the efficiency and innovation generative AI can deliver. Our Azure AI infrastructure is the backbone of how we scale our offerings, with Azure OpenAI Service at the forefront of this transformation, providing developers with the systems, tools, and resources they need to build next-generation, AI-powered applications on the Azure platform. With generative AI, users can create richer user experiences, fuel innovation, and boost productivity for their businesses.  

As part of our commitment to bringing the transformative power of AI to our customers, today we’re announcing updates to how we’re empowering businesses with Azure AI infrastructure and applications. With the global expansion of Azure OpenAI Service, we are making OpenAI’s most advanced models, GPT-4 and GPT-35-Turbo, available in multiple new regions, providing businesses worldwide with unparalleled generative AI capabilities. Our Azure AI infrastructure is what powers this scalability, and we continue to invest in and expand it. We’re also delivering the general availability of the ND H100 v5 Virtual Machine series, equipped with NVIDIA H100 Tensor Core graphics processing units (GPUs) and low-latency networking, propelling businesses into a new era of AI applications.

Here’s how these advancements extend Microsoft’s unified approach to AI across the stack.  

General availability of ND H100 v5 Virtual Machine series: Unprecedented AI processing and scale

Today marks the general availability of our Azure ND H100 v5 Virtual Machine (VM) series, featuring the latest NVIDIA H100 Tensor Core GPUs and NVIDIA Quantum-2 InfiniBand networking. This VM series is meticulously engineered with Microsoft’s extensive experience in delivering supercomputing performance and scale to tackle the exponentially increasing complexity of cutting-edge AI workloads. As part of our deep and ongoing investment in generative AI, we are leveraging an AI-optimized cluster of 4,000 GPUs and will be ramping to hundreds of thousands of the latest GPUs in the next year.

The ND H100 v5 is now available in the East United States and South Central United States Azure regions. Enterprises can register their interest in access to the new VMs or review technical details on the ND H100 v5 VM series at Microsoft Learn.  

The ND H100 v5 VMs include the following features today: 

AI supercomputing GPUs: Equipped with eight NVIDIA H100 Tensor Core GPUs, these VMs promise significantly faster AI model performance than previous generations, empowering businesses with unmatched computational power.

Next-generation central processing unit (CPU): Understanding the criticality of CPU performance for AI training and inference, we have chosen the 4th Gen Intel Xeon Scalable processors as the foundation of these VMs, ensuring optimal processing speed.

Low-latency networking: The inclusion of NVIDIA Quantum-2 ConnectX-7 InfiniBand, with 400 Gb/s per GPU and 3.2 Tb/s of cross-node bandwidth per VM, ensures seamless performance across the GPUs, matching the capabilities of top-performing supercomputers globally.

Optimized host-to-GPU performance: With PCIe Gen5 providing 64 GB/s of bandwidth per GPU, Azure achieves significant performance advantages between CPU and GPU.

Large scale memory and memory bandwidth: DDR5 memory is at the core of these VMs, delivering greater data transfer speeds and efficiency, making them ideal for workloads with larger datasets.
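The quoted cross-node bandwidth follows directly from the per-GPU number; a quick arithmetic check (assuming eight GPUs per VM, as listed above):

```python
# Back-of-the-envelope check of the ND H100 v5 networking figures:
# 8 NVIDIA H100 GPUs per VM, each with a 400 Gb/s InfiniBand link.
GPUS_PER_VM = 8
GBITS_PER_GPU = 400

per_vm_gbits = GPUS_PER_VM * GBITS_PER_GPU  # aggregate cross-node bandwidth
per_vm_tbits = per_vm_gbits / 1000          # convert Gb/s to Tb/s

print(f"{per_vm_gbits} Gb/s per VM = {per_vm_tbits} Tb/s")  # 3200 Gb/s = 3.2 Tb/s
```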

These VMs have proven their performance prowess, delivering up to six times the speedup in matrix multiplication operations when using the new 8-bit FP8 floating-point data type compared to FP16 in previous generations. The ND H100 v5 VMs achieve up to two times the speedup in end-to-end inference of large language models such as BLOOM 175B, demonstrating their potential to further optimize AI applications.

Azure OpenAI Service goes global: Expanding cutting-edge models worldwide

We are thrilled to announce the global expansion of Azure OpenAI Service, bringing OpenAI’s cutting-edge models, including GPT-4 and GPT-35-Turbo, to a wider audience worldwide. Our new live regions in Australia East, Canada East, East United States 2, Japan East, and United Kingdom South extend our reach and support for organizations seeking powerful generative AI capabilities. With the addition of these regions, Azure OpenAI Service is now available in even more locations, complementing our existing availability in East United States, France Central, South Central United States, and West Europe. The response to Azure OpenAI Service has been phenomenal, with our customer base nearly tripling since our last disclosure. We now proudly serve over 11,000 customers, attracting an average of 100 new customers daily this quarter. This remarkable growth is a testament to the value our service brings to businesses eager to harness the potential of AI for their unique needs.

As part of this expansion, we are increasing the availability of GPT-4, Azure OpenAI’s most advanced generative AI model, across the new regions. This enhancement allows more customers to leverage GPT-4’s capabilities for content generation, document intelligence, customer service, and beyond. With Azure OpenAI Service, organizations can propel their operations to new heights, driving innovation and transformation across various industries.
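To make the regional-deployment model concrete, the sketch below builds the REST endpoint used for a chat-completions call against an Azure OpenAI deployment. The resource name, deployment name, and API version here are placeholder assumptions, not values from this announcement; substitute the ones from your own Azure OpenAI resource.

```python
# Minimal sketch: addressing a chat-completions request in Azure OpenAI.
# Each Azure OpenAI resource lives in a region; a model (e.g. GPT-4) is
# exposed through a named deployment on that resource.

def chat_completions_url(resource: str, deployment: str, api_version: str) -> str:
    """Build the Azure OpenAI chat-completions REST endpoint for a deployment."""
    return (
        f"https://{resource}.openai.azure.com/openai/deployments/"
        f"{deployment}/chat/completions?api-version={api_version}"
    )

# Placeholder names for illustration only.
url = chat_completions_url("my-resource", "gpt-4", "2023-05-15")
print(url)
```

Requests are then POSTed to this URL with an `api-key` header and a JSON body of chat messages.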

A responsible approach to developing generative AI

Microsoft’s commitment to responsible AI is at the core of Azure AI and Machine Learning. The AI platform incorporates robust safety systems and leverages human feedback mechanisms to handle harmful inputs responsibly, ensuring the utmost protection for users and end consumers. Businesses can apply for access to Azure OpenAI Service and unlock the full potential of generative AI to propel their operations to new heights.

We invite businesses and developers worldwide to join us in this transformative journey as we lead the way in AI innovation. Azure OpenAI Service stands as a testament to Microsoft’s dedication to making AI accessible, scalable, and impactful for businesses of all sizes. Together, let’s embrace the power of generative AI and Microsoft’s commitment to responsible AI practices to drive positive impact and growth worldwide.

Customer inspiration

Generative AI is revolutionizing various industries, including content creation and design, accelerated automation, personalized marketing, customer service, chatbots, product and service innovation, language translation, autonomous driving, fraud detection, and predictive analytics. We are inspired by the way our customers are innovating with generative AI and look forward to seeing how customers around the world build upon these technologies.

Mercedes-Benz is innovating its in-car experience for drivers, powered by Azure OpenAI Service. The upgraded “Hey Mercedes” feature is more intuitive and conversational than ever before. KPMG, a global professional services firm, leverages our service to improve its service delivery model, achieve intelligent automation, and enhance the coding lifecycle. Wayve trains large-scale foundational neural networks for autonomous driving using Azure Machine Learning and Azure’s AI infrastructure. Microsoft partner SymphonyAI launched Sensa Copilot to empower financial crime investigators to combat the burden of illegal activity on the economy and organizations. By automating data collection, collation, and summarization of financial and third-party information, Sensa Copilot identifies money laundering behaviors and facilitates quick and efficient analysis for investigators. Discover all Azure AI and ML customer stories.

Learn more

Resources and getting started with Azure AI  

Azure AI Portfolio 

Explore Azure AI. 

Azure AI Infrastructure 

Apply now for the ND H100 v5 Virtual Machine series.

Review Azure AI Infrastructure documentation. 

Read more about Microsoft AI at Scale. 

Read more about Azure AI Infrastructure.

Azure OpenAI Service 

Apply now for access to Azure OpenAI Service. 

Apply now for access to GPT-4. 

Review Azure OpenAI Service documentation.

Explore the playground and customization in Azure AI Studio.  

The post Scale generative AI with new Azure AI infrastructure advancements and availability appeared first on Azure Blog.
Source: Azure

7 ways generative AI is bringing bionic business to manufacturing

Generative AI is transforming what we know, and when we know it. Fast access to knowledge is being used in the world of manufacturing, where AI’s ability to design, customize, and accurately predict potential defects allows businesses to optimize costs. Microsoft, a global technology leader, has strategically positioned itself at the forefront of the manufacturing industry revolution, employing a potent combination of its strong partnerships, cutting-edge cloud services, and revolutionary technologies like Azure OpenAI Service, the Internet of Things (IoT), and mixed reality. The company’s visionary approach revolves around empowering manufacturers with intelligent, interconnected systems that revolutionize productivity, enhance product quality, and optimize operational efficiency, driving the industry toward unprecedented levels of success and innovation.

The impact of generative AI

By fostering strategic alliances with key players across the manufacturing ecosystem, Microsoft has cultivated a collaborative environment that fuels creativity and cooperation. Through these partnerships, the tech giant gains valuable insights into industry pain points and emerging challenges, enabling them to develop tailor-made solutions that cater to the specific needs of manufacturers worldwide.  

Below, we take a look under the hood of generative AI’s transformational prowess.

Collect and leverage data—Strabag SE, the global construction company, partnered with Microsoft to build a Data Science Hub to collect decentralized data and leverage it for insights. This enabled the organization to develop use cases that prove the value of data, including its risk management project. The solution uses an algorithm to pinpoint at-risk construction projects, saving Strabag SE time and reducing financial losses.

Product customization—By leveraging customer input and preferences, manufacturers can use generative AI algorithms to create personalized designs or adapt existing designs to suit specific needs, thereby enhancing customer satisfaction and meeting diverse market demands without compromising efficiency. 

Process optimization—Generative AI can identify patterns, inefficiencies, and potential improvements, leading to enhanced productivity, reduced waste, and optimized resource allocation. By continuously learning from real-time data, generative AI can adapt and optimize production systems to maximize output and minimize costs.

Rapid prototyping—Generative AI can explore a vast design space, providing innovative solutions that might not be immediately apparent to the human eye. Modern Requirements built their solution on Microsoft Azure DevOps and integrated with Azure OpenAI Service, providing the essential requirements tools to effectively manage projects throughout their life cycles. Doing so allowed them to reduce time to market and improve project quality across a multitude of industries—all of which require regulatory compliance.  

Quality control—Generative AI can assist in quality control processes by analyzing large volumes of data collected during production. By identifying patterns and correlations, it can detect anomalies, predict potential defects, and provide insights into quality issues. Manufacturers can use this information to implement preventive measures, reduce product defects, and enhance overall product quality.  

Supply chain optimization—Generative AI can optimize supply chain operations by analyzing historical data, demand forecasts, and external factors. It can generate optimized production schedules, predict demand fluctuations, and optimize inventory levels. This helps manufacturers minimize stockouts, reduce lead times, and improve overall supply chain efficiency. 

Maintenance and predictive analytics—Generative AI can analyze real-time sensor data from manufacturing equipment to identify potential failures or maintenance needs. By detecting patterns and anomalies, it can predict equipment failures, schedule maintenance proactively, and optimize maintenance processes. This approach helps reduce downtime, improve equipment reliability, and increase overall operational efficiency.
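As a toy illustration of the pattern-and-anomaly idea described above (a sketch, not a Microsoft product sample), the snippet below flags sensor readings that fall far outside a rolling baseline, the simplest form of the anomaly detection that predictive-maintenance systems build on:

```python
# Flag sensor readings that deviate more than `threshold` standard
# deviations from the mean of the preceding `window` readings.
from statistics import mean, stdev

def flag_anomalies(readings, window=5, threshold=3.0):
    """Return indices of readings far outside the rolling baseline."""
    anomalies = []
    for i in range(window, len(readings)):
        baseline = readings[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and abs(readings[i] - mu) > threshold * sigma:
            anomalies.append(i)
    return anomalies

# Hypothetical equipment temperatures; the spike suggests a fault.
temps = [70.1, 70.3, 69.9, 70.2, 70.0, 70.1, 95.4, 70.2]
print(flag_anomalies(temps))  # [6]
```

Production systems replace this rolling-statistics baseline with trained models, but the workflow (learn normal behavior, flag deviations, schedule maintenance) is the same.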

Microsoft aims to enable seamless connectivity, data analysis, and AI-driven insights across the production process. By leveraging Azure OpenAI Service’s capabilities, manufacturers can optimize production operations, improve equipment maintenance, and enhance product quality.

Our commitment to responsible AI 

Microsoft has a layered approach for generative models, guided by the Microsoft AI Principles. In Azure OpenAI, an integrated safety system provides protection from undesirable inputs and outputs and monitors for misuse. In addition, Microsoft provides guidance and best practices to help customers responsibly build applications using these models and expects customers to comply with the Azure OpenAI Code of Conduct.  

Get started with Azure OpenAI Service 

Apply for access to Azure OpenAI Service by completing this form. 

Learn about Azure OpenAI Service and the latest enhancements. 

Get started with OpenAI GPT-4 in Azure OpenAI Service in Microsoft Learn. 

Read our Partner announcement blog, ”Empowering partners to develop AI-powered apps and experiences with ChatGPT in Azure OpenAI Service.” 

Learn how to use the new Chat Completions API (preview) and model versions for ChatGPT and GPT-4 models in Azure OpenAI Service. 

The post 7 ways generative AI is bringing bionic business to manufacturing appeared first on Azure Blog.
Source: Azure

Efficiently store data with Azure Blob Storage Cold Tier — now generally available

We are excited to announce the general availability of Azure Blob Storage Cold Tier in all public and Azure Government regions except Poland Central and Qatar Central. Azure Blob Storage Cold Tier is an online tier specifically designed for efficiently storing data that is infrequently accessed or modified, all while ensuring immediate availability.

Azure Blob Storage is optimized for storing massive amounts of unstructured data. With blob access tiers, you can store your data in the most cost-effective way, based on how frequently it will be accessed and how long it will be retained. Azure Blob Storage now includes a new cold online access tier option, further reducing costs.

Across diverse industries, Azure customers are harnessing the power of blob storage to address a wide range of needs. With the introduction of the new tier, customers can now experience remarkable benefits in scenarios such as backing up media content, preserving medical images, and securing critical application data for seamless business continuity and robust disaster recovery.

Cost effectiveness with cold tier

Cold tier is the most cost-effective Azure Blob Storage offering for storing infrequently accessed data with long-term retention requirements while maintaining instant access. Blob access tiers maximize cost savings based on data access patterns. When your data isn’t needed for 30 days, we recommend tiering it from hot to cool to save up to 46 percent on capacity prices (using East US 2 pricing as an example). When your data is even cooler, for example if you don’t require access for 90 days or longer, cold tier yields more savings. Compared to cool tier, cold tier can save you an additional 64 percent on capacity costs (again using East US 2 pricing as an example; hot to cold tier savings are 80 percent). See detailed prices in Blob Storage pricing and Azure Data Lake Storage Gen2 pricing.
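The quoted savings percentages can be reproduced from per-GB capacity prices. The prices below are assumptions for illustration (approximate East US 2 list prices per GB per month at the time of writing); check the pricing page for current values.

```python
# Assumed per-GB-per-month capacity prices (illustrative, East US 2 style).
hot, cool, cold = 0.0184, 0.0100, 0.0036

def savings(frm: float, to: float) -> int:
    """Percentage saved on capacity when moving from tier `frm` to `to`."""
    return round((1 - to / frm) * 100)

print(savings(hot, cool))   # 46  (hot -> cool)
print(savings(cool, cold))  # 64  (cool -> cold)
print(savings(hot, cold))   # 80  (hot -> cold)
```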

Prices for read operations are higher on cooler access tiers, so read patterns and file-size distribution affect cost-effectiveness. We recommend calculating total cost from both operation and capacity costs. The chart below shows how total cost differs between cool tier and cold tier based on how long you keep the data in the tier.

In the scenario above, the total cost estimate assumes 10 TiB of data in total, a 10 MiB average blob size, one read per month, and 10 percent of the total data read each time. If you keep data for 30 days, total cost is lower on the cool tier than the cold tier. If you keep data for 60 days or longer, cold tier is the more cost-effective option.
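One driver of this crossover is minimum retention: in Azure Blob Storage, cool tier has a 30-day and cold tier a 90-day early deletion period, so data removed from cold tier before 90 days is billed as if kept for the full period. The sketch below models capacity cost only, with hypothetical placeholder prices (not actual Azure prices; a real comparison should also include read, write, and retrieval charges as described above).

```python
# Simplified capacity-only cost model for the cool-vs-cold breakeven.
# Prices are hypothetical placeholders, NOT actual Azure prices.
COOL_PRICE_PER_GIB_MONTH = 0.0100  # hypothetical cool-tier capacity price
COLD_PRICE_PER_GIB_MONTH = 0.0036  # hypothetical: 64% below cool
COOL_MIN_DAYS = 30                 # early-deletion period on cool tier
COLD_MIN_DAYS = 90                 # early-deletion period on cold tier

def capacity_cost(price_per_gib_month, gib, days_kept, min_days):
    # Blobs deleted before the minimum retention period are billed
    # as if they had been kept for the full minimum.
    billed_days = max(days_kept, min_days)
    return price_per_gib_month * gib * billed_days / 30

GIB = 10 * 1024  # 10 TiB, as in the scenario above
for days in (30, 60, 90):
    cool = capacity_cost(COOL_PRICE_PER_GIB_MONTH, GIB, days, COOL_MIN_DAYS)
    cold = capacity_cost(COLD_PRICE_PER_GIB_MONTH, GIB, days, COLD_MIN_DAYS)
    print(f"{days} days: cool=${cool:.2f}, cold=${cold:.2f}")
```

Under these placeholder prices, cool is cheaper at 30 days (the 90-day minimum dominates cold's bill), while cold wins at 60 days and beyond, matching the pattern described in the scenario above.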

See detailed guidance on calculating total cost with access tiers in the choose the most cost-efficient access tiers documentation.

Seamless experience with cold tier

Cold tier is as easy to use as hot tier and cool tier. REST APIs, SDKs, the Azure portal, Azure PowerShell, the Azure CLI, and Azure Storage Explorer have all been extended to support cold tier. You can use the latest version of these clients to write, manage, and read your data directly from cold tier. Read latency on the cold tier is in milliseconds.

Lifecycle management policies can also automatically tier blobs to cold based on conditions including modification time, creation time, and last access time. See more in the Blob lifecycle policy documentation.
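As a sketch, a lifecycle management rule that moves block blobs to cold tier after 90 days without modification might look like the following. The rule name and the 90-day threshold are illustrative choices; check the Blob lifecycle policy documentation for the current schema and supported conditions.

```json
{
  "rules": [
    {
      "enabled": true,
      "name": "tier-to-cold-after-90-days",
      "type": "Lifecycle",
      "definition": {
        "actions": {
          "baseBlob": {
            "tierToCold": { "daysAfterModificationGreaterThan": 90 }
          }
        },
        "filters": { "blobTypes": [ "blockBlob" ] }
      }
    }
  ]
}
```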

Cold tier is supported on both Blob Storage and Azure Data Lake Storage Gen2. Locally redundant storage (LRS), geo-redundant storage (GRS), read-access geo-redundant storage (RA-GRS), zone-redundant storage (ZRS), geo-zone-redundant storage (GZRS), and read-access geo-zone-redundant storage (RA-GZRS) are all supported, subject to regional redundancy availability. See more in Azure Storage data redundancy.

There are some features that are not yet compatible with cold tier. Check the latest cold tier limitations to ensure compatibility with your scenario.

Empowering customers and partners to maximize savings

Customers across industries can use cold tier to improve the cost efficiency of object storage on Azure without compromising read latency. Since we launched the preview for cold tier in May 2023, our customers and partners have used this capability on Azure to store data that is infrequently accessed or modified. Here are some quotes from customers and partners:

“AvePoint leverages multiple cloud storage types to provide cost-effective and intelligent tiering solutions for our customers. The inclusion of Azure Blob cold tier storage in our storage architecture is a significant enhancement that has the potential to boost our future return on investment. We are thrilled to witness the general availability of this service as it empowers us to provide even greater flexibility to our customers.” — George Wang, Chief Architect at AvePoint.

“Commvault is committed to ensuring customers can take advantage of latest advancements on Azure Blob Storage for their enterprise data protection & management needs. We are proud to support cold tier as a storage option with Commvault Complete and our Metallic SaaS offering later this year. Commvault’s unique compression and data packing approach, integrated with cold tier’s policy-based tiering and cost-efficient retention, empowers customers to efficiently defend and recover against ransomware, all while ensuring compliance and cost-efficient, on-demand access to data.” — David Ngo, Chief Technology Officer, Commvault.

“Embracing Azure Blob cold tier storage, MediaValet empowers customers with the power of instant retrieval, even for rarely accessed digital assets that are conventionally archived, eliminating workflow disruptions and administrative burdens. Customers can easily take advantage of cold tier storage in their existing workflows with our solution, experiencing no delays in retrieval and enjoying the same enterprise-class cloud storage solution.” — Jean Lozano, Chief Technology Officer at MediaValet.

“Nasuni has qualified and now fully supports the newly released Azure Blob Storage cold tier. Azure’s new cold tier, which supports online access to objects, will help joint customers generate significant cost savings while managing their files on Nasuni, which is built on the Azure Blob platform. Nasuni plans to leverage this cold tier of Blob Storage in pursuit of supporting customer migrations as part of their digital transformation journey.” — Russ Kennedy, Nasuni’s Chief Product Officer.

“Building on our existing storage integration options for Microsoft customers, Veeam is excited to announce support for Azure Blob cold tier storage in the next release of the Veeam Data Platform. Cold tier support provides instant access, while offering a cost-effective price point between other on-demand tiers like hot and cool, and the offline archive tier. As the importance of data protection and ransomware recovery continues to grow, we remain committed to providing our customers with robust solutions.” — Danny Allan, CTO at Veeam.

Getting started with cold tier

Learn more about all blob access tiers, including Hot, Cool, Cold, and Archive, in Blob access tier overview.

Understand how cold tier prices compare with other access tiers in Blob storage pricing and Azure Data Lake Storage Gen2 pricing.

Learn more about regional availability in Azure Products by Region.

Learn how to configure tiers on blobs for cold tier in Configure blob access tiers.

The post Efficiently store data with Azure Blob Storage Cold Tier — now generally available appeared first on Azure Blog.

Cloud Cultures, Part 2: Global collaboration in Sweden

The outcomes of cloud adoption are shaped dramatically by the people and cultures that operate and innovate with the technology. The rapid pace of technological advancement we are seeing on a global scale is exciting, but if there is one thing I love more than technology, it is the people, the stories, and the life experiences that influence how it's used. These stories show firsthand how technology and tradition combine to form cloud cultures. In our first episode, we explored how the people of Poland are fearless when it's time to act: Poland is a dynamic country embracing change, reinventing itself, and creating innovative new opportunities. Conversations with our customers and community leaders there gave me an important view of how much Poland's history shapes its present and future markets. In the second episode, see what I learned about cloud culture in Sweden.

Sweden: A global-first mindset

Our first Sweden datacenter region launched in 2021 and will grow to be one of our largest datacenter regions in Europe. My time in Sweden helped me understand why this region is growing so fast.

I learned firsthand how Swedes transform simple ideas into global successes. For Sweden, success knows no borders. This is a place that thinks beyond its own perimeter, because the market demands it and success depends on it. Despite being one of the largest countries in Europe by landmass, Sweden has a comparatively small population, which forces its ambitious entrepreneurs to adopt a global-first mindset from day one. Collaborating with people around the world who have different mindsets is one of the biggest challenges companies face as they scale globally; it requires tapping into those diverse perspectives to create a better outcome. This is what drew me to Sweden. Their holistic approach to innovation has created an environment that fosters collaboration at scale and enables them to thrive.

This focus on collaboration helped me better understand "fika". While the term "fika" translates to "coffee" in English, I learned in Sweden that fika is much more than that. Fika is an experience that does involve coffee and cookies, but it is more about conversation and connection; it centers on the human power of collaboration. This idea of fika is woven into Sweden's culture of innovation, and it became clear that when shared beliefs underpin collaboration, the impact can be extraordinary.

Our conversations with customers and partners helped me see how the powerful winds of innovation have converged with local customs, values, and ways of living to create something unique.

How are Swedish customers using the cloud?

These conversations helped uncover the essence of Sweden’s digital transformation while exploring the country’s dynamic technology landscape. Below are just a few of the Swedish customers who are transforming their businesses to adapt to the growing needs of their customers in Sweden, and beyond:

Storekey is a Stockholm-based startup helping retail businesses flourish and meet the demands of an ever-evolving industry. Storekey removes friction for both consumers and retailers through an autonomous technology platform that brings the benefits of e-commerce to physical stores.

Handelsbanken, one of Sweden's top banks, provides universal banking services through a nationwide network of branches in Sweden, Norway, Denmark, Finland, the Netherlands, and the United Kingdom, and is built on a strong emotional bond with its customers. The bank found that cloud services increased its capacity for innovation, improved employee experiences, and created better interactions with its customers. Adopting the latest cloud technology has helped Handelsbanken innovate, in a trusted way, in collaboration with its customers.

Swedbank is a multinational bank, based in Stockholm, that saw the flexibility and scalability of the cloud as a way to innovate, using AI and machine learning to enhance security measures against criminal activity such as bank fraud.

Building a sustainable future

On my trip to Sweden, I sat down with Annika Ramsköld, the Chief Sustainability Officer at Vattenfall, an energy company making waves with its commitment to a fossil-free future. Sustainability is not a trendy buzzword in Sweden; it is fundamental for Swedish organizations and their partners, and Vattenfall is very focused on holding its suppliers and partners, such as Microsoft, to the same sustainability requirements.

“It is our purpose. Everything we do, we want to help the entire society to be fossil free. That means every little piece of the supply chain, whether it is transport, or the way you extract materials, or the way you produce that material, should be fossil free and be done in a responsible way.” — Annika Ramsköld

I couldn't agree more with Annika. Our own corporate commitments to be carbon negative, water positive, and zero waste echo similar commitments by Vattenfall. Our partnership with Vattenfall has helped make our Sweden Central datacenter region one of our most sustainable regions globally, and it is an example of how a partnership with a common vision can help us bring sustainable services to our customers.

The reach of cloud technology

Technology does not define people and culture; instead, culture defines technology and how we use it. In Sweden, I learned that their approach to collaboration, embodied in fika, has shaped how they use technology, bringing Swedish innovation to the entire world. I can't wait for my next trip to learn even more.

Watch the Cloud Cultures: Sweden episode today.
The post Cloud Cultures, Part 2: Global collaboration in Sweden appeared first on Azure Blog.