Pixma MX490, MX492, MB2010 and MG7520: Canon printers stuck in a boot loop
A DNS bug apparently renders some Canon printers unusable. The community has found a workaround; Canon is still working on one. (Printers, DNS)
Source: Golem
An unpatched security vulnerability in Confluence allows attackers to execute code on the server. It is already being actively exploited. (Atlassian, servers)
Source: Golem
By exploiting security vulnerabilities in Intel's firmware, attackers can keep their attacks hidden for a very long time and sabotage countermeasures. (Ransomware, servers)
Source: Golem
Founded in 2012, SingleStoreDB is a distributed SQL, cloud-native database that offers ultra-fast, low-latency access to large datasets, simplifying the development of modern enterprise applications. And by combining transactional (OLTP) and analytical (OLAP) workloads, SingleStoreDB introduces new efficiencies into data architecture. We are a multicloud database, meaning that our customers can launch SingleStoreDB on any cloud infrastructure they choose. At the same time, our alignment with Google Cloud has been foundational to our success.

SingleStoreDB's patented three-tier storage architecture spans memory, local disk, and object storage. This architecture enables fast inserts, large volumes of data storage, and overall better scalability and TCO.

Let me tell you how Google Cloud Marketplace has helped us grow our business from a startup in 2011 to one that 12 Fortune 50 companies and half of the top 10 banks in the U.S. rely upon today.

Filling a critical data niche to accelerate business value

First, let's talk about why companies choose to use SingleStoreDB alongside Google Cloud and other IT solutions. SingleStoreDB on Google Cloud is ideal for data-intensive applications with these five key requirements:

- Query latency — execute and receive query results with millisecond latencies
- Concurrency — support a large number of users or concurrent queries without sacrificing latency
- Query complexity — handle both simple and complex queries
- Data size — effortlessly operate over large data sets
- Data ingest speed — load (or ingest) data at very high rates, from thousands to millions of rows per second

Every customer approaches IT differently. Many are taking either a multicloud or a hybrid approach to their infrastructure. By running on Google Cloud, SingleStoreDB helps customers manage workloads across all clouds and on-prem systems. We find that many of our customers use SingleStoreDB as a single-pane view into all their data and infrastructure, as well as for data-intensive applications. Because SingleStoreDB is an analytical and transactional database, it shines when handling workloads that require outstanding accuracy and speed.

Especially when it comes to our larger customers, companies increasingly run SingleStoreDB on Google Cloud thanks to the Global Network Backbone. Because Google Cloud owns its global fiber network, we tap into Google's secure-by-design infrastructure to protect information, identities, applications, and devices. This, alongside the increasing popularity of Google Cloud AI/ML solutions, makes SingleStore on Google Cloud a growing force.

Success on Google Cloud Marketplace

While the partnership between SingleStoreDB and Google Cloud solutions is paramount to our success, the availability of our solution on Google Cloud Marketplace amplifies customer satisfaction. Google Cloud Marketplace simply makes it easier for partners to sell to Google Cloud customers. Here are some of the advantages we've seen since launching on Google Cloud Marketplace:

- Our customers can easily transact and deploy via Marketplace, providing that true Software-as-a-Service feel for SingleStoreDB.
- Transacting through Marketplace counts toward our customers' Google Cloud commit, which helps partners like us tap into budget that customers have already set aside for Google Cloud spend.
- Billing and invoicing are intuitive and efficient, as everything is automated through Google Cloud Marketplace.
- Our customers receive a single, clear invoice from Google covering SingleStoreDB, Google Cloud, and other partner solutions purchased from Marketplace.
- Many of our customers have previously signed the Google Cloud User License Agreement (ULA), which further expedites procurement by speeding up legal reviews.

Transforming how we sell

All of these advantages translate into a powerful change in our sales strategy. We no longer get bogged down in numbers and contracts. Instead, we can focus on educating customers about how using our solutions on Google Cloud will deliver returns to their business. Co-selling with Google Cloud has benefited our business significantly and provides customers with the best possible experiences.

We have also seen surprising advantages stem from our success on Google Cloud Marketplace. Our product and engineering teams are much more engaged with Google Cloud technologies now, putting a heavy emphasis on new integrations. We're actively exploring new Dataflow templates, native connectors for BigQuery and Data Studio, and other solutions.

What SingleStoreDB customers are saying

To give you a better idea of how our customers are benefiting, here are some recent comments about running SingleStoreDB on Google Cloud and transacting through Google Cloud Marketplace:

"Our goal has and will always be to build our platform in a way that makes it feel like an on-premise solution. Speed of data processing and delivering are critical to providing this within our solutions. Both Google Cloud and SingleStore have helped us achieve this." — Benjamin Rowe, Cloud & Security Architect, Arcules

"The SingleStore solution on Google Cloud allows marketing teams using Factors.ai to make more informed data decisions by organizing data in one system and allowing for self-serve analytics." — Praveen Das, Co-Founder, Factors.ai

At the end of the day, our customer experiences and the business impacts they achieve with our solutions are our most critical KPIs. By partnering with Google Cloud, we have unlimited potential to improve our services to power the next generation of data and analytics applications. Discover how SingleStore can transform your business on the Google Cloud Marketplace.
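To make the five key application requirements listed earlier a bit more concrete, here is a minimal Python sketch of running a query against SingleStoreDB over its MySQL-compatible wire protocol; the endpoint, credentials, database, and table are placeholders for illustration, not details from this post.

```python
# Minimal sketch: a low-latency aggregate query against SingleStoreDB, which
# speaks the MySQL wire protocol, so a standard client such as PyMySQL works.
# Host, credentials, database, and table names are placeholders.
import time
import pymysql

conn = pymysql.connect(
    host="svc-example.singlestore.com",  # placeholder endpoint
    port=3306,
    user="admin",
    password="<password>",
    database="demo",
)
try:
    with conn.cursor() as cur:
        start = time.perf_counter()
        cur.execute("SELECT status, COUNT(*) FROM orders GROUP BY status")
        rows = cur.fetchall()
        elapsed_ms = (time.perf_counter() - start) * 1000
        print(f"{len(rows)} groups returned in {elapsed_ms:.1f} ms")  # rough latency check
finally:
    conn.close()
```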
Source: Google Cloud Platform
Google Cloud VMware Engine (GCVE) lets you deploy a managed VMware environment within an enterprise cloud solution. We've put together a new white paper, "Google Cloud VMware Engine Performance Migration & Benchmarks," to help our customers better understand the architecture, its performance, and the benefits. If you're not familiar with Google Cloud VMware Engine yet, let's talk a bit more about it.

Utilizing Google Cloud lets you access existing services and cloud capabilities; one of the services mentioned within this document is the Hybrid Cloud Extension, also known as HCX. HCX provides an easier transition from on-premises to the cloud, allowing systems administrators to quickly deploy a private cloud and scale their virtual machines as needed. The referenced solution is well suited for organizations that want to begin their cloud migration journey and understand the technical requirements of the process without having to be fully committed to a cloud strategy or a data center evacuation strategy.

Currently, many organizations are navigating their way through IT challenges and cloud solutions. Google Cloud VMware Engine provides an easy on-ramp for migrating your workloads into the cloud. You don't have to move everything to the cloud at once, because GCVE gives you the option to scale your IT infrastructure from on-premises to the cloud at your discretion by leveraging HCX. HCX also lets you migrate a virtual machine from on-premises to the cloud via a VPN or internet connection without additional downtime and without users having to save their work and log off their machines. With GCVE, your teams can keep working during business hours while your systems administrators migrate them to the cloud, without the downtime usually associated with virtual machine migration.

The ability to migrate a virtual machine from on-premises to the cloud raises another question: how fast can a targeted virtual machine be migrated to the cloud? Google analyzed this specific scenario, assessing the requirements for migrating an on-premises virtual machine to the cloud via a Virtual Private Network (VPN), and then measuring how fast that connection was established and transmitted through HCX. The answer to that question, and more, is contained in our new white paper, "Google Cloud VMware Engine Performance Migration & Benchmarks," which you can download now. And if you're ready to get started with your migration efforts, sign up for a free discovery and assessment with our migration experts.
Source: Google Cloud Platform
Organizations need a cloud platform that can securely scale from on-premises, to edge, to cloud while remaining open to change, choice, and customization. They must be able to run their applications wherever they need, on infrastructure optimized to process very high volumes of data with minimal delay, all while maintaining the satisfaction and stability of their ML-driven user experiences. At Google, we deeply understand these customer requirements, which is why we launched Google Distributed Cloud last year. With Google Distributed Cloud, we bring a portfolio of fully managed solutions that extend our infrastructure to the edge and into customers' own data centers.

Today, Google Cloud customers love our artificial intelligence (AI) services for building, deploying, and scaling more effective AI models, with developers successfully using our core machine learning (ML) services to build and train high-quality custom ML models. Our customers also use a variety of our managed database solutions because of their simplicity and reliability. However, some customers have highly sensitive workloads and want to use their own private, dedicated facilities. For these customers, we're excited to announce that they will be able to run a selection of these same AI, ML, and database services in Google Distributed Cloud Hosted inside their own data centers within the next year. With this announcement, customers can take advantage of Anthos, a common management control plane that provides a consistent development and operations experience across hybrid environments. This same experience is now available for on-premises environments.

Our portfolio of AI, ML, and database products enables customers to quickly deploy services with out-of-the-box simplicity, delivering valuable insights from both unstructured and structured data. The integration of our Google Cloud AI and database solutions into the Google Distributed Cloud portfolio means the ability to harness real-time data insights like never before, thanks to the proximity to where the data is being generated and consumed. This includes ensuring low latency to support applications that are mission critical to businesses, such as computer vision, which can be used on the factory floor to detect flaws in products or to index large amounts of video. The addition of these transformative capabilities allows customers to save money, innovate faster, and gain greater flexibility and choice.

With this integration, customers using Google Distributed Cloud Hosted will have access to some exciting AI features. One example is our Translation API, which can instantly translate text in more than one hundred languages. Translation API is a feature available in Vertex AI, our generally available managed ML platform that allows companies to accelerate the deployment and maintenance of AI models. With this announcement, customers who need to run highly sensitive workloads in an on-premises or edge environment can leverage the unique functionality of Translation API along with other Google Cloud pre-trained APIs in Vertex AI, such as Speech-to-Text and optical character recognition (OCR). These features were all trained on our planet-scale infrastructure to deliver the highest level of performance, and as always, all of our new AI products adhere to our AI Principles.
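As a concrete illustration of the pre-trained Translation API mentioned above, here is a minimal Python sketch using the public Cloud Translation client library; the credentials setup and sample text are assumptions for illustration, and availability details under Google Distributed Cloud Hosted are not covered by this sketch.

```python
# Minimal sketch: translating a string with the Cloud Translation API (v2 client).
# Assumes the google-cloud-translate package is installed and that application
# default credentials are configured; the sample text is just an illustration.
from google.cloud import translate_v2 as translate

def translate_text(text: str, target_language: str = "de") -> str:
    """Translate `text` into `target_language` and return the translated string."""
    client = translate.Client()
    result = client.translate(text, target_language=target_language)
    return result["translatedText"]

if __name__ == "__main__":
    print(translate_text("Distributed cloud brings services closer to the data."))
```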
Additionally, by incorporating our managed database offerings into the Google Distributed Cloud portfolio, customers can process data locally to migrate or modernize applications, freeing up more time for innovation and for creating value in their applications. This is especially true in industries like financial services and healthcare, where there are compliance requirements on where data can reside. With these new AI, ML, and database products available in our Google Distributed Cloud portfolio, customers retain full autonomy and control over their own data centers, yet can rely on Google for the latest technology innovations in cloud services. For more information, please visit Google Distributed Cloud, and to learn more about Vertex AI specifically, join us at our Applied ML Summit.
Source: Google Cloud Platform
May was another big month for us, even as we get ready for more industry work and engagement at the RSA Security Conference in San Francisco. At our Security Summit and throughout the past month, we continued to launch new security products and features, and increased service and support for all our Google Cloud and Google Workspace customers.

Google Cloud's Security Summit 2022

Our second annual Security Summit, held on May 17, was a great success. In the days leading up to the Summit, we discussed how we are working to bring Zero Trust policies to government agencies, and we revealed our partnership with AMD to further advance Confidential Computing, including an in-depth review focused on the implementation of the AMD secure processor in the third-generation AMD EPYC processor family.

We also introduced the latest advancements in our portfolio of security solutions. These include our new Assured Open Source Software service (Assured OSS), which enables enterprise and public sector users of open source software to incorporate the same OSS packages that Google uses into their own developer workflows; extending Autonomic Security Operations (ASO) to the U.S. public sector, a solution framework to modernize cybersecurity analytics and threat management that is aligned with the Zero Trust and supply-chain security objectives of 2021's cybersecurity Executive Order and the Office of Management and Budget memorandum; expanding our compliance with government software standards; and SAML support for Workload Identity Federation, so that customers can use a SAML-based identity provider to reduce their use of long-lived service account keys.

Advancing open source software security

We continued to partner with the Open Source Security Foundation (OpenSSF), the Linux Foundation, and other organizations at another industry open source security summit to further develop the initiatives discussed during January's White House Summit on Open Source Security. We're working toward the goal of making sure that every open source developer has effortless access to end-to-end security by default. As covered in our Security Summit, an important part of this effort is Assured OSS, which leverages Google's extensive security experience and can help organizations reduce their need to develop, maintain, and operate complex processes to secure their open source dependencies. Assured OSS is expected to enter Preview in Q3 2022.

Also, as part of our commitment to improving software supply-chain security, the Open Source Insights project helps developers better understand the structure and security of the software they use. We introduced Open Source Insights data in BigQuery in May, so anyone can use Google Cloud BigQuery to explore and analyze the dependencies, advisories, ownership, licenses, and other metadata of open-source packages across supported ecosystems, and see how this metadata has changed over time.

Why Confidential Computing and our partnership with AMD matters

I'd like to take a moment to share a bit more on the importance of Confidential Computing and our partnership with AMD. I've been talking a lot this year about why we as an industry need to evolve our understanding of shared responsibility into shared fate. The former assigns responsibilities to either the cloud provider or the cloud provider's customer, but shared fate is a more resilient cybersecurity mindset.
It's a closer partnership between cloud provider and customer that emphasizes secure-by-default configurations, secure blueprints and policy hierarchies, consistently available advanced security features, high-assurance attestation of controls, and insurance partnerships.

In our collaboration with AMD, we focused on how secure isolation has always been critical to our cloud infrastructure, and how Confidential Computing cryptographically reinforces that secure isolation. AMD's firmware and product security teams, Google Project Zero, and the Google Cloud Security team collaborated for several months to analyze the technologies and firmware that AMD contributes to Google Cloud's Confidential Computing services. Also in May, we expanded the availability of Confidential Computing to include N2D and C2D Virtual Machines, which run on third-generation AMD EPYC™ processors.

GCAT Highlights

Here are the latest updates, products, services, and resources from our cloud security teams this month:

Security

- PSP protocol now open source: In order to better scale the security we offer our customers, we created a new cryptographic offload protocol for internal use that we open sourced in May. Intentionally designed to meet the requirements of large-scale data-center traffic, the PSP protocol is a TLS-like protocol that is transport-independent, enables per-connection security, and is offload-friendly.
- Updating Siemplify SOAR: The future of security teams is heading toward "anywhere operations," and the latest version of Siemplify SOAR can help get us there. It gives organizations the building blocks needed across cloud infrastructure, automation, collaboration, and analytics to accelerate processes for more timely responses and automated workflows. In turn, this can free up teams to focus on more strategic work.
- Guardrails and governance for Terraform: The popular open-source Infrastructure-as-Code tool Terraform can increase agility and reduce errors by automating the deployment of infrastructure and services that are used together to deliver applications. Our new tool verifies Terraform plans and can help reduce misconfigurations of Google Cloud resources that violate your organization's policies.
- Benchmarking Container-Optimized OS: As part of our security-first approach to safeguarding customer data while also making it more scalable, we want to make sure that our Container-Optimized OS is in line with industry-standard best practices. To this end, the Google Cloud Security team has released a new CIS benchmark that clarifies and codifies the security measures we have been using, and makes recommendations for hardening.
- New reCAPTCHA Enterprise guidebook: Identifying when a fraudster is on the other end of the computer is a complex endeavor. Our new reCAPTCHA Enterprise guidebook helps organizations identify a broad range of online fraud and strengthen their website security.
- Take the State of DevOps 2022 survey: The State of DevOps report by Google Cloud and the DORA research team is the largest and longest-running research of its kind, with input from more than 32,000 professionals worldwide. This year will focus on how security practices and capabilities predict overall software delivery and operations performance, so be sure to share your thoughts with us.

Industry updates

- Security improvements to Google Workspace: I wrote at the beginning of the year that data sovereignty is one of the major driving megatrends shaping our industry today.
At the beginning of May, we announced Sovereign Controls for Google Workspace, which can provide digital sovereignty capabilities for organizations in both the public and private sector to control, limit, and monitor transfers of data to and from the EU, starting at the end of 2022, with additional capabilities delivered throughout 2023. This commitment builds on our existing Client-side encryption, Data regions, and Access Controls capabilities. We are also extending Chrome's Security Insights to Google Cloud and Google Workspace products, as part of our efforts to consistently provide advanced features to our customers.
- Can you hear the security now? Pindrop is joining forces with Google Cloud. If you've never heard of Pindrop, you've almost certainly encountered their technology, which is used to authenticate payments, place restaurant and shopping orders, and check financial accounts over the phone. Their technology also provides the backbone for anti-fraud efforts in voice-based controls. With Google Cloud, Pindrop will be better able to detect deepfakes and robocalls, help banks authenticate transactions, and provide retailers with secure AI-powered call center support.

Compliance & Controls

- Expanding public sector and government compliance: Google Cloud is committed to providing government agencies with the security capabilities they need to achieve their missions. In addition to our aforementioned Autonomic Security Operations and new Assured Open Source Software (OSS) service, we're expanding Assured Workloads, which can help regulated workloads run securely at scale on Google Cloud's infrastructure. We are also pleased to announce that 14 new Google Cloud services support FedRAMP Moderate and three services are being added to support FedRAMP High, with more coming this summer. (You can read the full list of those services at the end of this blog.)

Next month we'll recap highlights from the RSA Conference and much more. To have our Cloud CISO Perspectives post delivered every month to your inbox, sign up for our newsletter. We'll be back next month with more security-related updates.
Source: Google Cloud Platform
We know we are in the middle of a transformative time. At the heart of this transformation is the digitization of data. Data is the most strategic asset for organizations across all industries, and a new level of data agility is required to deal with dynamic changes in our world. Organizations that have embraced their data as a strategic asset are at a competitive advantage. In fact, a recent Gartner® Predicts report estimates that, through 2026, 90 percent of data management tools and platforms that fail to support multicloud and hybrid capabilities will be set for decommissioning.1
But for all the powerful capabilities the cloud offers to help organizations on their transformation journey, not all data is created equal. The myriad of regulations to navigate, the need for data sovereignty, and the low tolerance for any form of disruption keep data from living in a single public cloud, expanding your data estate and its complexity. Managing the vast amounts of data that exist across siloed, disparate systems, applications, and locations, while also getting the most from existing investments, is not a balance many solutions can achieve.
This is why, back in November 2019, we debuted Azure Arc, a set of technologies that extends Azure innovations and cloud benefits to any infrastructure. Azure Arc has enabled cloud-native databases like Azure SQL and PostgreSQL, delivering much-needed consistency and cloud automation for all data workloads. We want you to focus less on managing data and more on creating value and unlocking insights, because true innovation starts at the data layer.
Innovate anywhere with Azure Arc
With Azure Arc-enabled data services, we can bring cloud data management to any infrastructure, across customer on-premises data centers, other third-party clouds, and the edge. Here’s how Azure Arc is delivering on that promise:
Always up to date with full automation. Benefit from an evergreen SQL engine with the latest features, using automated updates so there is no more end-of-support. These in-place, rolling updates ensure close to zero downtime, so you can maximize efficiency and minimize disruption.
Get industry-leading, multi-layered security with built-in capabilities. With comprehensive encryption, including Transparent Data Encryption and Always Encrypted, as well as Azure Role-Based Access Control and Azure Policy, your data is protected both at the powerful database engine level and by Azure security capabilities from the cloud.
With elastic scale, you can scale up or down based on your resource needs without application downtime to optimize performance. You can also realize cost efficiencies by paying only for what you use without the need to overprovision.
Deliver a vastly simplified DevOps experience through full automation, with built-in capabilities like rapid deployment, high availability, and disaster recovery. You can deploy a three-replica SQL Managed Instance (MI) with full high availability (HA) in two minutes with a single command (see the sketch after this list), and gain a unified view into your query performance, storage capacity, and error logs using dashboards from the built-in monitoring. Use the tools you are already familiar with.
All these capabilities run in any environment, regardless of connectivity to Azure. You can run Arc-enabled data services without a connection to Azure: if you are fully connected, the user experience is richer and real-time, but a connection is not required, and data services keep running even without one. Azure portal deployment and other value-added management services are fully integrated in the directly connected mode.
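As a rough illustration of the single-command deployment mentioned in the list above, here is a minimal Python sketch that drives the Azure CLI; the instance name, Kubernetes namespace, and exact flag set are assumptions based on the az arcdata extension and may differ between CLI versions, so treat this as a sketch rather than a definitive recipe.

```python
# Minimal sketch: deploying a Business Critical, three-replica Arc-enabled SQL
# Managed Instance by invoking the Azure CLI from Python. Instance name and
# Kubernetes namespace are placeholders; flag names follow the az "arcdata"
# extension as of this writing and should be verified against current docs.
import subprocess

def create_arc_sql_mi(name: str, namespace: str) -> None:
    """Run `az sql mi-arc create` for a three-replica Business Critical instance."""
    cmd = [
        "az", "sql", "mi-arc", "create",
        "--name", name,
        "--k8s-namespace", namespace,
        "--use-k8s",                   # talk to the Kubernetes API directly (indirect mode)
        "--tier", "BusinessCritical",  # the tier announced as generally available in this post
        "--replicas", "3",             # three replicas for built-in high availability
    ]
    subprocess.run(cmd, check=True)

if __name__ == "__main__":
    create_arc_sql_mi("sqlmi-bc-demo", "arc-data")
```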
Built for mission-critical
And now, we’re making that solution available in an even more powerful way. We are excited to announce the general availability of Azure Arc-enabled data services Business Critical (BC) tier, designed to support the most demanding mission-critical data workloads.
With feature parity with SQL Server Enterprise Edition, it delivers all the proven capabilities customers have trusted for decades. It runs online transaction processing (OLTP) and hybrid transaction/analytics processing (HTAP) with record-setting performance, advanced high availability, and top-rated security.
It meets the most demanding business continuity requirements using Always On Availability Groups, so your app will have close to zero downtime in case of an automated failover.
You can fail over to another instance within the same Kubernetes cluster for local high availability, or to a different cluster in a different datacenter or even a public cloud, delivering "cloud-level redundancy".
We also provide a free passive instance to run as your disaster recovery site, for even greater value.
Choose the configuration that best suits your workloads, with no set limits on CPU and memory configuration. To further maximize performance, we provide one free read-scale replica to offload any read-heavy workloads.
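To show what offloading a report to that free read-scale replica might look like from application code, here is a minimal Python sketch using a read-only connection; the driver, endpoint, database, credentials, and table are placeholders, and how the readable secondary is exposed depends on your particular deployment.

```python
# Minimal sketch: offloading a read-heavy query to a readable secondary replica.
# Endpoint, database, credentials, and table are placeholders; adjust them to
# however your Arc-enabled SQL Managed Instance exposes its secondary endpoint.
import pyodbc

conn_str = (
    "Driver={ODBC Driver 17 for SQL Server};"
    "Server=tcp:sqlmi-bc-secondary.example.internal,1433;"  # placeholder secondary endpoint
    "Database=sales;"                                        # placeholder database
    "UID=reporting_user;PWD=<password>;"                     # placeholder credentials
    "ApplicationIntent=ReadOnly;"                            # request a read-only session
    "Encrypt=yes;TrustServerCertificate=yes;"
)

conn = pyodbc.connect(conn_str)
try:
    cursor = conn.cursor()
    # A reporting-style aggregate that is safe to serve from the read replica.
    cursor.execute("SELECT COUNT(*) FROM dbo.orders;")
    print("order count:", cursor.fetchone()[0])
finally:
    conn.close()
```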
Our partner, Dell Technologies, conducted a series of OLTP benchmarks, using Intel technology, which took a closer look at the kind of performance possible with this new service tier. The results were remarkable. Arc-enabled SQL Managed Instance was proven to provide the same performance as SQL Server on Windows Server, so customers can run their workloads with confidence. The speed of provisioning and deprovisioning will massively speed up your continuous integration (CI) test runs. SQL instances can now be deployed very easily via automation and be available in 60 seconds, and Business Critical cuts multi-replica HA deployment time from hours or days to minutes with a single command! Those out-of-the-box experiences allow you to realize time efficiencies and redirect resources to where they matter most.
Broad partner ecosystem
Our ambition to help you digitally transform your business with the cloud and edge is boundless, but we know we can't do it alone. No single cloud provider can deliver all the infrastructure and as-a-service solutions you'll need. That's why we're building an ecosystem of partners across service providers, platform providers including OS and container platforms, and independent software vendors (ISVs) to help you envision, plan, and deploy the full stack of hybrid and multicloud solutions. Our history in both productivity and the datacenter is unique among cloud providers. Microsoft is at our best when our platforms fuel the growth of others, and I'm thrilled to see how energized the ecosystem is to evolve with us.
Get started
Review Azure Arc documentation on data services and trial the new Business Critical tier in the Azure portal.
Sign up for the Azure Hybrid, Multicloud, and Edge Day digital event to learn about the latest innovation from Azure Arc.
1 Gartner, Predicts 2022: Data Management Solutions Embrace Automation and Unification, 2021.
GARTNER is a registered trademark and service mark of Gartner, Inc. and/or its affiliates in the U.S. and internationally and is used herein with permission. All rights reserved.
Source: Azure
The global events of the last couple of years have introduced significant changes to how companies operate and the way we work, accelerating digital transformation for many as they seek the additional flexibility, scale, and cost savings of the cloud. More companies are choosing Azure SQL for their SQL Server workloads and it’s easy to see why. Azure SQL provides a full range of deployment options ranging from edge to cloud and a consistent unified experience that makes the most of your on-premises skills and experience. It’s very cost-effective, too, when you use the Azure Hybrid Benefit to maximize your on-premises licensing investments.
Customers moving SQL Server workloads to the cloud have choices. Whether simply migrating to virtual machines (VMs) to offload infrastructure costs or modernizing on fully-managed database services that do more on your behalf, every choice on Azure is a great one.
SQL Managed Instance leads in price-performance for mission-critical workloads
More companies are choosing managed services because of the benefits they provide, like lower total cost of ownership, productivity gains, and even accelerated time to market.
According to a study by Enterprise Strategy Group, customers who migrated their SQL Server data from on-premises to Azure Virtual Machines reduced their costs by up to 47 percent. Customers who continued to modernize their data from virtual machines to managed database services realized an additional 17 percent cost savings and an expected incremental $30M in new revenue due to faster release cycles. When it comes to modernizing your SQL Server data at scale, Azure SQL Managed Instance is your best choice.
It’s also a price-performance leader. Principled Technologies, an independent research firm, recently published a study where they benchmarked SQL Managed Instance and SQL Server on Amazon Web Services (AWS) RDS across three different workloads. SQL Managed Instance emerged as the leader across each of these workloads, with up to five times faster performance while costing up to 93 percent less than AWS RDS.1
SQL Managed Instance combines the broadest SQL Server engine compatibility back to SQL Server 2008, with all the benefits of a fully managed and always up-to-date platform-as-a-service (PaaS). You can use it to quickly and confidently modernize your custom and vendor-provided apps to Azure and further unlock the benefits of Azure’s integrated service platform. When a hybrid approach is required, you can run Azure Arc-enabled SQL Managed Instance on the infrastructure of your choice.
“We wanted an easy transition from on-premises SQL Server, and Azure SQL Managed Instance looks just like SQL Server—with all the operational benefits of a platform as a service.”—Hardayal Singh, Senior Principal Enterprise Architect, City National Rochdale
SQL Server on Azure Virtual Machines leads SQL Server on AWS EC2
SQL Server on Azure Virtual Machines saves money and simplifies management of security and high availability at no additional cost. It is flexible at scale, meets peaks in demand, accelerates innovation, and leads SQL Server on Amazon Web Services (AWS) Elastic Compute Cloud (EC2) in overall speed and price-performance.
A recent study from GigaOm tested the performance of Microsoft SQL Server on Azure Virtual Machines against SQL Server on AWS EC2 instances across transactional and analytical workloads. Microsoft SQL Server on Azure came out on top, meeting customers' mission-critical requirements with up to 42 percent faster transactional performance while costing up to 31 percent less than AWS EC2.2
With SQL Server on Azure Virtual Machines, you can shift your SQL workloads with ease and maintain 100 percent SQL Server compatibility and operating-system-level access. In addition, when you register your self-installed VMs with the SQL Server IaaS Agent extension, you can save money and simplify management of security, high availability, and storage administration. With the SQL Server IaaS Agent extension, you get built-in security and management benefits, including automated backups, for free, and you pay only for what you use by converting licenses between pay-as-you-go, Azure Hybrid Benefit, and HA/DR license types.
H&R Block made plans to stay ahead of the changes, providing seamless multichannel experiences and unifying its disparate data sources to better serve its customers. By moving its various workloads to Microsoft SQL Server, the tax provider has been able to enhance service delivery, scale to meet its peaks in demand, and accelerate innovation. The company was able to move its customer-facing systems—including a DIY online tax-filing app and an appointment application used by 10,000 offices—to SQL Server on Azure Virtual Machines with minimal issues before, during, and after migration.
“Investing in Azure data services and data platforms has really set up an amazing foundation for us to continue to deliver and accelerate new products and services to our clients.”—Aditya Thadani, Vice President of Architecture and Information Management, H&R Block
Get started today
Read the reports from Principled Technologies and GigaOm.
Ready for that next step? Use Azure Migrate to assess your data’s readiness for the cloud and use the provided tools to start your journey today.
1 Price-performance claims based on data from a study commissioned by Microsoft and conducted by Principled Technologies in April 2022. The study compared performance and price performance between a 16 vCore, 64 vCore and 80 vCore Azure SQL Managed Instance using premium-series hardware on the business-critical service tier and the db.m6i.32xlarge, db.r5b.4xlarge and db.r5b.16xlarge offerings for Amazon Web Services Relational Database Service (AWS RDS) on SQL Server. Benchmark data is taken from a Principled Technologies report using recognized standards, HammerDB TPROC-C, HammerDB TPROC-H and Microsoft MSOLTPE, a workload derived from TPC-E. The MSOLTPE is derived from the TPC-E benchmark and as such is not comparable to published TPC-E results, as MSOLTPE results do not comply with the TPC-E Specification. The results are based on a mixture of read-only and update intensive transactions that simulate activities found in complex OLTP and analytics application environments. Price-performance is calculated by Principled Technologies as the cost of running the cloud platform continuously divided by transactions per minute or per second throughput, based upon the standard. Prices are based on publicly available US pricing in South Central US for Azure SQL Managed Instance and US East for AWS RDS as of April 2022 and incorporates Azure Hybrid Benefit for SQL Server, excluding Software Assurance and support costs. Performance and price-performance results are based upon the configurations detailed in the Principled Technologies report. Actual results and prices may vary based on configuration and region.
2 Price-performance claims based on data from a study commissioned by Microsoft and conducted by GigaOm in April 2022. The study compared price performance between SQL Server 2019 Enterprise Edition on Windows Server 2022 in the Azure E32bds_v5 instance type with P30 Premium SSD disks and SQL Server 2019 Enterprise Edition on Windows Server 2022 in the Amazon Web Services Elastic Compute Cloud instance type r5b.8xlarge with General Purpose (gp3) volumes. Benchmark data is taken from a GigaOm Transactional Field Test derived from a recognized industry standard, TPC Benchmark™ E (TPC-E). The Field Test does not implement the full TPC-E and as such is not comparable to any published TPC-E benchmarks. Prices are based on publicly available US pricing in North Central US for SQL Server on Azure Virtual Machines and Oregon for AWS EC2 as of April 2022. The pricing incorporates three-year reservations for Azure and AWS compute pricing, and Azure Hybrid Benefit for SQL Server and Windows Server, and License Mobility for SQL Server in AWS, excluding Software Assurance and support costs. Actual results and prices may vary based on configuration and region.
Source: Azure
What else happened on June 2, 2022, besides the big stories, in brief. (Short news, Google)
Source: Golem