ARC Region switch adds Lambda event source mapping execution block for event handling during failover

Amazon Application Recovery Controller (ARC) Region switch helps customers orchestrate the failover of their multi-Region applications to achieve a bounded recovery time in the event of a Regional impairment. Today, we are announcing the Lambda event source mapping execution block, which automates the coordinated failover of event streams for multi-Region workloads. Customers running event-driven architectures use Lambda functions with event source mappings to process event streams from Kinesis, DynamoDB Streams, MSK, or SQS. For active-passive workloads, customers may maintain Lambda functions in each Region but process events in only one Region at a time. These event source mappings must be toggled during failover to avoid duplicate processing, a manual and error-prone step. The Lambda event source mapping execution block automates this by enabling or disabling event source mappings in either the activating or deactivating Region. To control duplicate processing, customers can configure two Lambda event source mapping execution blocks in sequence: a disable block to stop event processing in the deactivating Region, and an enable block to start it in the activating Region. The disable block can be overridden by running the plan in "ungraceful" mode for unplanned failovers where the deactivating Region may be impaired. Native cross-account support enables a single plan to handle event stream failover across multiple accounts. To get started, see the Lambda event source mapping execution block documentation. ARC Region switch is available in all commercial Regions. See ARC Region switch availability for details.
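The sequencing described above can be sketched in plain terms with the AWS SDK. The sketch below is illustrative only: the Region names and function name are placeholders, and in practice the execution block performs this toggling for you as part of the plan. The two Lambda API calls used (ListEventSourceMappings and UpdateEventSourceMapping) are the standard operations for listing and enabling/disabling mappings.

```python
def failover_steps(deactivating, activating, ungraceful=False):
    """Order of event source mapping toggles for an active-passive failover.

    In "ungraceful" mode the disable step in the (possibly impaired)
    deactivating Region is skipped, accepting the risk of duplicate
    processing in exchange for not depending on the impaired Region.
    Returns a list of (region, enabled) steps.
    """
    steps = [] if ungraceful else [(deactivating, False)]
    steps.append((activating, True))
    return steps


def apply_step(region, function_name, enabled):
    """Toggle every event source mapping attached to a function in one Region."""
    import boto3  # imported here so the sequencing logic above stays dependency-free
    lam = boto3.client("lambda", region_name=region)
    pages = lam.get_paginator("list_event_source_mappings")
    for page in pages.paginate(FunctionName=function_name):
        for mapping in page["EventSourceMappings"]:
            lam.update_event_source_mapping(UUID=mapping["UUID"], Enabled=enabled)
```

A graceful plan yields a disable step in the old Region followed by an enable step in the new one; an ungraceful plan yields only the enable step.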
Source: aws.amazon.com

Amazon Aurora DSQL now supports change data capture (Preview)

Amazon Aurora DSQL introduces support for change data capture (CDC) in preview, enabling you to stream real-time database changes directly to Amazon Kinesis Data Streams. This fully managed capability removes the need to build or maintain custom streaming pipelines, making it easier to build event-driven applications, power real-time analytics pipelines, and synchronize data across systems. Aurora DSQL automatically captures the result of insert, update, and delete operations as change events. You can use these events to synchronize data across microservices, trigger downstream processing with AWS Lambda, or deliver to Amazon S3, Amazon Redshift, and Amazon OpenSearch Service through Amazon Data Firehose for analytics. CDC streaming requires no infrastructure setup and is designed to have zero impact on your database workload, so you can stream changes without affecting database throughput or latency. CDC streaming in preview is available in all AWS Regions where Aurora DSQL is available. Streams are billed using Distributed Processing Units (DPUs) based on the volume of data captured, with standard Amazon Kinesis Data Streams pricing applying separately. To learn more, read the blog and see the getting started guide.
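Since the change events land in a regular Kinesis data stream, they can be read with the standard Kinesis consumer APIs. A minimal sketch, with two caveats: the change-event payload shape used here (operation, table, keys) is an assumption for illustration, so consult the Aurora DSQL CDC documentation for the actual schema, and the stream and shard names are placeholders.

```python
import json


def summarize_change(data: bytes) -> str:
    """Summarize one CDC record.

    The payload shape ({"operation": ..., "table": ..., "keys": ...}) is
    illustrative only; the Aurora DSQL CDC documentation defines the real
    event schema.
    """
    event = json.loads(data)
    return f'{event["operation"]} on {event["table"]} ({event["keys"]})'


def read_changes(stream_name, shard_id="shardId-000000000000"):
    """Single-shard read loop using the standard Kinesis consumer APIs."""
    import boto3  # imported here so summarize_change stays dependency-free
    kinesis = boto3.client("kinesis")
    iterator = kinesis.get_shard_iterator(
        StreamName=stream_name,
        ShardId=shard_id,
        ShardIteratorType="TRIM_HORIZON",
    )["ShardIterator"]
    response = kinesis.get_records(ShardIterator=iterator, Limit=100)
    return [summarize_change(record["Data"]) for record in response["Records"]]
```

For production consumers you would typically use enhanced fan-out or the Kinesis Client Library rather than a raw shard loop, but the data flow is the same: Aurora DSQL writes change events, your consumer reads and reacts to them.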
Source: aws.amazon.com

AWS Transform agents now available in Kiro, Claude, Cursor, and Codex

Today, AWS announces that the AWS Transform agents, built on decades of AWS migration and modernization experience, are now accessible through a Kiro power, agent plugins, and the AWS Transform MCP server. Developers can now consume all of AWS Transform's capabilities directly from their preferred development environment, whether working interactively in an agentic IDE, managing jobs through the web console, or integrating programmatically via MCP.
This launch gives builders flexibility to choose the surface that fits their workflow while gaining the depth of transformation expertise behind the AWS Transform agents for Windows, VMware, mainframe, and more. A developer can start a transformation in their agentic IDE, monitor progress and collaborate in the web console, then see results back in their IDE, all against the same underlying job with consistent state. Additionally, AWS Transform now supports IAM role authentication. Customers who start using AWS Transform in their IDE or the web app can use their existing AWS credentials to create a Transform environment, workspace, and transformation job.
The agent plugin and MCP server are available on GitHub, and the Kiro power in the Kiro marketplace. To learn more, see https://aws.amazon.com/transform.
Source: aws.amazon.com

AWS Transform introduces the agent builder toolkit Kiro power for building customized transformation agents

Today, as part of the AWS Transform composability initiative, AWS announces the general availability of the agent builder toolkit Kiro power for AWS Transform. With the agent builder toolkit, AWS Partners and customers can build agents tailored to their specific modernization needs and ensure they work seamlessly within AWS Transform.
This capability enables Migration and Modernization Competency Partners, ISVs, or customers to create differentiated transformation solutions by integrating their specialized agents, tools, knowledge bases, and workflows with AWS Transform's agentic AI capabilities. The agent builder toolkit provides the end-to-end lifecycle for transformation agents: build agents using the Kiro power, share them with teams or across partner networks, and register them with AWS Transform for discovery. The agent builder toolkit for AWS Transform is available in the Kiro marketplace. To learn more, see AWS Transform (https://aws.amazon.com/transform).
Source: aws.amazon.com

SageMaker AI now supports serverless model customization for Qwen3.6

Amazon SageMaker AI now supports serverless model customization for the Qwen3.6 27B-parameter model using supervised fine-tuning (SFT) and reinforcement fine-tuning (RFT). Qwen3.6 is a popular open-weight model family from Alibaba Cloud. This launch adds to our existing support for fine-tuning Qwen3.5 and other popular models. Before this launch, you could deploy the Qwen3.6 base model on SageMaker AI; now you can also adapt it to your specific domains and workflows. Model customization enables you to tailor foundation models with your proprietary data so they more accurately reflect your domain knowledge, terminology, and quality standards. Rather than building models from scratch, fine-tuning lets you start from a capable base model and specialize it for your use cases, whether that's improving accuracy on domain-specific tasks, aligning outputs with your organization's tone, or improving performance on new tasks using your labeled data. With serverless customization, SageMaker AI handles all infrastructure provisioning and training orchestration, so you can focus on your data and evaluation rather than cluster management, and only pay for what you use. Serverless model customization for Qwen3.6 on SageMaker AI is available in US East (N. Virginia), US West (Oregon), Asia Pacific (Tokyo), and EU (Ireland). To get started, navigate to the Models page in Amazon SageMaker Studio to launch a customization job, or use the SageMaker Python SDK for programmatic access. To learn more, see the Amazon SageMaker AI model customization documentation.
Source: aws.amazon.com