Pub/Sub stores messages reliably at any scale. Publish a log entry or an event and you don't have to worry about when it is processed. If your subscribers, event handlers, or consumers are not keeping up, we'll hold the messages until they are ready. Did bugs make it past your integration tests? Not to worry: just seek back and replay. But you still had to worry a little: until today, you had at most a week to fix the code and process everything. And if you wanted to use historical data to compute some aggregate state, such as a search index, you had to use a separate storage system. In fact, we noticed that many of our users stored raw copies of all messages in GCS or BigQuery, just in case. This is reliable and inexpensive, but it requires a separate reprocessing setup whenever you actually need to look at older data.

Starting today, Pub/Sub can store your messages in a topic for up to 31 days. This gives you more time to debug subscribers. It also gives you a longer horizon of events for backtesting streaming applications or for initializing state in applications that compute state from an event log.

Using the feature is simple, and the interfaces and pricing are unchanged. You just set a larger value for a topic's message retention duration. For example, you can configure extended retention on an existing topic using the gcloud CLI:

    gcloud pubsub topics update myTopic --message-retention-duration=31d

Or use the settings on the Topic Details page in the Cloud Console.

One limitation of this feature is that you cannot extend storage retention for an individual subscription beyond 7 days, which limits the control individual subscription owners have over storage. The limit comes with benefits: it is simpler both to control storage costs and to restrict access to older data across multiple applications.

We'd love to hear how you've used this feature or how it fell short of your needs. Let us know by posting a message to the pubsub-discuss mailing list or creating a bug.
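To make the retention limits above concrete, here is a small hypothetical helper (not part of any Pub/Sub client library) that parses a gcloud-style duration string such as "31d" and checks it against the two caps described in this post: up to 31 days of retention on a topic, and at most 7 days on an individual subscription.

```python
# Hypothetical validation helper, a minimal sketch only; the real limits are
# enforced server-side by Pub/Sub, not by client code like this.

UNIT_SECONDS = {"s": 1, "m": 60, "h": 3600, "d": 86400}

TOPIC_MAX_SECONDS = 31 * 86400        # topics can now retain messages up to 31 days
SUBSCRIPTION_MAX_SECONDS = 7 * 86400  # individual subscriptions remain capped at 7 days

def parse_duration(spec: str) -> int:
    """Convert a duration string like '31d' or '12h' to seconds."""
    value, unit = spec[:-1], spec[-1]
    if unit not in UNIT_SECONDS or not value.isdigit():
        raise ValueError(f"unrecognized duration: {spec!r}")
    return int(value) * UNIT_SECONDS[unit]

def valid_topic_retention(spec: str) -> bool:
    return parse_duration(spec) <= TOPIC_MAX_SECONDS

def valid_subscription_retention(spec: str) -> bool:
    return parse_duration(spec) <= SUBSCRIPTION_MAX_SECONDS

print(valid_topic_retention("31d"))         # True: within the new topic limit
print(valid_subscription_retention("31d"))  # False: subscriptions cap at 7 days
```

The asymmetry the code encodes is the point of the post's limitation paragraph: extended retention is a property of the topic, set by the topic owner, rather than something each subscription can opt into independently.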
Source: Google Cloud Platform