#mydockerbday Recap + Community Stories

Emma Cresta, 13

Although March has come and gone, you can still take part in the awesome activities put together by the community to celebrate Docker’s 7th birthday. 

Birthday Challenge

Denise Rey and Captains Łukasz Lach, Marcos Nils, Elton Stoneman, Nicholas Dille, and Brandon Mitchell put together an amazing birthday challenge for the community to complete, and it is still available. If you haven’t checked out the hands-on learning content yet, go to the birthday page and earn your seven badges (and don’t forget to share them on Twitter).

Live Show

Captain Bret Fisher hosted a 3-hour live Birthday Show with the Docker team and Captains. You can check out the whole thing on Docker’s YouTube channel, or skip ahead using the timestamps below:

– 02:00 Pre-show pics and games

– 07:43 Kickoff with Captains

– 29:00 Docker Roadmap

– 1:15:47 Docker Desktop: What’s New

– 1:53:45 Docker Hub with GitHub Actions

– 2:20:15 Using Docker with Kubernetes

– 2:55:00 #myDockerBday Stories

Community Stories

While many Community Leaders had to cancel in-person meetups due to the evolving COVID-19 situation, they and their communities still showed up and shared their #mydockerbday stories. There were too many amazing stories to include in one blog post, so I’ve shared just a few of my favorites here:

Joining a microservice-based architecture team was going to be a steep learning curve, and it would have been if it weren’t for Docker. Learning and using Docker was a very pleasant experience and has improved my day-to-day developer experience because it makes everything easy, especially on projects that span multiple services. I am truly grateful for this product.

Gerade Geldenhuys, Engineer

I first stumbled upon Docker at a conference and I have been a big fan ever since. The concept of a container and the ease of using one were great. I would actively attend meetups on Docker and also hosted a Docker meetup along with co-workers. I have also made a few OSS contributions to Docker and had fun learning Golang in the process. Docker fascinates me today as much as it did 7 years ago. #myDockerBday

Deepak Bhaskaran, Engineer

Well, I’m from Porto Alegre, but today I live in Ireland. I learned everything about Docker from the Porto Alegre community, led by Cristiano, and I also made great friends there. Today I work a lot with Docker (I’m a freelancer), and I also help other women and Black people enter the infrastructure and development field using Docker. And last but not least, on Docker’s birthday last year I met a wonderful person who today is my husband (my husband is an excellent person, but he loves to break production). Thank you for bringing me great friends and the love of my life.

Natalia Raythz, Developer

I have used Docker every day since 2014, starting with Docker v0.9. I containerized all of my applications, speeding up my CI/CD with Docker.

Jintao Zhang, Engineer
The post #mydockerbday Recap + Community Stories appeared first on Docker Blog.
Source: https://blog.docker.com/feed/

Deploy Stateful Docker Containers with Amazon ECS and Amazon EFS

At Docker, we are always looking for ways to make developers’ lives easier, either directly or by working with our partners. Improving developer productivity is a core benefit of using Docker products, and recently one of our partners made an announcement that makes developing cloud-native apps easier.

AWS announced that its customers can now configure their Amazon Elastic Container Service (ECS) applications deployed in Amazon Elastic Compute Cloud (EC2) mode to access Amazon Elastic File System (EFS) file systems. This is good news for Docker developers who use Amazon ECS. It means that Amazon ECS now natively integrates with Amazon EFS to automatically mount shared file systems into Docker containers. This allows you to deploy workloads that require access to shared storage, such as machine learning workloads, containerized legacy apps, or internal DevOps workloads such as GitLab, Jenkins, or Elasticsearch.

The beauty of containerizing your applications is to provide a better way to create, package, and deploy software across different computing environments in a predictable and easy-to-manage way. Containers were originally designed to be stateless and ephemeral (temporary). A stateless application is one that neither reads nor stores information about its state from one time that it is run to the next. A stateful application, on the other hand, can remember some things about its state each time it runs.

Maintaining state in an app means finding a way to connect containers to stateful storage. For example, if you open up your weather app on your mobile device, it remembers your home city as the weather app maintains state. The only way to containerize applications that require state is to connect containers to stateful, persistent storage.
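To make the stateless/stateful distinction concrete, here is a minimal local sketch using a Docker named volume. The volume name, container name, and the mysql:8 image are illustrative choices, not anything mandated by the article:

```shell
# Create a named volume; its data lives outside any single container.
docker volume create app-data

# Mount the volume at MySQL's data directory. Files written under
# /var/lib/mysql persist even if the container is removed.
docker run -d --name db \
  -v app-data:/var/lib/mysql \
  -e MYSQL_ROOT_PASSWORD=example \
  mysql:8
```

Removing the container with `docker rm -f db` and starting a new one against the same `app-data` volume picks up the existing database files. That persistence property, extended to shared storage across an entire cluster, is what the ECS and EFS integration provides.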

“Docker and AWS are collaborating on making the right workloads more easily deployed as stateful containerized applications. Docker’s industry-leading container technology including Docker Desktop and Docker Hub are integral to advancing developer workflows for modern apps. Our customers can now deploy and run Docker containers seamlessly on Amazon ECS and Amazon EFS, enabling development teams to ship apps faster,” according to Justin Graham, Vice President of Products for Docker.

If you are a developer who would like to deploy workloads that require access to shared external storage, highly-available regional storage, or high-throughput storage, then the combination of Amazon ECS and Amazon EFS is your answer. Developers familiar with Amazon ECS can now use the ECS task definition to specify the file system ID and the specific directory that they would like to mount on one or more containers in their task. ECS takes care of mounting the file system on the container so that you can focus on your applications without having to worry about configuring infrastructure.
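As a sketch of what that looks like, here is a minimal ECS task definition fragment that mounts an EFS file system into a container. The family name, container image, file system ID, and paths are placeholders for illustration:

```json
{
  "family": "stateful-app",
  "containerDefinitions": [
    {
      "name": "web",
      "image": "nginx:latest",
      "mountPoints": [
        {
          "sourceVolume": "shared-storage",
          "containerPath": "/usr/share/nginx/html"
        }
      ]
    }
  ],
  "volumes": [
    {
      "name": "shared-storage",
      "efsVolumeConfiguration": {
        "fileSystemId": "fs-12345678",
        "rootDirectory": "/html"
      }
    }
  ]
}
```

Every task that references the `shared-storage` volume reads and writes the same EFS directory, which is what makes the shared, persistent use cases above possible.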

If you are interested in how to actually deploy a stateful container-based application, AWS’ Martin Beeby has a great blog post that walks through how to configure Amazon EFS to add state to your containers running on Amazon ECS. Developers who are interested in learning more about how to get started with Docker can expand their understanding with these additional resources: Docker Desktop and Docker Hub.

The post Deploy Stateful Docker Containers with Amazon ECS and Amazon EFS appeared first on Docker Blog.
Source: https://blog.docker.com/feed/

How do I move data from MySQL to BigQuery?

In a market where streaming analytics is growing in popularity, it’s critical to optimize data processing so you can reduce costs and ensure data quality and integrity. One approach is to focus on working only with data that has changed instead of all available data. This is where change data capture (CDC) comes in handy: CDC is a technique that enables this optimized approach. Those of us working on Dataflow, Google Cloud’s streaming data processing service, developed a sample solution that lets you ingest a stream of changed data coming from any kind of MySQL database on versions 5.6 and above (self-managed, on-prem, etc.) and sync it to a dataset in BigQuery. We made this solution available within the public repository of Dataflow templates. You can find instructions on using the template in the README section of the GitHub repo.

CDC provides a representation of data that has changed in a stream, allowing computations and processing to focus specifically on changed records. CDC can be applied to many use cases, including replication of a critical database, optimization of a real-time analytics job, cache invalidation, synchronization between a transactional data store and a data warehouse-type store, and more.

How Dataflow’s CDC solution moves data from MySQL to BigQuery

The deployed solution works with any MySQL database, which is monitored by a connector we developed based on Debezium. The connector stores table metadata using Data Catalog (Google Cloud’s scalable metadata management service) and pushes updates to Pub/Sub (Google Cloud’s native stream ingestion and messaging technology). A Dataflow pipeline then takes those updates from Pub/Sub and syncs the MySQL database with a BigQuery dataset.

This solution relies on Debezium, an excellent open source tool for CDC. We developed a configurable connector based on this technology that you can run locally or on your own Kubernetes environment to push change data to Pub/Sub.

Using the Dataflow CDC solution

Deploying the solution consists of four steps:

– Deploy your database (nothing to do here if you already have a database)

– Create Pub/Sub topics for each of the tables you want to export

– Deploy our Debezium-based connector

– Start the Dataflow pipeline to consume data from Pub/Sub and synchronize it to BigQuery

Let’s suppose you have a MySQL database running in any environment. For each table in the database that you want to export, you must create a Pub/Sub topic and a corresponding subscription for that topic.

Once the Pub/Sub topics and the database are in place, run the Debezium connector. The connector can run in many environments: locally built from source, via a Docker container, or on a Kubernetes cluster. For detailed instructions on running the Debezium connector and the solution in general, check out the README.

Once the Debezium connector starts running and capturing changes from MySQL, it will push them to Pub/Sub. Using Data Catalog, it will also update the schema for the Pub/Sub topic that corresponds to each MySQL table.

With all of these pieces in place, you can launch the Dataflow pipeline to consume the change data from Pub/Sub and synchronize it to BigQuery tables. The Dataflow job can be launched from the command line. Once the connector and pipeline are running, you just need to monitor their progress and make sure that everything is going smoothly.

Get started today

Got a use case that aligns with Dataflow’s CDC capabilities, such as optimization of an existing real-time analytics job? If so, try it out! First, use this code to get started with building your first CDC pipeline in Dataflow today, and share your feedback with the Dataflow team in the GitHub issue tracker.

At Google Cloud, we’re excited to bring you CDC as an incredibly valuable technique for optimizing streaming data analytics. We look forward to seeing both development and feedback on these new capabilities for Dataflow.
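The per-table Pub/Sub setup described above can be sketched with a couple of gcloud commands. The table name and the `-sub` suffix are hypothetical choices for illustration, not names required by the solution:

```shell
# One topic plus one subscription per exported MySQL table.
# "inventory" is a placeholder table name; repeat for each table you export.
TABLE="inventory"
gcloud pubsub topics create "${TABLE}"
gcloud pubsub subscriptions create "${TABLE}-sub" --topic "${TABLE}"
```

The Debezium-based connector then publishes each table’s change events to its matching topic, and the Dataflow pipeline consumes them from the corresponding subscription.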
Source: Google Cloud Platform

Expert Advice: How to Start Selling on Your Website

Are you just taking your first steps selling a product or service online and don’t know where to begin? Be sure to register for our next 60-minute webinar, where our expert Happiness Engineers will walk you through the basics of eCommerce and show you how to set up your online store.

Date: Thursday, April 16, 2020

Time: 5 p.m. UTC | 1 p.m. EDT | 12 p.m. CDT | 10 a.m. PDT

Cost: Free

Who’s invited: Business owners, entrepreneurs, freelancers, service providers, store owners, and anyone else who wants to sell a product or service online.

Registration link

Hosts Steve Dixon and Maddie Flynn are both veteran Happiness Engineers with years of experience helping business owners and hobbyists build and launch their eCommerce stores. They will provide step-by-step instructions on setting up:

– Simple Payments — perfect for selling a single product or service.

– Recurring Payments — great for subscriptions and donations.

– WooCommerce — ideal for entrepreneurs who want to build an online store and automate sales.

No previous eCommerce experience is necessary, but we recommend a basic familiarity with WordPress.com so you can make the most of the webinar. The presentation will conclude with a 15-20 minute Q&A session, so note down any questions you might have and bring them with you.

Seats are limited, so register now to reserve your spot. See you then!
Source: RedHat Stack