I’m new to DevOps and trying to figure out deployment automation for my web app.
I created a basic full-stack project and got it running on AWS EC2 (free tier only).
My current setup works like this:
- GitHub Actions compiles everything and pushes a .jar artifact to an S3 bucket
- AWS CodeDeploy grabs the .jar from S3 and installs it on my EC2 server
- Docker containers handle both frontend (Next.js) and backend (Spring Boot) services
- Nginx acts as a reverse proxy to route traffic to the right container (a rough sketch of the config follows this list)
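The Nginx part is roughly this (a simplified sketch; the ports 3000 for Next.js and 8080 for Spring Boot are just example values, not necessarily what the real containers expose):

```nginx
# /etc/nginx/conf.d/app.conf (sketch; ports are example values)
server {
    listen 80;

    # API traffic goes to the Spring Boot container
    location /api/ {
        proxy_pass http://localhost:8080;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }

    # Everything else goes to the Next.js container
    location / {
        proxy_pass http://localhost:3000;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }
}
```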
Questions I have:
- Does this approach make sense for automated deployments when using just one EC2 free tier instance?
- My EC2 storage keeps filling up. Is the GitHub Actions → S3 → CodeDeploy → Docker workflow causing this?
- What are good practices or other options for simple, continuous CI/CD on EC2 FREE TIER?
This is just a learning project, but I want it to stay stable long-term even though it’s basic.
Looking for suggestions to make this better, or important concepts I should learn about managing deployment pipelines on EC2.
your workflow’s solid for learning, but storage will definitely bite you. use BuildKit for builds (it handles intermediate layers more sensibly than the legacy builder’s --rm behavior) and set up a weekly cron job to prune old images - something like the sketch below. skip s3 and go with GitHub Container Registry - it’s free for public repos and plays nicer with Actions.
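if it helps, a weekly cleanup can be as simple as this (just a sketch - the schedule and the “older than a week” filter are my guesses, tune them to how much history you want to keep):

```
# /etc/cron.d/docker-cleanup (sketch; schedule and filter are guesses)
# Every Sunday at 03:00, remove stopped containers and images unused
# for more than a week, without prompting.
0 3 * * 0 root docker system prune -af --filter "until=168h" >> /var/log/docker-prune.log 2>&1
```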
Your storage problem is probably Docker layer caching and old containers piling up. Every deployment creates new layers, and Docker never cleans them up unless you tell it to. Adding a docker system prune command to your deployment scripts fixes this.

The S3 step actually works well for keeping CI and CD separate. But since you’re only running one EC2 instance, you’d save resources by having GitHub Actions build and push images straight to ECR, then pulling them during deployment. That cuts down on artifact storage and simplifies the workflow.

To stay within free-tier limits, watch your instance metrics and use Docker multi-stage builds to shrink your final images (rough sketch below). Smaller images mean less storage used and faster deployments.
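For the multi-stage idea, a backend Dockerfile could look roughly like this (a sketch under assumptions: Maven as the build tool, Java 17, and the jar ending up under target/ - adjust to your actual setup):

```dockerfile
# Stage 1: build the jar (everything in this stage is dropped from the final image)
FROM maven:3.9-eclipse-temurin-17 AS build
WORKDIR /app
COPY pom.xml .
RUN mvn dependency:go-offline       # cache dependencies in their own layer
COPY src ./src
RUN mvn package -DskipTests

# Stage 2: run on a slim JRE-only image
FROM eclipse-temurin:17-jre
WORKDIR /app
COPY --from=build /app/target/*.jar app.jar
EXPOSE 8080
ENTRYPOINT ["java", "-jar", "app.jar"]
```

The final image contains only the JRE and the jar, not Maven or your source tree, so each deployment adds far less to the disk.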
interesting setup! have you tried cleaning up old docker images after deployments? that might be what’s using up your storage. also, why not pull directly from GitHub during deployment instead of using S3? it seems like an extra step you could skip on the free tier.
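if you do switch to pulling images from a registry, the deploy step on the instance can shrink to something like this (a sketch - the GHCR image names and the compose setup are placeholders, not your real ones):

```bash
# deploy.sh on the EC2 instance (sketch; image names are placeholders)
docker pull ghcr.io/<your-user>/backend:latest
docker pull ghcr.io/<your-user>/frontend:latest
docker compose up -d     # recreate containers on the newly pulled images
docker image prune -f    # drop the now-unused old images right away
```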