I have a Ruby on Rails app I'm trying to containerize so that I can deploy it with Docker:
version: '3.4'
services:
  db:
    image: postgres
  web:
    container_name: my_rails_app
    build: .
    ports:
      - "3000:3000"
    depends_on:
      - db
As part of the existing build process, a script starts up a test version of the application with a bunch of sample data and takes screenshots of various features for use in the help files. This script is integrated into the Rails asset pipeline, so it runs as part of the normal asset precompile step (the Dockerfile below is somewhat simplified):
FROM ruby:2.2.10 AS build
COPY . /opt/my_rails_app
WORKDIR /opt/my_rails_app
RUN bundle install
# Generates static assets, including screenshots
RUN bundle exec rake assets:precompile
RUN bundle install --deployment --without development test assets

FROM ruby:2.2.10 AS production
ENV RAILS_ENV=production
COPY . /opt/my_rails_app
WORKDIR /opt/my_rails_app
COPY --from=build /opt/my_rails_app/vendor/bundle vendor/bundle
COPY --from=build /opt/my_rails_app/public/assets public/assets
EXPOSE 3000
CMD ["bundle", "exec", "rails", "server"]
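For context, the screenshot task is chained onto the precompile step roughly like this (the task names and bodies here are illustrative stand-ins, not the app's actual code):

```ruby
require "rake"
include Rake::DSL

# Stand-in for the precompile task Rails actually defines.
task "assets:precompile" do
  puts "precompiling assets"
end

namespace :screenshots do
  # In the real app this boots a test server with sample data,
  # drives a browser, and saves PNGs -- hence the database dependency.
  task :generate do
    puts "capturing screenshots"
  end
end

# Chain the screenshot task onto the normal precompile step.
Rake::Task["assets:precompile"].enhance do
  Rake::Task["screenshots:generate"].invoke
end

Rake::Task["assets:precompile"].invoke
# => precompiling assets
# => capturing screenshots
```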
Now here's my problem: because this build step starts the web server in order to take screenshots of the UI, it needs to be able to connect to the database during the build. (The web server can't run properly without a database connection.)
In the old environment this wasn't an issue: the production database was installed on the same server as the application, so I could just connect to localhost. With Docker, though, the database runs in a separate container managed by docker-compose, and unfortunately Docker doesn't seem to want to let me talk to that container during the build:
could not translate host name "db" to address: Name or service not known
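For reference, the database config points at the compose service by name, along these lines (simplified; the real file has more settings):

```yaml
# config/database.yml (simplified)
production:
  adapter: postgresql
  host: db   # the compose service name -- resolvable at run time, but not during the build
  database: my_rails_app_production
```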
I considered just delaying the precompile step until after the deploy, but that would slow down the deploy process for containers significantly and require me to include another 50-100 dependencies in my production image which aren't otherwise used.
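Concretely, deferring the precompile would look something like this (a sketch of the option, not something I want to ship):

```dockerfile
FROM ruby:2.2.10
ENV RAILS_ENV=production
COPY . /opt/my_rails_app
WORKDIR /opt/my_rails_app
# Has to keep the assets group installed, pulling those extra gems into production.
RUN bundle install --deployment --without development test
EXPOSE 3000
# Precompile runs at container start, when the db service is finally reachable.
CMD ["sh", "-c", "bundle exec rake assets:precompile && exec bundle exec rails server"]
```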
I also considered installing a database in the build container itself, but that seems like it would slow builds down considerably, and it could cause issues if that database doesn't exactly match the one the postgres image provides.
Is there a way to tell Docker to start a container and set up the appropriate network connections during the build process? Or is there an alternative approach that avoids the downsides of the solutions I've already considered above?