How to deploy an Angular application and a REST Api

Published 2020-05-03 09:47

Question:

I have a very similar question to this one.

I have an Angular application that collects data, which is then processed via a REST Api. I can happily dockerize both applications and they run fine locally. However, when I try to deploy them to make them accessible from "everywhere", I can only reach the front end; the connection to the REST Api is not functional.

Inside my Angular app, I have a file baseurl.ts that just contains:

export const baseURL = 'http://localhost:3000/';
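
For context, a typical Angular service would consume this constant roughly like the sketch below. The FeedbackService name, the import path, and the 'feedback' endpoint are assumptions for illustration, not taken from the question:

// feedback.service.ts -- hypothetical service, only to show how baseURL is used
import { Injectable } from '@angular/core';
import { HttpClient, HttpHeaders } from '@angular/common/http';
import { Observable } from 'rxjs';
import { baseURL } from './shared/baseurl';   // import path is an assumption

@Injectable({ providedIn: 'root' })
export class FeedbackService {
  constructor(private http: HttpClient) {}

  // POST the collected form data to the REST Api; 'feedback' is a hypothetical endpoint
  submitFeedback(feedback: any): Observable<any> {
    const httpOptions = {
      headers: new HttpHeaders({ 'Content-Type': 'application/json' })
    };
    return this.http.post(baseURL + 'feedback', feedback, httpOptions);
  }
}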

I make the application production-ready using:

ng build --prod

which creates the dist folder. I then build the following Docker image (Dockerfile taken from here):

FROM node:alpine AS builder

ARG PROXY=http://myproxy
ARG NOPROXY=localhost,127.0.0.1

ENV http_proxy=${PROXY}
ENV https_proxy=${PROXY}
ENV NO_PROXY=${NOPROXY}
ENV no_proxy=${NOPROXY}

WORKDIR /app

COPY . .

RUN npm install && \
    npm run build

FROM nginx:alpine

COPY --from=builder /app/dist/* /usr/share/nginx/html/

I build the image using

docker build -t form_angular:v1 .

and run it using

docker run -d -p 8088:80 form_angular:v1

The second Dockerfile for the REST Api looks like this:

FROM continuumio/miniconda3

ARG PROXY=http://myproxy
ARG NOPROXY=localhost,127.0.0.1

ENV http_proxy=${PROXY}
ENV https_proxy=${PROXY}
ENV NO_PROXY=${NOPROXY}
ENV no_proxy=${NOPROXY} 

COPY my_environment.yml my_environment.yml
SHELL ["/bin/bash", "-c"] 

RUN echo "Using proxy $PROXY" \
    && touch /etc/apt/apt.conf \
    && echo "Acquire::http::Proxy \"$PROXY\";" >> /etc/apt/apt.conf \
    && cat /etc/apt/apt.conf \
    && apt-get -q -y update \
    && DEBIAN_FRONTEND=noninteractive apt-get -q -y upgrade \
    && apt-get -q -y install \
       build-essential \
    && apt-get -q clean \
    && rm -rf /var/lib/apt/lists/*

RUN ["conda", "env", "create", "-f", "my_environment.yml"]
COPY user_feedback.py user_feedback.py
CMD source activate my_environment; gunicorn -b 0.0.0.0:3000 user_feedback:app

Building:

docker build -t form_rest:latest .

Running:

docker run --name form_rest -d -p 3000:3000 form_rest:latest

As I said, this all works as expected when running on localhost. How do I now make these two containers talk to each other for a "global" deployment?

Answer 1:

Your baseURL is hardcoded to localhost. For a "global" deployment you would need to change the baseURL to point to the global endpoint of your REST Api. That requires you to know the global endpoint, and it would need to be static.
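
A minimal sketch of that first option, with a placeholder hostname (my-public-host is not from the question, just an illustration):

// baseurl.ts -- option 1: point at the publicly reachable REST Api (placeholder host)
export const baseURL = 'http://my-public-host:3000/';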

Another option would be to set baseURL to /api for prod and configure the nginx that serves the Angular app to proxy /api to your REST Api. You would need to link the containers for that to work, but you wouldn't need to expose a public port on the REST Api container; it would only be reachable through the nginx proxy.
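
On the Angular side, one way to wire that up is with the Angular CLI environment files instead of a single hardcoded constant. A sketch, assuming the standard environments/ layout (the file names and re-export are assumptions; only the '/api/' prefix is tied to the nginx proxy shown below):

// environments/environment.ts -- used during local development (ng serve)
export const environment = {
  production: false,
  baseURL: 'http://localhost:3000/'
};

// environments/environment.prod.ts -- swapped in by ng build --prod via fileReplacements
export const environment = {
  production: true,
  baseURL: '/api/'   // relative path, proxied by nginx to the REST Api container
};

// baseurl.ts -- re-export so existing imports keep working
import { environment } from '../environments/environment';
export const baseURL = environment.baseURL;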

I use the nginx proxy option for my projects and use docker-compose to handle all the linking and inter-container communication.

Example docker-compose.yml and nginx.conf files. These are adapted from what I'm currently using; I think they should work for you.

docker-compose.yml

version: '3.4'
services:
  nginx:
    container_name: nginx
    image: form_angular
    build:
      context: .
      dockerfile: <path to angular/nginx dockerfile>
    ports:
      - 8088:80
    networks:
      - my-network
  restapi:
    container_name: restapi
    image: form_rest
    build:
      context: .
      dockerfile: <path to rest dockerfile>
    networks:
      - my-network
networks:
  my-network:
    driver: bridge

nginx.conf (this file needs to be baked into the image, e.g. with COPY nginx.conf /etc/nginx/nginx.conf in the Angular Dockerfile, otherwise nginx keeps its default configuration):

events {
  worker_connections 1024;
}
http {
  upstream api {
    server restapi:3000;
  }
  server {
    server_name nginx;
    root /usr/share/nginx/html;
    index index.html;
    include /etc/nginx/mime.types;
    location /api/ {
      proxy_pass http://api;
      proxy_http_version 1.1;
      proxy_set_header Upgrade $http_upgrade;
      proxy_set_header Connection 'upgrade';
      proxy_set_header Host $host;
      proxy_set_header X-Real-IP $remote_addr;
      proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
      proxy_set_header X-NginX-Proxy true;
      proxy_cache_bypass $http_upgrade;
    }
    location /assets/ {
      access_log off;
      expires 1d;
    }
    location ~ \.(css|js|svg|ico)$ {
      access_log off;
      expires 1d;
    }
    location / {
      try_files $uri /index.html;
    }
  }
}


Answer 2:

When you use localhost inside a container, it refers to the container itself, not the host running the container. So if you point to "localhost" from a second container (the UI in your case), that container will look at itself and won't find the API.

One option to solve your problem is to make your containers reachable by name.

The easiest way to do that, in your case, is to use docker-compose, e.g.:

version: '3'
services:
  angular:
    image: "form_angular:v1"
    container_name: "form_angular"
    ports:
      - "8088:80"
    depends_on:
      - restapi   # start order only; name resolution comes from the default compose network

  restapi:
    image: "form_rest:latest"
    container_name: "form_rest"
    ports:
      - "3000:3000"

And with that, the angular container can reach the restapi container by its service name restapi (resolved via Docker's built-in DNS on the compose network), and restapi can reach angular the same way.

I suggest you read more about docker-compose at https://docs.docker.com/compose/

It is very versatile and easy, and you can live with it for a long time, until you decide to build your own cloud ;)