How to find the max memory usage from docker stats?

Posted 2020-06-30 13:23

With docker stats you can see the memory usage of a container over time.

Is there a way to find what the highest value of memory usage was while running docker stats?

Tags: docker
4 Answers
做自己的国王
#2 · 2020-06-30 13:28

You can use this command:

docker stats --no-stream | awk '{ print $3 }' | sed '1d'|sort | tail -1

It will print the highest memory usage among the running containers.

Explanation of the command:

 --no-stream :          disable streaming stats and pull only the first result
 awk '{ print $3 }' :   print the MEM USAGE column
 sed '1d' :             delete the first entry, which is the leftover "%" from the header line
 sort :                 sort the results
 tail -1 :              keep the last, i.e. highest, entry
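
Note that the plain sort above compares strings, so values like 9.5MiB and 100MiB will not order numerically, and column positions can differ between docker versions. A variant sketch that sidesteps both issues by asking for an explicit --format and using human-numeric sorting (the -h flag assumes GNU sort):

docker stats --no-stream --format '{{.MemUsage}}\t{{.Name}}' \
  | sort -h \
  | tail -1   # line with the largest MEM USAGE, plus the container name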
混吃等死
#3 · 2020-06-30 13:33

I took a sampling script from here and aggregated the data as @pl_rock suggested. But be careful: the sort command only compares string values, so the results are usually wrong (but OK for me). Also note that docker sometimes reports wrong numbers (i.e. more allocated memory than physical RAM).

Here is the script:

#!/bin/bash

"$@" & # Run the given command line in the background.
pid=$!

echo "" > stats   # truncate the stats file

while true; do
  sleep 1
  # Stop sampling once the background command has exited (ps fails for a dead pid).
  sample="$(ps -o rss= $pid 2> /dev/null)" || break

  # Append one timestamped line per container: MEM USAGE / LIMIT, NAME, CONTAINER ID.
  docker stats --no-stream --format "{{.MemUsage}} {{.Name}} {{.Container}}" | awk '{ print strftime("%Y-%m-%d %H:%M:%S"), $0 }' >> stats
done

# For each container id (the 7th field), print one sample per container
# (intended to be the peak; note the string-sort caveat above).
for containerid in $(awk '/.+/ { print $7 }' stats | sort | uniq)
do
    grep "$containerid" stats | sort -r -k3 | tail -n 1
    # probably more correct: | sort -r -k3 -h | head -n 1 (human-numeric sort; untested)
done
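
Usage would look roughly like this (monitor.sh and the test command are placeholder names, not part of the original answer):

chmod +x monitor.sh
./monitor.sh npm test   # runs "npm test" in the background and samples docker stats once per second into ./stats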
Lonely孤独者°
#4 · 2020-06-30 13:37

In my case I wanted to monitor a docker container which runs the tests for my web application. The test suite is pretty big: it includes JavaScript tests in a real browser and consumes a significant amount of both memory and time.

Ideally, I wanted to watch the current memory usage in real time, but also to keep the history for later analysis.

I ended up with a modified and simplified version of Keiran's solution:

CONTAINER=$(docker ps -q -f name=CONTAINER_NAME)
FORMAT='{{.MemPerc}}\t{{.MemUsage}}\t{{.Name}}'

docker stats --format $FORMAT $CONTAINER | sed -u 's/\x1b\[[0-9;]*[a-zA-Z]//g' | tee stats

Notes:

  • CONTAINER=$(docker ps -q -f name=NAME) # find container by name, but there are other options
  • FORMAT='{{.MemPerc}} ...}} # MemPerc goes first (for sorting); otherwise you can be creative
  • sed -u # the -u flag is important, it turns off buffering
  • | sed -u 's/\x1b\[[0-9;]*[a-zA-Z]//g' # removes ANSI escape sequences
  • | tee stats # not only show real time, but also write into the stats file
  • I hit Ctrl-C manually when the run is done – not ideal, but OK for me
  • after that it's easy to find the max with something like sort -n stats | tail (see the sketch below)
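
For example, since MemPerc is the first column of the recorded file, a plain numeric sort is enough; a small sketch against the stats file produced above:

sort -n stats | tail -n 1                   # highest MEM % seen, with its MEM USAGE and container name
sort -t$'\t' -k2,2 -h stats | tail -n 1     # same idea, sorting the MemUsage column human-numerically instead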
Explosion°爆炸
#5 · 2020-06-30 13:46

If you need to find the peak usage, you are better off requesting the .MemPerc option and calculating based on the total memory (unless you restricted the memory available to the container). .MemUsage has units which change during the life of the container (e.g. KiB, MiB, GiB), and that messes with the result when you try to sort it.

docker stats --format 'CPU: {{.CPUPerc}}\tMEM: {{.MemPerc}}'

You can stream an ongoing log to a file (or script).

To get just the max memory as originally requested:

(timeout 120 docker stats --format '{{.MemPerc}}' <CONTAINER_ID> \
  | sed 's/\x1b\[[0-9;]*[a-zA-Z]//g' ; echo) \
  | tr -d '%' | sort -k1,1n | tail -n 1

And then you can ask the system for its total RAM (again assuming you didn't limit the RAM available to docker) and calculate:

awk '/MemTotal/ {print $2}' /proc/meminfo
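
Putting the two commands together, a rough sketch of the absolute peak in MiB (PEAK_PERC and TOTAL_KB are just illustrative variable names; /proc/meminfo reports kB):

PEAK_PERC=$( (timeout 120 docker stats --format '{{.MemPerc}}' <CONTAINER_ID> \
  | sed 's/\x1b\[[0-9;]*[a-zA-Z]//g' ; echo) | tr -d '%' | sort -k1,1n | tail -n 1 )
TOTAL_KB=$(awk '/MemTotal/ {print $2}' /proc/meminfo)
# peak MiB = total RAM (kB) * peak percentage / 100 / 1024
awk -v p="$PEAK_PERC" -v t="$TOTAL_KB" 'BEGIN { printf "%.1f MiB\n", t * p / 100 / 1024 }'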

You would need to know how long the container is going to run when using timeout as above; alternatively, a script could run docker stats in the background without the timeout and kill it once the container has completed (see the sketch below).
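
If the runtime is not known in advance, one possible sketch of that approach (not from the original answer; memperc.raw is an arbitrary scratch file) uses docker wait to detect the end of the container:

docker stats --format '{{.MemPerc}}' <CONTAINER_ID> > memperc.raw &
STATS_PID=$!                      # PID of the background docker stats
docker wait <CONTAINER_ID>        # blocks until the container exits
kill "$STATS_PID"
sed 's/\x1b\[[0-9;]*[a-zA-Z]//g' memperc.raw | tr -d '%' | sort -n | tail -n 1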

...

This command allows you to generate a time-series of the cpu/memory load:

(timeout 20 docker stats --format \
  'CPU: {{.CPUPerc}}\tMEM: {{.MemPerc}}' <CONTAINER_ID> \
  | sed 's/\x1b\[[0-9;]*[a-zA-Z]//g' ; echo) \
  | gzip -c > monitor.log.gz

Note that it pipes into gzip: in this form you get roughly 2 rows per second, so the file would grow quickly if you didn't compress it.
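
Afterwards the peak can be pulled back out of the compressed log, for example (a sketch matching the CPU/MEM format string above):

zcat monitor.log.gz | sed 's/.*MEM: //' | tr -d '%' | sort -n | tail -n 1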

I'd advise this for benchmarking and troubleshooting rather than for use on production containers.
