Hi,
We were trying out the hard CPU limit described in the release notes:
https://github.com/cloudfoundry/garden-runc-release/releases?after=v1.9.2
According to the formula there, for a single-core machine:
APP_MEM_IN_MB * 100us / 1000 = MILLISECONDS_PER_100_MILLISECOND_PERIOD
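To make the expected numbers concrete, here is a quick sanity check (a sketch in Python; the one-share-per-MB-of-memory mapping is our assumption, based on the formula above):

    # Expected hard CPU cap per the docs formula,
    # with cpu_quota_per_share_in_us = 100.
    QUOTA_PER_SHARE_US = 100
    PERIOD_US = 100_000  # the 100ms CFS period

    for app_mem_mb in (64, 1024):
        shares = app_mem_mb  # assumption: 1 CPU share per MB of app memory
        quota_us = shares * QUOTA_PER_SHARE_US
        print(f"{app_mem_mb}MB -> {quota_us / PERIOD_US:.1%} of one core")
    # 64MB   -> 6.4% of one core
    # 1024MB -> 102.4% (~100%) of one core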
In our tests, to get 6.4% CPU usage for a 64MB application and ~100% for a 1GB application, we had to set 'cpu_quota_per_share_in_us' to 3200. (The cell has 4 cores and 16GB of RAM, with an overcommit factor of 2.)
That changes the formula to:
APP_MEM_IN_MB * 100us / 32000 = MILLISECONDS_PER_100_MILLISECOND_PERIOD
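Numerically, the factor of 32 falls out of our measurements like this (same one-share-per-MB assumption as above):

    # What the docs formula predicts for our setting vs. what we measured.
    quota_per_share_us = 3200
    period_us = 100_000

    predicted_pct = 64 * quota_per_share_us / period_us * 100  # 204.8%
    observed_pct = 6.4                                         # measured
    print(predicted_pct / observed_pct)                        # 32.0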
Can you help us understand where this factor of 32 comes from? Is it the total available RAM of the cell (16GB * overcommit of 2 = 32GB), and does the number of CPU cores really not matter?
Thanks and regards,
Michael