# `@actions/cache`

> Functions necessary for caching dependencies and build outputs to improve workflow execution time.
See "Caching dependencies to speed up workflows" for how caching works.
Note that GitHub will remove any cache entries that have not been accessed in over 7 days. There is no limit on the number of caches you can store, but the total size of all caches in a repository is limited to 10 GB. If you exceed this limit, GitHub will save your cache but will begin evicting caches until the total size is less than 10 GB.
## Usage
This package is used by the v2+ versions of our first-party cache action. You can find an example implementation in the [cache repo](https://github.com/actions/cache).
### Save Cache
Saves a cache containing the files in `paths` using the `key` provided. The files are compressed with the zstandard (zstd) algorithm if zstd is installed, otherwise gzip is used. The function returns the cache id if the cache was saved successfully and throws an error if the cache upload fails.
```js
const cache = require('@actions/cache');
const paths = [
    'node_modules',
    'packages/*/node_modules/'
]
const key = 'npm-foobar-d5ea0750'
const cacheId = await cache.saveCache(paths, key)
```
### Restore Cache
Restores a cache based on `key` and `restoreKeys` to the `paths` provided. The function returns the cache key on a cache hit and `undefined` if no cache is found.
```js
const cache = require('@actions/cache');
const paths = [
    'node_modules',
    'packages/*/node_modules/'
]
const key = 'npm-foobar-d5ea0750'
const restoreKeys = [
    'npm-foobar-',
    'npm-'
]
const cacheKey = await cache.restoreCache(paths, key, restoreKeys)
```
### Cache segment restore timeout
Starting with `v3.0.5` of `actions/toolkit`, if an issue occurs while downloading the cache, the download is aborted by default after 1 hour whenever any segment fails to download completely. A segment is limited to `1 GB` on a 32-bit runner and `2 GB` on a 64-bit runner, so a cache larger than one segment is downloaded as multiple segments in sequence. To customise the segment download timeout, set the environment variable `SEGMENT_DOWNLOAD_TIMEOUT_MINS` to the desired timeout in minutes. This keeps the download from hanging indefinitely and allows the workflow to proceed to the next step.
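For illustration, here is a minimal sketch of using the timeout together with `restoreCache`. It assumes the library reads `SEGMENT_DOWNLOAD_TIMEOUT_MINS` from the process environment when the download starts (in a workflow file you would typically set it via the `env:` block instead), and the 10-minute value is only an example:

```js
const cache = require('@actions/cache');

// Assumption: the variable is read from the environment at download time,
// so setting it before calling restoreCache applies the custom timeout.
process.env['SEGMENT_DOWNLOAD_TIMEOUT_MINS'] = '10'

const paths = ['node_modules']
const key = 'npm-foobar-d5ea0750'
const cacheKey = await cache.restoreCache(paths, key)
if (!cacheKey) {
    // Treat an aborted or missing download as a normal cache miss and continue.
    console.log('Cache not restored; continuing without it')
}
```

An aborted segment download is generally surfaced as a cache miss rather than a failed step, so the workflow simply rebuilds whatever would have been restored.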