# @actions/cache
Functions necessary for caching dependencies and build outputs to improve workflow execution time.
See "Caching dependencies to speed up workflows" for how caching works.
Note that GitHub will remove any cache entries that have not been accessed in over 7 days. There is no limit on the number of caches you can store, but the total size of all caches in a repository is limited to 10 GB. If you exceed this limit, GitHub will save your cache but will begin evicting caches until the total size is less than 10 GB.
## Usage
This package is used by the v2+ versions of our first-party cache action. You can find an example implementation in the cache repo here.
### Save Cache
Saves a cache containing the files in `paths` using the `key` provided. The files are compressed using the zstd compression algorithm if zstd is installed; otherwise, gzip is used. The function returns the cache ID if the cache was saved successfully and throws an error if the cache upload fails.
```js
const cache = require('@actions/cache');
const paths = [
    'node_modules',
    'packages/*/node_modules/'
]
const key = 'npm-foobar-d5ea0750'
const cacheId = await cache.saveCache(paths, key)
```
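Because `saveCache` throws when the upload fails, a caller that should not fail the whole job on a cache error can wrap the call in a `try`/`catch`. A minimal sketch, assuming the logging via `@actions/core` (which is not part of this package) is acceptable in your action:

```js
const cache = require('@actions/cache');
const core = require('@actions/core');

async function trySaveCache(paths, key) {
    try {
        // Returns the numeric cache ID on success
        const cacheId = await cache.saveCache(paths, key);
        core.info(`Cache saved with id ${cacheId}`);
        return cacheId;
    } catch (error) {
        // Upload failures throw; log a warning and continue instead of failing the job
        core.warning(`Saving cache failed: ${error.message}`);
        return undefined;
    }
}
```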
### Restore Cache
Restores a cache based on `key` and `restoreKeys` to the `paths` provided. The function returns the cache key on a cache hit and returns `undefined` if the cache is not found.
```js
const cache = require('@actions/cache');
const paths = [
    'node_modules',
    'packages/*/node_modules/'
]
const key = 'npm-foobar-d5ea0750'
const restoreKeys = [
    'npm-foobar-',
    'npm-'
]
const cacheKey = await cache.restoreCache(paths, key, restoreKeys)
```
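Since `restoreCache` resolves to `undefined` on a cache miss rather than throwing, callers typically branch on the return value. A short sketch, assuming the same `paths`, `key`, and `restoreKeys` as above; comparing the returned key to the primary key to distinguish exact from prefix matches is one possible convention:

```js
const cacheKey = await cache.restoreCache(paths, key, restoreKeys);
if (cacheKey === undefined) {
    // Cache miss: proceed without the cache, e.g. install dependencies from scratch
    console.log('No cache found, continuing without a restore');
} else if (cacheKey === key) {
    // Exact hit on the primary key
    console.log(`Exact cache hit on ${cacheKey}`);
} else {
    // Partial hit via one of the restoreKeys prefixes
    console.log(`Partial cache hit on ${cacheKey}`);
}
```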
## Cache segment restore timeout
A cache is downloaded in multiple segments of fixed size (`1GB` for a `32-bit` runner and `2GB` for a `64-bit` runner). Sometimes, a segment download gets stuck, which causes the workflow job to hang indefinitely and fail. Version `v3.0.4` of the cache package introduces a segment download timeout, which allows a stuck segment download to be aborted so that the job can proceed with a cache miss.

The default value of this timeout is 60 minutes. It can be customized by specifying an environment variable named `SEGMENT_DOWNLOAD_TIMEOUT_MINS` with the timeout value in minutes.
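Because the timeout is read from the environment, it can be set in the process before invoking `restoreCache`. A minimal sketch, assuming the variable is read at download time; the 30-minute value is an arbitrary example:

```js
const cache = require('@actions/cache');

// Abort any cache segment download that takes longer than 30 minutes
// (30 is an example value; the default is 60)
process.env['SEGMENT_DOWNLOAD_TIMEOUT_MINS'] = '30';

const cacheKey = await cache.restoreCache(['node_modules'], 'npm-foobar-d5ea0750');
```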