mirror of https://github.com/actions/toolkit
feat: adds warp-cache
parent fe3e7ce9a7
commit c7fc05d955
@ -0,0 +1,9 @@
The MIT License (MIT)

Copyright 2019 GitHub

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
@ -0,0 +1,51 @@
# `@actions/cache`

> Functions necessary for caching dependencies and build outputs to improve workflow execution time.

See ["Caching dependencies to speed up workflows"](https://docs.github.com/en/actions/using-workflows/caching-dependencies-to-speed-up-workflows) for how caching works.

Note that GitHub will remove any cache entries that have not been accessed in over 7 days. There is no limit on the number of caches you can store, but the total size of all caches in a repository is limited to 10 GB. If you exceed this limit, GitHub will save your cache but will begin evicting caches until the total size is less than 10 GB.

## Usage

This package is used by the v2+ versions of our first-party cache action. You can find an example implementation in the [cache repo](https://github.com/actions/cache).

#### Save Cache

Saves a cache containing the files in `paths` using the `key` provided. The files are compressed with the Zstandard algorithm if zstd is installed, otherwise gzip is used. The function returns the cache ID if the cache was saved successfully and throws an error if the cache upload fails.

```js
const cache = require('@actions/cache');
const paths = [
    'node_modules',
    'packages/*/node_modules/'
]
const key = 'npm-foobar-d5ea0750'
const cacheId = await cache.saveCache(paths, key)
```

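Because `saveCache` throws when the upload fails, callers that treat caching as best-effort usually wrap the call. A minimal sketch (the log messages are illustrative, and `@actions/core` is used here only for logging):

```js
const cache = require('@actions/cache');
const core = require('@actions/core');

try {
  // Same paths/key as above; saveCache throws if the cache upload fails
  const cacheId = await cache.saveCache(['node_modules'], 'npm-foobar-d5ea0750')
  core.info(`Cache saved with id ${cacheId}`)
} catch (error) {
  // Treat a failed save as non-fatal for the workflow
  core.warning(`Cache save failed: ${error.message}`)
}
```
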
#### Restore Cache

Restores a cache based on `key` and `restoreKeys` to the `paths` provided. The function returns the matched cache key on a cache hit, and returns `undefined` if no cache is found.

```js
const cache = require('@actions/cache');
const paths = [
    'node_modules',
    'packages/*/node_modules/'
]
const key = 'npm-foobar-d5ea0750'
const restoreKeys = [
    'npm-foobar-',
    'npm-'
]
const cacheKey = await cache.restoreCache(paths, key, restoreKeys)
```

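Since a miss is reported through an `undefined` return value rather than an error, a typical caller branches on the result. A minimal sketch continuing the example above:

```js
const cacheKey = await cache.restoreCache(paths, key, restoreKeys)
if (cacheKey === undefined) {
  // Cache miss: continue without restored dependencies
  console.log('No cache found for the given keys')
} else {
  console.log(`Cache restored from key: ${cacheKey}`)
}
```
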
##### Cache segment restore timeout

A cache is downloaded in multiple segments of a fixed size (now `128MB` to fail fast; previously `1GB` for a `32-bit` runner and `2GB` for a `64-bit` runner were used). Occasionally a segment download gets stuck, which causes the workflow job to hang and eventually fail. Version `v3.0.4` of the cache package introduced a segment download timeout, which lets a stuck segment download be aborted so the job can proceed with a cache miss.

The default value of this timeout is 10 minutes (starting with `v3.2.1`; it was 60 minutes in versions `v3.0.4` through `v3.2.0`, both inclusive) and can be customized by setting an [environment variable](https://docs.github.com/en/actions/learn-github-actions/environment-variables) named `SEGMENT_DOWNLOAD_TIMEOUT_MINS` to the timeout value in minutes.
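When calling this package directly, the same override can be applied programmatically, since the download options read `SEGMENT_DOWNLOAD_TIMEOUT_MINS` when they are constructed (the options tests later in this change exercise exactly this). A minimal sketch:

```js
const cache = require('@actions/cache');

// Abort a stuck segment download after 5 minutes instead of the default
process.env['SEGMENT_DOWNLOAD_TIMEOUT_MINS'] = '5'

const cacheKey = await cache.restoreCache(['node_modules'], 'npm-foobar-d5ea0750')
```
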
@ -0,0 +1,166 @@
# @actions/cache Releases

### 0.1.0

- Initial release

### 0.2.0

- Fixes issues with the zstd compression algorithm on Windows and Ubuntu 16.04 [#469](https://github.com/actions/toolkit/pull/469)

### 0.2.1

- Fix to await async function getCompressionMethod

### 1.0.0

- Downloads Azure-hosted caches using the Azure SDK for speed and reliability
- Displays download progress
- Includes changes that break compatibility with earlier versions, including:
  - `retry`, `retryTypedResponse`, and `retryHttpClientResponse` moved from `cacheHttpClient` to `requestUtils`

### 1.0.1

- Fix bug in downloading large files (> 2 GB) with the Azure SDK

### 1.0.2

- Use posix archive format to add support for some tools

### 1.0.3

- Use http-client v1.0.9
- Fixes error handling so retries are not attempted on non-retryable errors (409 Conflict, for example)
- Adds 5 second delay between retry attempts

### 1.0.4

- Use @actions/core v1.2.6
- Fixes uploadChunk to throw an error if any unsuccessful response code is received

### 1.0.5

- Fix to ensure Windows cache paths get resolved correctly

### 1.0.6

- Make caching more verbose [#650](https://github.com/actions/toolkit/pull/650)
- Use GNU tar on macOS if available [#701](https://github.com/actions/toolkit/pull/701)

### 1.0.7

- Fixes permissions issue extracting archives with GNU tar on macOS ([issue](https://github.com/actions/cache/issues/527))

### 1.0.8

- Increase the allowed artifact cache size from 5GB to 10GB ([issue](https://github.com/actions/cache/discussions/497))

### 1.0.9

- Use @azure/ms-rest-js v2.6.0
- Use @azure/storage-blob v12.8.0

### 1.0.10

- Update `lockfileVersion` to `v2` in `package-lock.json` [#1022](https://github.com/actions/toolkit/pull/1022)

### 1.0.11

- Fix file downloads > 2GB ([issue](https://github.com/actions/cache/issues/773))

### 2.0.0

- Added support to check whether the Actions cache service feature is available [#1028](https://github.com/actions/toolkit/pull/1028)

### 2.0.3

- Update to v2.0.0 of `@actions/http-client`

### 2.0.4

- Update to v2.0.1 of `@actions/http-client` [#1087](https://github.com/actions/toolkit/pull/1087)

### 2.0.5

- Fix to avoid saving an empty cache when no files are available for caching ([issue](https://github.com/actions/cache/issues/624))

### 2.0.6

- Fix the `Tar failed with error: The process '/usr/bin/tar' failed with exit code 1` issue when the temp directory where the tar is created is a subdirectory of a path the user asked to cache ([issue](https://github.com/actions/cache/issues/689))

### 3.0.0

- Updated actions/cache to suppress Actions cache server errors and log a warning for those errors [#1122](https://github.com/actions/toolkit/pull/1122)

### 3.0.1

- Fix [#833](https://github.com/actions/cache/issues/833) - cache doesn't work with the GitHub workspace directory
- Fix [#809](https://github.com/actions/cache/issues/809) - `zstd -d: no such file or directory` error on AWS self-hosted runners

### 3.0.2

- Added a 1 hour timeout for the download-stuck issue [#810](https://github.com/actions/cache/issues/810)

### 3.0.3

- Bug fixes for the download-stuck issue [#810](https://github.com/actions/cache/issues/810)

### 3.0.4

- Fix zstd not working on Windows with GNU tar, issues [#888](https://github.com/actions/cache/issues/888) and [#891](https://github.com/actions/cache/issues/891)
- Allow users to provide a custom timeout for aborting the download of a cache segment via the environment variable `SEGMENT_DOWNLOAD_TIMEOUT_MINS`. Default is 60 minutes.

### 3.0.5

- Update `@actions/cache` to use `@actions/core@^1.10.0`

### 3.0.6

- Added `@azure/abort-controller` to dependencies to fix a compatibility issue with ESM [#1208](https://github.com/actions/toolkit/issues/1208)

### 3.1.0-beta.1

- Update actions/cache on Windows to use GNU tar and zstd by default, and fall back to bsdtar and zstd if GNU tar is not available ([issue](https://github.com/actions/cache/issues/984))

### 3.1.0-beta.2

- Added support for falling back to gzip to restore old caches on Windows

### 3.1.0-beta.3

- Bug fixes for the gzip fallback that restores old caches on Windows and the bsdtar fallback when GNU tar is not available

### 3.1.0

- Update actions/cache on Windows to use GNU tar and zstd by default
- Update actions/cache on Windows to fall back to bsdtar and zstd if GNU tar is not available
- Added support for falling back to gzip to restore old caches on Windows

### 3.1.1

- Reverted the changes in 3.1.0 to fix an issue with symlink restoration on Windows
- Added verbose logging of the cache version during a cache miss

### 3.1.2

- Fix issue with symlink restoration on Windows

### 3.1.3

- Fix to prevent the MSYS environment variable from being set globally [#1329](https://github.com/actions/toolkit/pull/1329)

### 3.1.4

- Fix zstd not being used due to a `zstd --version` output change in the zstd 1.5.4 release. See [#1353](https://github.com/actions/toolkit/pull/1353)

### 3.2.0

- Add `lookupOnly` to the cache restore `DownloadOptions`

### 3.2.1

- Updated @azure/storage-blob to `v12.13.0`

### 3.2.2

- Add a new default cache download method to improve performance and reduce hangs [#1484](https://github.com/actions/toolkit/pull/1484)
@ -0,0 +1,5 @@
name: 'Set env variables'
description: 'Sets certain env variables so that e2e restore and save cache can be tested in a shell'
runs:
  using: 'node12'
  main: 'index.js'
@ -0,0 +1 @@
hello world
@ -0,0 +1,14 @@
// Certain env variables are not set by default in a shell context and are only available in a node context from a running action
// In order to be able to restore and save cache e2e in a shell when running CI tests, we need these env variables set
const fs = require('fs');
const os = require('os');

const filePath = process.env[`GITHUB_ENV`]
fs.appendFileSync(filePath, `ACTIONS_RUNTIME_TOKEN=${process.env.ACTIONS_RUNTIME_TOKEN}${os.EOL}`, {
  encoding: 'utf8'
})
fs.appendFileSync(filePath, `ACTIONS_CACHE_URL=${process.env.ACTIONS_CACHE_URL}${os.EOL}`, {
  encoding: 'utf8'
})
fs.appendFileSync(filePath, `GITHUB_RUN_ID=${process.env.GITHUB_RUN_ID}${os.EOL}`, {
  encoding: 'utf8'
})
@ -0,0 +1,14 @@
import * as cache from '../src/cache'

test('isFeatureAvailable returns true if server url is set', () => {
  try {
    process.env['ACTIONS_CACHE_URL'] = 'http://cache.com'
    expect(cache.isFeatureAvailable()).toBe(true)
  } finally {
    delete process.env['ACTIONS_CACHE_URL']
  }
})

test('isFeatureAvailable returns false if server url is not set', () => {
  expect(cache.isFeatureAvailable()).toBe(false)
})
@ -0,0 +1,167 @@
import {downloadCache, getCacheVersion} from '../src/internal/cacheHttpClient'
import {CompressionMethod} from '../src/internal/constants'
import * as downloadUtils from '../src/internal/downloadUtils'
import {DownloadOptions, getDownloadOptions} from '../src/options'

jest.mock('../src/internal/downloadUtils')

test('getCacheVersion with one path returns version', async () => {
  const paths = ['node_modules']
  const result = getCacheVersion(paths, undefined, true)
  expect(result).toEqual(
    'b3e0c6cb5ecf32614eeb2997d905b9c297046d7cbf69062698f25b14b4cb0985'
  )
})

test('getCacheVersion with multiple paths returns version', async () => {
  const paths = ['node_modules', 'dist']
  const result = getCacheVersion(paths, undefined, true)
  expect(result).toEqual(
    '165c3053bc646bf0d4fac17b1f5731caca6fe38e0e464715c0c3c6b6318bf436'
  )
})

test('getCacheVersion with zstd compression returns version', async () => {
  const paths = ['node_modules']
  const result = getCacheVersion(paths, CompressionMethod.Zstd, true)

  expect(result).toEqual(
    '273877e14fd65d270b87a198edbfa2db5a43de567c9a548d2a2505b408befe24'
  )
})

test('getCacheVersion with gzip compression returns version', async () => {
  const paths = ['node_modules']
  const result = getCacheVersion(paths, CompressionMethod.Gzip, true)

  expect(result).toEqual(
    '470e252814dbffc9524891b17cf4e5749b26c1b5026e63dd3f00972db2393117'
  )
})

test('getCacheVersion with enableCrossOsArchive as false returns version on windows', async () => {
  if (process.platform === 'win32') {
    const paths = ['node_modules']
    const result = getCacheVersion(paths)

    expect(result).toEqual(
      '2db19d6596dc34f51f0043120148827a264863f5c6ac857569c2af7119bad14e'
    )
  }
})

test('downloadCache uses http-client for non-Azure URLs', async () => {
  const downloadCacheHttpClientMock = jest.spyOn(
    downloadUtils,
    'downloadCacheHttpClient'
  )
  const downloadCacheStorageSDKMock = jest.spyOn(
    downloadUtils,
    'downloadCacheStorageSDK'
  )

  const archiveLocation = 'http://www.actionscache.test/download'
  const archivePath = '/foo/bar'

  await downloadCache(archiveLocation, archivePath)

  expect(downloadCacheHttpClientMock).toHaveBeenCalledTimes(1)
  expect(downloadCacheHttpClientMock).toHaveBeenCalledWith(
    archiveLocation,
    archivePath
  )

  expect(downloadCacheStorageSDKMock).toHaveBeenCalledTimes(0)
})

test('downloadCache uses storage SDK for Azure storage URLs', async () => {
  const downloadCacheHttpClientMock = jest.spyOn(
    downloadUtils,
    'downloadCacheHttpClient'
  )
  const downloadCacheStorageSDKMock = jest.spyOn(
    downloadUtils,
    'downloadCacheStorageSDK'
  )

  const downloadCacheHttpClientConcurrentMock = jest.spyOn(
    downloadUtils,
    'downloadCacheHttpClientConcurrent'
  )

  const archiveLocation = 'http://foo.blob.core.windows.net/bar/baz'
  const archivePath = '/foo/bar'

  await downloadCache(archiveLocation, archivePath)

  expect(downloadCacheHttpClientConcurrentMock).toHaveBeenCalledTimes(1)
  expect(downloadCacheHttpClientConcurrentMock).toHaveBeenCalledWith(
    archiveLocation,
    archivePath,
    getDownloadOptions()
  )

  expect(downloadCacheStorageSDKMock).toHaveBeenCalledTimes(0)
  expect(downloadCacheHttpClientMock).toHaveBeenCalledTimes(0)
})

test('downloadCache passes options to download methods', async () => {
  const downloadCacheHttpClientMock = jest.spyOn(
    downloadUtils,
    'downloadCacheHttpClient'
  )
  const downloadCacheStorageSDKMock = jest.spyOn(
    downloadUtils,
    'downloadCacheStorageSDK'
  )

  const downloadCacheHttpClientConcurrentMock = jest.spyOn(
    downloadUtils,
    'downloadCacheHttpClientConcurrent'
  )

  const archiveLocation = 'http://foo.blob.core.windows.net/bar/baz'
  const archivePath = '/foo/bar'
  const options: DownloadOptions = {downloadConcurrency: 4}

  await downloadCache(archiveLocation, archivePath, options)

  expect(downloadCacheHttpClientConcurrentMock).toHaveBeenCalledTimes(1)
  expect(downloadCacheHttpClientConcurrentMock).toHaveBeenCalled()
  expect(downloadCacheHttpClientConcurrentMock).toHaveBeenCalledWith(
    archiveLocation,
    archivePath,
    getDownloadOptions(options)
  )

  expect(downloadCacheStorageSDKMock).toHaveBeenCalledTimes(0)
  expect(downloadCacheHttpClientMock).toHaveBeenCalledTimes(0)
})

test('downloadCache uses http-client when overridden', async () => {
  const downloadCacheHttpClientMock = jest.spyOn(
    downloadUtils,
    'downloadCacheHttpClient'
  )
  const downloadCacheStorageSDKMock = jest.spyOn(
    downloadUtils,
    'downloadCacheStorageSDK'
  )

  const archiveLocation = 'http://foo.blob.core.windows.net/bar/baz'
  const archivePath = '/foo/bar'
  const options: DownloadOptions = {
    useAzureSdk: false,
    concurrentBlobDownloads: false
  }

  await downloadCache(archiveLocation, archivePath, options)

  expect(downloadCacheHttpClientMock).toHaveBeenCalledTimes(1)
  expect(downloadCacheHttpClientMock).toHaveBeenCalledWith(
    archiveLocation,
    archivePath
  )

  expect(downloadCacheStorageSDKMock).toHaveBeenCalledTimes(0)
})
@ -0,0 +1,40 @@
import {promises as fs} from 'fs'
import * as path from 'path'
import * as cacheUtils from '../src/internal/cacheUtils'

test('getArchiveFileSizeInBytes returns file size', () => {
  const filePath = path.join(__dirname, '__fixtures__', 'helloWorld.txt')

  const size = cacheUtils.getArchiveFileSizeInBytes(filePath)

  expect(size).toBe(11)
})

test('unlinkFile unlinks file', async () => {
  const testDirectory = await fs.mkdtemp('unlinkFileTest')
  const testFile = path.join(testDirectory, 'test.txt')
  await fs.writeFile(testFile, 'hello world')

  await expect(fs.stat(testFile)).resolves.not.toThrow()

  await cacheUtils.unlinkFile(testFile)

  // This should throw as testFile should not exist
  await expect(fs.stat(testFile)).rejects.toThrow()

  await fs.rmdir(testDirectory)
})

test('assertDefined throws if undefined', () => {
  expect(() => cacheUtils.assertDefined('test', undefined)).toThrowError()
})

test('assertDefined returns value', () => {
  expect(cacheUtils.assertDefined('test', 5)).toBe(5)
})

test('resolvePaths works on github workspace directory', async () => {
  const workspace = process.env['GITHUB_WORKSPACE'] ?? '.'
  const paths = await cacheUtils.resolvePaths([workspace])
  expect(paths.length).toBeGreaterThan(0)
})
@ -0,0 +1,17 @@
#!/bin/sh

# Validate args
prefix="$1"
if [ -z "$prefix" ]; then
  echo "Must supply prefix argument"
  exit 1
fi

path="$2"
if [ -z "$path" ]; then
  echo "Must supply path argument"
  exit 1
fi

mkdir -p "$path"
echo "$prefix $GITHUB_RUN_ID" > "$path/test-file.txt"
@ -0,0 +1,160 @@
import * as core from '@actions/core'
import {DownloadProgress} from '../src/internal/downloadUtils'

test('download progress tracked correctly', () => {
  const progress = new DownloadProgress(1000)

  expect(progress.contentLength).toBe(1000)
  expect(progress.receivedBytes).toBe(0)
  expect(progress.segmentIndex).toBe(0)
  expect(progress.segmentOffset).toBe(0)
  expect(progress.segmentSize).toBe(0)
  expect(progress.displayedComplete).toBe(false)
  expect(progress.timeoutHandle).toBeUndefined()
  expect(progress.getTransferredBytes()).toBe(0)
  expect(progress.isDone()).toBe(false)

  progress.nextSegment(500)

  expect(progress.contentLength).toBe(1000)
  expect(progress.receivedBytes).toBe(0)
  expect(progress.segmentIndex).toBe(1)
  expect(progress.segmentOffset).toBe(0)
  expect(progress.segmentSize).toBe(500)
  expect(progress.displayedComplete).toBe(false)
  expect(progress.timeoutHandle).toBeUndefined()
  expect(progress.getTransferredBytes()).toBe(0)
  expect(progress.isDone()).toBe(false)

  progress.setReceivedBytes(250)

  expect(progress.contentLength).toBe(1000)
  expect(progress.receivedBytes).toBe(250)
  expect(progress.segmentIndex).toBe(1)
  expect(progress.segmentOffset).toBe(0)
  expect(progress.segmentSize).toBe(500)
  expect(progress.displayedComplete).toBe(false)
  expect(progress.timeoutHandle).toBeUndefined()
  expect(progress.getTransferredBytes()).toBe(250)
  expect(progress.isDone()).toBe(false)

  progress.setReceivedBytes(500)

  expect(progress.contentLength).toBe(1000)
  expect(progress.receivedBytes).toBe(500)
  expect(progress.segmentIndex).toBe(1)
  expect(progress.segmentOffset).toBe(0)
  expect(progress.segmentSize).toBe(500)
  expect(progress.displayedComplete).toBe(false)
  expect(progress.timeoutHandle).toBeUndefined()
  expect(progress.getTransferredBytes()).toBe(500)
  expect(progress.isDone()).toBe(false)

  progress.nextSegment(500)

  expect(progress.contentLength).toBe(1000)
  expect(progress.receivedBytes).toBe(0)
  expect(progress.segmentIndex).toBe(2)
  expect(progress.segmentOffset).toBe(500)
  expect(progress.segmentSize).toBe(500)
  expect(progress.displayedComplete).toBe(false)
  expect(progress.timeoutHandle).toBeUndefined()
  expect(progress.getTransferredBytes()).toBe(500)
  expect(progress.isDone()).toBe(false)

  progress.setReceivedBytes(250)

  expect(progress.contentLength).toBe(1000)
  expect(progress.receivedBytes).toBe(250)
  expect(progress.segmentIndex).toBe(2)
  expect(progress.segmentOffset).toBe(500)
  expect(progress.segmentSize).toBe(500)
  expect(progress.displayedComplete).toBe(false)
  expect(progress.timeoutHandle).toBeUndefined()
  expect(progress.getTransferredBytes()).toBe(750)
  expect(progress.isDone()).toBe(false)

  progress.setReceivedBytes(500)

  expect(progress.contentLength).toBe(1000)
  expect(progress.receivedBytes).toBe(500)
  expect(progress.segmentIndex).toBe(2)
  expect(progress.segmentOffset).toBe(500)
  expect(progress.segmentSize).toBe(500)
  expect(progress.displayedComplete).toBe(false)
  expect(progress.timeoutHandle).toBeUndefined()
  expect(progress.getTransferredBytes()).toBe(1000)
  expect(progress.isDone()).toBe(true)
})

test('display timer works correctly', done => {
  const progress = new DownloadProgress(1000)

  const infoMock = jest.spyOn(core, 'info')
  infoMock.mockImplementation(() => {})

  const check = (): void => {
    expect(infoMock).toHaveBeenLastCalledWith(
      expect.stringContaining('Received 500 of 1000')
    )
  }

  // Validate no further updates are displayed after stopping the timer.
  const test2 = (): void => {
    check()
    expect(progress.timeoutHandle).toBeUndefined()
    done()
  }

  // Validate the progress is displayed, stop the timer, and call test2.
  const test1 = (): void => {
    check()

    progress.stopDisplayTimer()
    progress.setReceivedBytes(1000)

    setTimeout(() => test2(), 500)
  }

  // Start the timer, update the received bytes, and call test1.
  const start = (): void => {
    progress.startDisplayTimer(10)
    expect(progress.timeoutHandle).toBeDefined()

    progress.setReceivedBytes(500)

    setTimeout(() => test1(), 500)
  }

  start()
})

test('display does not print completed line twice', () => {
  const progress = new DownloadProgress(1000)

  const infoMock = jest.spyOn(core, 'info')
  infoMock.mockImplementation(() => {})

  progress.display()

  expect(progress.displayedComplete).toBe(false)
  expect(infoMock).toHaveBeenCalledTimes(1)

  progress.nextSegment(1000)
  progress.setReceivedBytes(500)
  progress.display()

  expect(progress.displayedComplete).toBe(false)
  expect(infoMock).toHaveBeenCalledTimes(2)

  progress.setReceivedBytes(1000)
  progress.display()

  expect(progress.displayedComplete).toBe(true)
  expect(infoMock).toHaveBeenCalledTimes(3)

  progress.display()

  expect(progress.displayedComplete).toBe(true)
  expect(infoMock).toHaveBeenCalledTimes(3)
})
@ -0,0 +1,83 @@
import {
  DownloadOptions,
  UploadOptions,
  getDownloadOptions,
  getUploadOptions
} from '../src/options'

const useAzureSdk = false
const concurrentBlobDownloads = true
const downloadConcurrency = 8
const timeoutInMs = 30000
const segmentTimeoutInMs = 600000
const lookupOnly = false
const uploadConcurrency = 4
const uploadChunkSize = 32 * 1024 * 1024

test('getDownloadOptions sets defaults', async () => {
  const actualOptions = getDownloadOptions()

  expect(actualOptions).toEqual({
    useAzureSdk,
    concurrentBlobDownloads,
    downloadConcurrency,
    timeoutInMs,
    segmentTimeoutInMs,
    lookupOnly
  })
})

test('getDownloadOptions overrides all settings', async () => {
  const expectedOptions: DownloadOptions = {
    useAzureSdk: true,
    concurrentBlobDownloads: false,
    downloadConcurrency: 14,
    timeoutInMs: 20000,
    segmentTimeoutInMs: 3600000,
    lookupOnly: true
  }

  const actualOptions = getDownloadOptions(expectedOptions)

  expect(actualOptions).toEqual(expectedOptions)
})

test('getUploadOptions sets defaults', async () => {
  const actualOptions = getUploadOptions()

  expect(actualOptions).toEqual({
    uploadConcurrency,
    uploadChunkSize
  })
})

test('getUploadOptions overrides all settings', async () => {
  const expectedOptions: UploadOptions = {
    uploadConcurrency: 2,
    uploadChunkSize: 16 * 1024 * 1024
  }

  const actualOptions = getUploadOptions(expectedOptions)

  expect(actualOptions).toEqual(expectedOptions)
})

test('getDownloadOptions overrides download timeout minutes', async () => {
  const expectedOptions: DownloadOptions = {
    useAzureSdk: false,
    downloadConcurrency: 14,
    timeoutInMs: 20000,
    segmentTimeoutInMs: 3600000,
    lookupOnly: true
  }
  process.env.SEGMENT_DOWNLOAD_TIMEOUT_MINS = '10'
  const actualOptions = getDownloadOptions(expectedOptions)

  expect(actualOptions.useAzureSdk).toEqual(expectedOptions.useAzureSdk)
  expect(actualOptions.downloadConcurrency).toEqual(
    expectedOptions.downloadConcurrency
  )
  expect(actualOptions.timeoutInMs).toEqual(expectedOptions.timeoutInMs)
  expect(actualOptions.segmentTimeoutInMs).toEqual(600000)
  expect(actualOptions.lookupOnly).toEqual(expectedOptions.lookupOnly)
})
@ -0,0 +1,179 @@
|
||||||
|
import {retry, retryTypedResponse} from '../src/internal/requestUtils'
|
||||||
|
import {HttpClientError} from '@actions/http-client'
|
||||||
|
import * as requestUtils from '../src/internal/requestUtils'
|
||||||
|
|
||||||
|
interface ITestResponse {
|
||||||
|
statusCode: number
|
||||||
|
result: string | null
|
||||||
|
error: Error | null
|
||||||
|
}
|
||||||
|
|
||||||
|
function TestResponse(
|
||||||
|
action: number | Error,
|
||||||
|
result: string | null = null
|
||||||
|
): ITestResponse {
|
||||||
|
if (action instanceof Error) {
|
||||||
|
return {
|
||||||
|
statusCode: -1,
|
||||||
|
result,
|
||||||
|
error: action
|
||||||
|
}
|
||||||
|
} else {
|
||||||
|
return {
|
||||||
|
statusCode: action,
|
||||||
|
result,
|
||||||
|
error: null
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
async function handleResponse(
|
||||||
|
response: ITestResponse | undefined
|
||||||
|
): Promise<ITestResponse> {
|
||||||
|
if (!response) {
|
||||||
|
fail('Retry method called too many times')
|
||||||
|
}
|
||||||
|
|
||||||
|
if (response.error) {
|
||||||
|
throw response.error
|
||||||
|
} else {
|
||||||
|
return Promise.resolve(response)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
async function testRetryExpectingResult(
|
||||||
|
responses: ITestResponse[],
|
||||||
|
expectedResult: string | null
|
||||||
|
): Promise<void> {
|
||||||
|
responses = responses.reverse() // Reverse responses since we pop from end
|
||||||
|
|
||||||
|
const actualResult = await retry(
|
||||||
|
'test',
|
||||||
|
async () => handleResponse(responses.pop()),
|
||||||
|
(response: ITestResponse) => response.statusCode,
|
||||||
|
2, // maxAttempts
|
||||||
|
0 // delay
|
||||||
|
)
|
||||||
|
|
||||||
|
expect(actualResult.result).toEqual(expectedResult)
|
||||||
|
}
|
||||||
|
|
||||||
|
async function testRetryConvertingErrorToResult(
|
||||||
|
responses: ITestResponse[],
|
||||||
|
expectedStatus: number,
|
||||||
|
expectedResult: string | null
|
||||||
|
): Promise<void> {
|
||||||
|
responses = responses.reverse() // Reverse responses since we pop from end
|
||||||
|
|
||||||
|
const actualResult = await retry(
|
||||||
|
'test',
|
||||||
|
async () => handleResponse(responses.pop()),
|
||||||
|
(response: ITestResponse) => response.statusCode,
|
||||||
|
2, // maxAttempts
|
||||||
|
0, // delay
|
||||||
|
(e: Error) => {
|
||||||
|
if (e instanceof HttpClientError) {
|
||||||
|
return {
|
||||||
|
statusCode: e.statusCode,
|
||||||
|
result: null,
|
||||||
|
error: null
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
)
|
||||||
|
|
||||||
|
expect(actualResult.statusCode).toEqual(expectedStatus)
|
||||||
|
expect(actualResult.result).toEqual(expectedResult)
|
||||||
|
}
|
||||||
|
|
||||||
|
async function testRetryExpectingError(
|
||||||
|
responses: ITestResponse[]
|
||||||
|
): Promise<void> {
|
||||||
|
responses = responses.reverse() // Reverse responses since we pop from end
|
||||||
|
|
||||||
|
expect(
|
||||||
|
retry(
|
||||||
|
'test',
|
||||||
|
async () => handleResponse(responses.pop()),
|
||||||
|
(response: ITestResponse) => response.statusCode,
|
||||||
|
2, // maxAttempts,
|
||||||
|
0 // delay
|
||||||
|
)
|
||||||
|
).rejects.toBeInstanceOf(Error)
|
||||||
|
}
|
||||||
|
|
||||||
|
test('retry works on successful response', async () => {
|
||||||
|
await testRetryExpectingResult([TestResponse(200, 'Ok')], 'Ok')
|
||||||
|
})
|
||||||
|
|
||||||
|
test('retry works after retryable status code', async () => {
|
||||||
|
await testRetryExpectingResult(
|
||||||
|
[TestResponse(503), TestResponse(200, 'Ok')],
|
||||||
|
'Ok'
|
||||||
|
)
|
||||||
|
})
|
||||||
|
|
||||||
|
test('retry fails after exhausting retries', async () => {
|
||||||
|
await testRetryExpectingError([
|
||||||
|
TestResponse(503),
|
||||||
|
TestResponse(503),
|
||||||
|
TestResponse(200, 'Ok')
|
||||||
|
])
|
||||||
|
})
|
||||||
|
|
||||||
|
test('retry fails after non-retryable status code', async () => {
|
||||||
|
await testRetryExpectingError([TestResponse(500), TestResponse(200, 'Ok')])
|
||||||
|
})
|
||||||
|
|
||||||
|
test('retry works after error', async () => {
|
||||||
|
await testRetryExpectingResult(
|
||||||
|
[TestResponse(new Error('Test error')), TestResponse(200, 'Ok')],
|
||||||
|
'Ok'
|
||||||
|
)
|
||||||
|
})
|
||||||
|
|
||||||
|
test('retry returns after client error', async () => {
|
||||||
|
await testRetryExpectingResult(
|
||||||
|
[TestResponse(400), TestResponse(200, 'Ok')],
|
||||||
|
null
|
||||||
|
)
|
||||||
|
})
|
||||||
|
|
||||||
|
test('retry converts errors to response object', async () => {
|
||||||
|
await testRetryConvertingErrorToResult(
|
||||||
|
[TestResponse(new HttpClientError('Test error', 409))],
|
||||||
|
409,
|
||||||
|
null
|
||||||
|
)
|
||||||
|
})
|
||||||
|
|
||||||
|
test('retryTypedResponse gives an error with error message', async () => {
|
||||||
|
const httpClientError = new HttpClientError(
|
||||||
|
'The cache filesize must be between 0 and 10 * 1024 * 1024 bytes',
|
||||||
|
400
|
||||||
|
)
|
||||||
|
jest.spyOn(requestUtils, 'retry').mockReturnValue(
|
||||||
|
new Promise(resolve => {
|
||||||
|
resolve(httpClientError)
|
||||||
|
})
|
||||||
|
)
|
||||||
|
try {
|
||||||
|
await retryTypedResponse<string>(
|
||||||
|
'reserveCache',
|
||||||
|
async () =>
|
||||||
|
new Promise(resolve => {
|
||||||
|
resolve({
|
||||||
|
statusCode: 400,
|
||||||
|
result: '',
|
||||||
|
headers: {},
|
||||||
|
error: httpClientError
|
||||||
|
})
|
||||||
|
})
|
||||||
|
)
|
||||||
|
} catch (error) {
|
||||||
|
expect(error).toHaveProperty(
|
||||||
|
'message',
|
||||||
|
'The cache filesize must be between 0 and 10 * 1024 * 1024 bytes'
|
||||||
|
)
|
||||||
|
}
|
||||||
|
})
|
|
@ -0,0 +1,314 @@
|
||||||
|
import * as core from '@actions/core'
|
||||||
|
import * as path from 'path'
|
||||||
|
import {restoreCache} from '../src/cache'
|
||||||
|
import * as cacheHttpClient from '../src/internal/cacheHttpClient'
|
||||||
|
import * as cacheUtils from '../src/internal/cacheUtils'
|
||||||
|
import {CacheFilename, CompressionMethod} from '../src/internal/constants'
|
||||||
|
import {ArtifactCacheEntry} from '../src/internal/contracts'
|
||||||
|
import * as tar from '../src/internal/tar'
|
||||||
|
|
||||||
|
jest.mock('../src/internal/cacheHttpClient')
|
||||||
|
jest.mock('../src/internal/cacheUtils')
|
||||||
|
jest.mock('../src/internal/tar')
|
||||||
|
|
||||||
|
beforeAll(() => {
|
||||||
|
jest.spyOn(console, 'log').mockImplementation(() => {})
|
||||||
|
jest.spyOn(core, 'debug').mockImplementation(() => {})
|
||||||
|
jest.spyOn(core, 'info').mockImplementation(() => {})
|
||||||
|
jest.spyOn(core, 'warning').mockImplementation(() => {})
|
||||||
|
jest.spyOn(core, 'error').mockImplementation(() => {})
|
||||||
|
|
||||||
|
jest.spyOn(cacheUtils, 'getCacheFileName').mockImplementation(cm => {
|
||||||
|
const actualUtils = jest.requireActual('../src/internal/cacheUtils')
|
||||||
|
return actualUtils.getCacheFileName(cm)
|
||||||
|
})
|
||||||
|
})
|
||||||
|
|
||||||
|
test('restore with no path should fail', async () => {
|
||||||
|
const paths: string[] = []
|
||||||
|
const key = 'node-test'
|
||||||
|
await expect(restoreCache(paths, key)).rejects.toThrowError(
|
||||||
|
`Path Validation Error: At least one directory or file path is required`
|
||||||
|
)
|
||||||
|
})
|
||||||
|
|
||||||
|
test('restore with too many keys should fail', async () => {
|
||||||
|
const paths = ['node_modules']
|
||||||
|
const key = 'node-test'
|
||||||
|
const restoreKeys = [...Array(20).keys()].map(x => x.toString())
|
||||||
|
await expect(restoreCache(paths, key, restoreKeys)).rejects.toThrowError(
|
||||||
|
`Key Validation Error: Keys are limited to a maximum of 10.`
|
||||||
|
)
|
||||||
|
})
|
||||||
|
|
||||||
|
test('restore with large key should fail', async () => {
|
||||||
|
const paths = ['node_modules']
|
||||||
|
const key = 'foo'.repeat(512) // Over the 512 character limit
|
||||||
|
await expect(restoreCache(paths, key)).rejects.toThrowError(
|
||||||
|
`Key Validation Error: ${key} cannot be larger than 512 characters.`
|
||||||
|
)
|
||||||
|
})
|
||||||
|
|
||||||
|
test('restore with invalid key should fail', async () => {
|
||||||
|
const paths = ['node_modules']
|
||||||
|
const key = 'comma,comma'
|
||||||
|
await expect(restoreCache(paths, key)).rejects.toThrowError(
|
||||||
|
`Key Validation Error: ${key} cannot contain commas.`
|
||||||
|
)
|
||||||
|
})
|
||||||
|
|
||||||
|
test('restore with no cache found', async () => {
|
||||||
|
const paths = ['node_modules']
|
||||||
|
const key = 'node-test'
|
||||||
|
|
||||||
|
jest.spyOn(cacheHttpClient, 'getCacheEntry').mockImplementation(async () => {
|
||||||
|
return Promise.resolve(null)
|
||||||
|
})
|
||||||
|
|
||||||
|
const cacheKey = await restoreCache(paths, key)
|
||||||
|
|
||||||
|
expect(cacheKey).toBe(undefined)
|
||||||
|
})
|
||||||
|
|
||||||
|
test('restore with server error should fail', async () => {
|
||||||
|
const paths = ['node_modules']
|
||||||
|
const key = 'node-test'
|
||||||
|
const logWarningMock = jest.spyOn(core, 'warning')
|
||||||
|
|
||||||
|
jest.spyOn(cacheHttpClient, 'getCacheEntry').mockImplementation(() => {
|
||||||
|
throw new Error('HTTP Error Occurred')
|
||||||
|
})
|
||||||
|
|
||||||
|
const cacheKey = await restoreCache(paths, key)
|
||||||
|
expect(cacheKey).toBe(undefined)
|
||||||
|
expect(logWarningMock).toHaveBeenCalledTimes(1)
|
||||||
|
expect(logWarningMock).toHaveBeenCalledWith(
|
||||||
|
'Failed to restore: HTTP Error Occurred'
|
||||||
|
)
|
||||||
|
})
|
||||||
|
|
||||||
|
test('restore with restore keys and no cache found', async () => {
|
||||||
|
const paths = ['node_modules']
|
||||||
|
const key = 'node-test'
|
||||||
|
const restoreKey = 'node-'
|
||||||
|
|
||||||
|
jest.spyOn(cacheHttpClient, 'getCacheEntry').mockImplementation(async () => {
|
||||||
|
return Promise.resolve(null)
|
||||||
|
})
|
||||||
|
|
||||||
|
const cacheKey = await restoreCache(paths, key, [restoreKey])
|
||||||
|
|
||||||
|
expect(cacheKey).toBe(undefined)
|
||||||
|
})
|
||||||
|
|
||||||
|
test('restore with gzip compressed cache found', async () => {
|
||||||
|
const paths = ['node_modules']
|
||||||
|
const key = 'node-test'
|
||||||
|
|
||||||
|
const cacheEntry: ArtifactCacheEntry = {
|
||||||
|
cacheKey: key,
|
||||||
|
scope: 'refs/heads/main',
|
||||||
|
archiveLocation: 'www.actionscache.test/download'
|
||||||
|
}
|
||||||
|
const getCacheMock = jest.spyOn(cacheHttpClient, 'getCacheEntry')
|
||||||
|
getCacheMock.mockImplementation(async () => {
|
||||||
|
return Promise.resolve(cacheEntry)
|
||||||
|
})
|
||||||
|
|
||||||
|
const tempPath = '/foo/bar'
|
||||||
|
|
||||||
|
const createTempDirectoryMock = jest.spyOn(cacheUtils, 'createTempDirectory')
|
||||||
|
createTempDirectoryMock.mockImplementation(async () => {
|
||||||
|
return Promise.resolve(tempPath)
|
||||||
|
})
|
||||||
|
|
||||||
|
const archivePath = path.join(tempPath, CacheFilename.Gzip)
|
||||||
|
const downloadCacheMock = jest.spyOn(cacheHttpClient, 'downloadCache')
|
||||||
|
|
||||||
|
const fileSize = 142
|
||||||
|
const getArchiveFileSizeInBytesMock = jest
|
||||||
|
.spyOn(cacheUtils, 'getArchiveFileSizeInBytes')
|
||||||
|
.mockReturnValue(fileSize)
|
||||||
|
|
||||||
|
const extractTarMock = jest.spyOn(tar, 'extractTar')
|
||||||
|
const unlinkFileMock = jest.spyOn(cacheUtils, 'unlinkFile')
|
||||||
|
|
||||||
|
const compression = CompressionMethod.Gzip
|
||||||
|
const getCompressionMock = jest
|
||||||
|
.spyOn(cacheUtils, 'getCompressionMethod')
|
||||||
|
.mockReturnValue(Promise.resolve(compression))
|
||||||
|
|
||||||
|
const cacheKey = await restoreCache(paths, key)
|
||||||
|
|
||||||
|
expect(cacheKey).toBe(key)
|
||||||
|
expect(getCacheMock).toHaveBeenCalledWith([key], paths, {
|
||||||
|
compressionMethod: compression,
|
||||||
|
enableCrossOsArchive: false
|
||||||
|
})
|
||||||
|
expect(createTempDirectoryMock).toHaveBeenCalledTimes(1)
|
||||||
|
expect(downloadCacheMock).toHaveBeenCalledWith(
|
||||||
|
cacheEntry.archiveLocation,
|
||||||
|
archivePath,
|
||||||
|
undefined
|
||||||
|
)
|
||||||
|
expect(getArchiveFileSizeInBytesMock).toHaveBeenCalledWith(archivePath)
|
||||||
|
|
||||||
|
expect(extractTarMock).toHaveBeenCalledTimes(1)
|
||||||
|
expect(extractTarMock).toHaveBeenCalledWith(archivePath, compression)
|
||||||
|
|
||||||
|
expect(unlinkFileMock).toHaveBeenCalledTimes(1)
|
||||||
|
expect(unlinkFileMock).toHaveBeenCalledWith(archivePath)
|
||||||
|
|
||||||
|
expect(getCompressionMock).toHaveBeenCalledTimes(1)
|
||||||
|
})
|
||||||
|
|
||||||
|
test('restore with zstd compressed cache found', async () => {
|
||||||
|
const paths = ['node_modules']
|
||||||
|
const key = 'node-test'
|
||||||
|
|
||||||
|
const infoMock = jest.spyOn(core, 'info')
|
||||||
|
|
||||||
|
const cacheEntry: ArtifactCacheEntry = {
|
||||||
|
cacheKey: key,
|
||||||
|
scope: 'refs/heads/main',
|
||||||
|
archiveLocation: 'www.actionscache.test/download'
|
||||||
|
}
|
||||||
|
const getCacheMock = jest.spyOn(cacheHttpClient, 'getCacheEntry')
|
||||||
|
getCacheMock.mockImplementation(async () => {
|
||||||
|
return Promise.resolve(cacheEntry)
|
||||||
|
})
|
||||||
|
const tempPath = '/foo/bar'
|
||||||
|
|
||||||
|
const createTempDirectoryMock = jest.spyOn(cacheUtils, 'createTempDirectory')
|
||||||
|
createTempDirectoryMock.mockImplementation(async () => {
|
||||||
|
return Promise.resolve(tempPath)
|
||||||
|
})
|
||||||
|
|
||||||
|
const archivePath = path.join(tempPath, CacheFilename.Zstd)
|
||||||
|
const downloadCacheMock = jest.spyOn(cacheHttpClient, 'downloadCache')
|
||||||
|
|
||||||
|
const fileSize = 62915000
|
||||||
|
const getArchiveFileSizeInBytesMock = jest
|
||||||
|
.spyOn(cacheUtils, 'getArchiveFileSizeInBytes')
|
||||||
|
.mockReturnValue(fileSize)
|
||||||
|
|
||||||
|
const extractTarMock = jest.spyOn(tar, 'extractTar')
|
||||||
|
const compression = CompressionMethod.Zstd
|
||||||
|
const getCompressionMock = jest
|
||||||
|
.spyOn(cacheUtils, 'getCompressionMethod')
|
||||||
|
.mockReturnValue(Promise.resolve(compression))
|
||||||
|
|
||||||
|
const cacheKey = await restoreCache(paths, key)
|
||||||
|
|
||||||
|
expect(cacheKey).toBe(key)
|
||||||
|
expect(getCacheMock).toHaveBeenCalledWith([key], paths, {
|
||||||
|
compressionMethod: compression,
|
||||||
|
enableCrossOsArchive: false
|
||||||
|
})
|
||||||
|
expect(createTempDirectoryMock).toHaveBeenCalledTimes(1)
|
||||||
|
expect(downloadCacheMock).toHaveBeenCalledWith(
|
||||||
|
cacheEntry.archiveLocation,
|
||||||
|
archivePath,
|
||||||
|
undefined
|
||||||
|
)
|
||||||
|
expect(getArchiveFileSizeInBytesMock).toHaveBeenCalledWith(archivePath)
|
||||||
|
expect(infoMock).toHaveBeenCalledWith(`Cache Size: ~60 MB (62915000 B)`)
|
||||||
|
|
||||||
|
expect(extractTarMock).toHaveBeenCalledTimes(1)
|
||||||
|
expect(extractTarMock).toHaveBeenCalledWith(archivePath, compression)
|
||||||
|
expect(getCompressionMock).toHaveBeenCalledTimes(1)
|
||||||
|
})
|
||||||
|
|
||||||
|
test('restore with cache found for restore key', async () => {
|
||||||
|
const paths = ['node_modules']
|
||||||
|
const key = 'node-test'
|
||||||
|
const restoreKey = 'node-'
|
||||||
|
|
||||||
|
const infoMock = jest.spyOn(core, 'info')
|
||||||
|
|
||||||
|
const cacheEntry: ArtifactCacheEntry = {
|
||||||
|
cacheKey: restoreKey,
|
||||||
|
scope: 'refs/heads/main',
|
||||||
|
archiveLocation: 'www.actionscache.test/download'
|
||||||
|
}
|
||||||
|
const getCacheMock = jest.spyOn(cacheHttpClient, 'getCacheEntry')
|
||||||
|
getCacheMock.mockImplementation(async () => {
|
||||||
|
return Promise.resolve(cacheEntry)
|
||||||
|
})
|
||||||
|
const tempPath = '/foo/bar'
|
||||||
|
|
||||||
|
const createTempDirectoryMock = jest.spyOn(cacheUtils, 'createTempDirectory')
|
||||||
|
createTempDirectoryMock.mockImplementation(async () => {
|
||||||
|
return Promise.resolve(tempPath)
|
||||||
|
})
|
||||||
|
|
||||||
|
const archivePath = path.join(tempPath, CacheFilename.Zstd)
|
||||||
|
const downloadCacheMock = jest.spyOn(cacheHttpClient, 'downloadCache')
|
||||||
|
|
||||||
|
const fileSize = 142
|
||||||
|
const getArchiveFileSizeInBytesMock = jest
|
||||||
|
.spyOn(cacheUtils, 'getArchiveFileSizeInBytes')
|
||||||
|
.mockReturnValue(fileSize)
|
||||||
|
|
||||||
|
const extractTarMock = jest.spyOn(tar, 'extractTar')
|
||||||
|
const compression = CompressionMethod.Zstd
|
||||||
|
const getCompressionMock = jest
|
||||||
|
.spyOn(cacheUtils, 'getCompressionMethod')
|
||||||
|
.mockReturnValue(Promise.resolve(compression))
|
||||||
|
|
||||||
|
const cacheKey = await restoreCache(paths, key, [restoreKey])
|
||||||
|
|
||||||
|
expect(cacheKey).toBe(restoreKey)
|
||||||
|
expect(getCacheMock).toHaveBeenCalledWith([key, restoreKey], paths, {
|
||||||
|
compressionMethod: compression,
|
||||||
|
enableCrossOsArchive: false
|
||||||
|
})
|
||||||
|
expect(createTempDirectoryMock).toHaveBeenCalledTimes(1)
|
||||||
|
expect(downloadCacheMock).toHaveBeenCalledWith(
|
||||||
|
cacheEntry.archiveLocation,
|
||||||
|
archivePath,
|
||||||
|
undefined
|
||||||
|
)
|
||||||
|
expect(getArchiveFileSizeInBytesMock).toHaveBeenCalledWith(archivePath)
|
||||||
|
expect(infoMock).toHaveBeenCalledWith(`Cache Size: ~0 MB (142 B)`)
|
||||||
|
|
||||||
|
expect(extractTarMock).toHaveBeenCalledTimes(1)
|
||||||
|
expect(extractTarMock).toHaveBeenCalledWith(archivePath, compression)
|
||||||
|
expect(getCompressionMock).toHaveBeenCalledTimes(1)
|
||||||
|
})
|
||||||
|
|
||||||
|
test('restore with dry run', async () => {
|
||||||
|
const paths = ['node_modules']
|
||||||
|
const key = 'node-test'
|
||||||
|
const options = {lookupOnly: true}
|
||||||
|
|
||||||
|
const cacheEntry: ArtifactCacheEntry = {
|
||||||
|
cacheKey: key,
|
||||||
|
scope: 'refs/heads/main',
|
||||||
|
archiveLocation: 'www.actionscache.test/download'
|
||||||
|
}
|
||||||
|
const getCacheMock = jest.spyOn(cacheHttpClient, 'getCacheEntry')
|
||||||
|
getCacheMock.mockImplementation(async () => {
|
||||||
|
return Promise.resolve(cacheEntry)
|
||||||
|
})
|
||||||
|
|
||||||
|
const createTempDirectoryMock = jest.spyOn(cacheUtils, 'createTempDirectory')
|
||||||
|
const downloadCacheMock = jest.spyOn(cacheHttpClient, 'downloadCache')
|
||||||
|
|
||||||
|
const compression = CompressionMethod.Gzip
|
||||||
|
const getCompressionMock = jest
|
||||||
|
.spyOn(cacheUtils, 'getCompressionMethod')
|
||||||
|
.mockReturnValue(Promise.resolve(compression))
|
||||||
|
|
||||||
|
const cacheKey = await restoreCache(paths, key, undefined, options)
|
||||||
|
|
||||||
|
expect(cacheKey).toBe(key)
|
||||||
|
expect(getCompressionMock).toHaveBeenCalledTimes(1)
|
||||||
|
expect(getCacheMock).toHaveBeenCalledWith([key], paths, {
|
||||||
|
compressionMethod: compression,
|
||||||
|
enableCrossOsArchive: false
|
||||||
|
})
|
||||||
|
// creating a tempDir and downloading the cache are skipped
|
||||||
|
expect(createTempDirectoryMock).toHaveBeenCalledTimes(0)
|
||||||
|
expect(downloadCacheMock).toHaveBeenCalledTimes(0)
|
||||||
|
})
|
|
@ -0,0 +1,329 @@
|
||||||
|
import * as core from '@actions/core'
|
||||||
|
import * as path from 'path'
|
||||||
|
import {saveCache} from '../src/cache'
|
||||||
|
import * as cacheHttpClient from '../src/internal/cacheHttpClient'
|
||||||
|
import * as cacheUtils from '../src/internal/cacheUtils'
|
||||||
|
import {CacheFilename, CompressionMethod} from '../src/internal/constants'
|
||||||
|
import * as tar from '../src/internal/tar'
|
||||||
|
import {TypedResponse} from '@actions/http-client/lib/interfaces'
|
||||||
|
import {
|
||||||
|
ReserveCacheResponse,
|
||||||
|
ITypedResponseWithError
|
||||||
|
} from '../src/internal/contracts'
|
||||||
|
import {HttpClientError} from '@actions/http-client'
|
||||||
|
|
||||||
|
jest.mock('../src/internal/cacheHttpClient')
|
||||||
|
jest.mock('../src/internal/cacheUtils')
|
||||||
|
jest.mock('../src/internal/tar')
|
||||||
|
|
||||||
|
beforeAll(() => {
|
||||||
|
jest.spyOn(console, 'log').mockImplementation(() => {})
|
||||||
|
jest.spyOn(core, 'debug').mockImplementation(() => {})
|
||||||
|
jest.spyOn(core, 'info').mockImplementation(() => {})
|
||||||
|
jest.spyOn(core, 'warning').mockImplementation(() => {})
|
||||||
|
jest.spyOn(core, 'error').mockImplementation(() => {})
|
||||||
|
jest.spyOn(cacheUtils, 'getCacheFileName').mockImplementation(cm => {
|
||||||
|
const actualUtils = jest.requireActual('../src/internal/cacheUtils')
|
||||||
|
return actualUtils.getCacheFileName(cm)
|
||||||
|
})
|
||||||
|
jest.spyOn(cacheUtils, 'resolvePaths').mockImplementation(async filePaths => {
|
||||||
|
return filePaths.map(x => path.resolve(x))
|
||||||
|
})
|
||||||
|
jest.spyOn(cacheUtils, 'createTempDirectory').mockImplementation(async () => {
|
||||||
|
return Promise.resolve('/foo/bar')
|
||||||
|
})
|
||||||
|
})
|
||||||
|
|
||||||
|
test('save with missing input should fail', async () => {
|
||||||
|
const paths: string[] = []
|
||||||
|
const primaryKey = 'Linux-node-bb828da54c148048dd17899ba9fda624811cfb43'
|
||||||
|
await expect(saveCache(paths, primaryKey)).rejects.toThrowError(
|
||||||
|
`Path Validation Error: At least one directory or file path is required`
|
||||||
|
)
|
||||||
|
})
|
||||||
|
|
||||||
|
test('save with large cache outputs should fail', async () => {
|
||||||
|
const filePath = 'node_modules'
|
||||||
|
const primaryKey = 'Linux-node-bb828da54c148048dd17899ba9fda624811cfb43'
|
||||||
|
const cachePaths = [path.resolve(filePath)]
|
||||||
|
|
||||||
|
const createTarMock = jest.spyOn(tar, 'createTar')
|
||||||
|
const logWarningMock = jest.spyOn(core, 'warning')
|
||||||
|
|
||||||
|
const cacheSize = 11 * 1024 * 1024 * 1024 //~11GB, over the 10GB limit
|
||||||
|
jest
|
||||||
|
.spyOn(cacheUtils, 'getArchiveFileSizeInBytes')
|
||||||
|
.mockReturnValueOnce(cacheSize)
|
||||||
|
const compression = CompressionMethod.Gzip
|
||||||
|
const getCompressionMock = jest
|
||||||
|
.spyOn(cacheUtils, 'getCompressionMethod')
|
||||||
|
.mockReturnValueOnce(Promise.resolve(compression))
|
||||||
|
|
||||||
|
const cacheId = await saveCache([filePath], primaryKey)
|
||||||
|
expect(cacheId).toBe(-1)
|
||||||
|
expect(logWarningMock).toHaveBeenCalledTimes(1)
|
||||||
|
expect(logWarningMock).toHaveBeenCalledWith(
|
||||||
|
'Failed to save: Cache size of ~11264 MB (11811160064 B) is over the 10GB limit, not saving cache.'
|
||||||
|
)
|
||||||
|
|
||||||
|
const archiveFolder = '/foo/bar'
|
||||||
|
|
||||||
|
expect(createTarMock).toHaveBeenCalledTimes(1)
|
||||||
|
expect(createTarMock).toHaveBeenCalledWith(
|
||||||
|
archiveFolder,
|
||||||
|
cachePaths,
|
||||||
|
compression
|
||||||
|
)
|
||||||
|
expect(getCompressionMock).toHaveBeenCalledTimes(1)
|
||||||
|
})
|
||||||
|
|
||||||
|
test('save with large cache outputs should fail in GHES with error message', async () => {
|
||||||
|
const filePath = 'node_modules'
|
||||||
|
const primaryKey = 'Linux-node-bb828da54c148048dd17899ba9fda624811cfb43'
|
||||||
|
const cachePaths = [path.resolve(filePath)]
|
||||||
|
|
||||||
|
const createTarMock = jest.spyOn(tar, 'createTar')
|
||||||
|
const logWarningMock = jest.spyOn(core, 'warning')
|
||||||
|
|
||||||
|
const cacheSize = 11 * 1024 * 1024 * 1024 //~11GB, over the 10GB limit
|
||||||
|
jest
|
||||||
|
.spyOn(cacheUtils, 'getArchiveFileSizeInBytes')
|
||||||
|
.mockReturnValueOnce(cacheSize)
|
||||||
|
const compression = CompressionMethod.Gzip
|
||||||
|
const getCompressionMock = jest
|
||||||
|
.spyOn(cacheUtils, 'getCompressionMethod')
|
||||||
|
.mockReturnValueOnce(Promise.resolve(compression))
|
||||||
|
|
||||||
|
jest.spyOn(cacheUtils, 'isGhes').mockReturnValueOnce(true)
|
||||||
|
|
||||||
|
const reserveCacheMock = jest
|
||||||
|
.spyOn(cacheHttpClient, 'reserveCache')
|
||||||
|
.mockImplementation(async () => {
|
||||||
|
const response: ITypedResponseWithError<ReserveCacheResponse> = {
|
||||||
|
statusCode: 400,
|
||||||
|
result: null,
|
||||||
|
headers: {},
|
||||||
|
error: new HttpClientError(
|
||||||
|
'The cache filesize must be between 0 and 1073741824 bytes',
|
||||||
|
400
|
||||||
|
)
|
||||||
|
}
|
||||||
|
return response
|
||||||
|
})
|
||||||
|
|
||||||
|
const cacheId = await saveCache([filePath], primaryKey)
|
||||||
|
expect(cacheId).toBe(-1)
|
||||||
|
expect(logWarningMock).toHaveBeenCalledTimes(1)
|
||||||
|
expect(logWarningMock).toHaveBeenCalledWith(
|
||||||
|
'Failed to save: The cache filesize must be between 0 and 1073741824 bytes'
|
||||||
|
)
|
||||||
|
|
||||||
|
const archiveFolder = '/foo/bar'
|
||||||
|
expect(reserveCacheMock).toHaveBeenCalledTimes(1)
|
||||||
|
expect(createTarMock).toHaveBeenCalledTimes(1)
|
||||||
|
expect(createTarMock).toHaveBeenCalledWith(
|
||||||
|
archiveFolder,
|
||||||
|
cachePaths,
|
||||||
|
compression
|
||||||
|
)
|
||||||
|
expect(getCompressionMock).toHaveBeenCalledTimes(1)
|
||||||
|
})
|
||||||
|
|
||||||
|
test('save with large cache outputs should fail in GHES without error message', async () => {
  const filePath = 'node_modules'
  const primaryKey = 'Linux-node-bb828da54c148048dd17899ba9fda624811cfb43'
  const cachePaths = [path.resolve(filePath)]

  const createTarMock = jest.spyOn(tar, 'createTar')
  const logWarningMock = jest.spyOn(core, 'warning')

  const cacheSize = 11 * 1024 * 1024 * 1024 //~11GB, over the 10GB limit
  jest
    .spyOn(cacheUtils, 'getArchiveFileSizeInBytes')
    .mockReturnValueOnce(cacheSize)
  const compression = CompressionMethod.Gzip
  const getCompressionMock = jest
    .spyOn(cacheUtils, 'getCompressionMethod')
    .mockReturnValueOnce(Promise.resolve(compression))

  jest.spyOn(cacheUtils, 'isGhes').mockReturnValueOnce(true)

  const reserveCacheMock = jest
    .spyOn(cacheHttpClient, 'reserveCache')
    .mockImplementation(async () => {
      const response: ITypedResponseWithError<ReserveCacheResponse> = {
        statusCode: 400,
        result: null,
        headers: {}
      }
      return response
    })

  const cacheId = await saveCache([filePath], primaryKey)
  expect(cacheId).toBe(-1)
  expect(logWarningMock).toHaveBeenCalledTimes(1)
  expect(logWarningMock).toHaveBeenCalledWith(
    'Failed to save: Cache size of ~11264 MB (11811160064 B) is over the data cap limit, not saving cache.'
  )

  const archiveFolder = '/foo/bar'
  expect(reserveCacheMock).toHaveBeenCalledTimes(1)
  expect(createTarMock).toHaveBeenCalledTimes(1)
  expect(createTarMock).toHaveBeenCalledWith(
    archiveFolder,
    cachePaths,
    compression
  )
  expect(getCompressionMock).toHaveBeenCalledTimes(1)
})

test('save with reserve cache failure should fail', async () => {
  const paths = ['node_modules']
  const primaryKey = 'Linux-node-bb828da54c148048dd17899ba9fda624811cfb43'
  const logInfoMock = jest.spyOn(core, 'info')

  const reserveCacheMock = jest
    .spyOn(cacheHttpClient, 'reserveCache')
    .mockImplementation(async () => {
      const response: TypedResponse<ReserveCacheResponse> = {
        statusCode: 500,
        result: null,
        headers: {}
      }
      return response
    })

  const createTarMock = jest.spyOn(tar, 'createTar')
  const saveCacheMock = jest.spyOn(cacheHttpClient, 'saveCache')
  const compression = CompressionMethod.Zstd
  const getCompressionMock = jest
    .spyOn(cacheUtils, 'getCompressionMethod')
    .mockReturnValueOnce(Promise.resolve(compression))

  const cacheId = await saveCache(paths, primaryKey)
  expect(cacheId).toBe(-1)
  expect(logInfoMock).toHaveBeenCalledTimes(1)
  expect(logInfoMock).toHaveBeenCalledWith(
    `Failed to save: Unable to reserve cache with key ${primaryKey}, another job may be creating this cache. More details: undefined`
  )

  expect(reserveCacheMock).toHaveBeenCalledTimes(1)
  expect(reserveCacheMock).toHaveBeenCalledWith(primaryKey, paths, {
    cacheSize: undefined,
    compressionMethod: compression,
    enableCrossOsArchive: false
  })
  expect(createTarMock).toHaveBeenCalledTimes(1)
  expect(saveCacheMock).toHaveBeenCalledTimes(0)
  expect(getCompressionMock).toHaveBeenCalledTimes(1)
})

test('save with server error should fail', async () => {
  const filePath = 'node_modules'
  const primaryKey = 'Linux-node-bb828da54c148048dd17899ba9fda624811cfb43'
  const cachePaths = [path.resolve(filePath)]
  const logWarningMock = jest.spyOn(core, 'warning')
  const cacheId = 4
  const reserveCacheMock = jest
    .spyOn(cacheHttpClient, 'reserveCache')
    .mockImplementation(async () => {
      const response: TypedResponse<ReserveCacheResponse> = {
        statusCode: 500,
        result: {cacheId},
        headers: {}
      }
      return response
    })

  const createTarMock = jest.spyOn(tar, 'createTar')

  const saveCacheMock = jest
    .spyOn(cacheHttpClient, 'saveCache')
    .mockImplementationOnce(() => {
      throw new Error('HTTP Error Occurred')
    })
  const compression = CompressionMethod.Zstd
  const getCompressionMock = jest
    .spyOn(cacheUtils, 'getCompressionMethod')
    .mockReturnValueOnce(Promise.resolve(compression))

  await saveCache([filePath], primaryKey)
  expect(logWarningMock).toHaveBeenCalledTimes(1)
  expect(logWarningMock).toHaveBeenCalledWith(
    'Failed to save: HTTP Error Occurred'
  )

  expect(reserveCacheMock).toHaveBeenCalledTimes(1)
  expect(reserveCacheMock).toHaveBeenCalledWith(primaryKey, [filePath], {
    cacheSize: undefined,
    compressionMethod: compression,
    enableCrossOsArchive: false
  })
  const archiveFolder = '/foo/bar'
  const archiveFile = path.join(archiveFolder, CacheFilename.Zstd)
  expect(createTarMock).toHaveBeenCalledTimes(1)
  expect(createTarMock).toHaveBeenCalledWith(
    archiveFolder,
    cachePaths,
    compression
  )
  expect(saveCacheMock).toHaveBeenCalledTimes(1)
  expect(saveCacheMock).toHaveBeenCalledWith(cacheId, archiveFile, undefined)
  expect(getCompressionMock).toHaveBeenCalledTimes(1)
})

test('save with valid inputs uploads a cache', async () => {
  const filePath = 'node_modules'
  const primaryKey = 'Linux-node-bb828da54c148048dd17899ba9fda624811cfb43'
  const cachePaths = [path.resolve(filePath)]

  const cacheId = 4
  const reserveCacheMock = jest
    .spyOn(cacheHttpClient, 'reserveCache')
    .mockImplementation(async () => {
      const response: TypedResponse<ReserveCacheResponse> = {
        statusCode: 500,
        result: {cacheId},
        headers: {}
      }
      return response
    })
  const createTarMock = jest.spyOn(tar, 'createTar')

  const saveCacheMock = jest.spyOn(cacheHttpClient, 'saveCache')
  const compression = CompressionMethod.Zstd
  const getCompressionMock = jest
    .spyOn(cacheUtils, 'getCompressionMethod')
    .mockReturnValue(Promise.resolve(compression))

  await saveCache([filePath], primaryKey)

  expect(reserveCacheMock).toHaveBeenCalledTimes(1)
  expect(reserveCacheMock).toHaveBeenCalledWith(primaryKey, [filePath], {
    cacheSize: undefined,
    compressionMethod: compression,
    enableCrossOsArchive: false
  })
  const archiveFolder = '/foo/bar'
  const archiveFile = path.join(archiveFolder, CacheFilename.Zstd)
  expect(createTarMock).toHaveBeenCalledTimes(1)
  expect(createTarMock).toHaveBeenCalledWith(
    archiveFolder,
    cachePaths,
    compression
  )
  expect(saveCacheMock).toHaveBeenCalledTimes(1)
  expect(saveCacheMock).toHaveBeenCalledWith(cacheId, archiveFile, undefined)
  expect(getCompressionMock).toHaveBeenCalledTimes(1)
})

test('save with non existing path should not save cache', async () => {
  const path = 'node_modules'
  const primaryKey = 'Linux-node-bb828da54c148048dd17899ba9fda624811cfb43'
  jest.spyOn(cacheUtils, 'resolvePaths').mockImplementation(async () => {
    return []
  })
  await expect(saveCache([path], primaryKey)).rejects.toThrowError(
    `Path Validation Error: Path(s) specified in the action for caching do(es) not exist, hence no cache is being saved.`
  )
})
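
// Taken together, the failure cases above pin down the save contract exercised here:
// oversized archives and failed reservations make saveCache resolve with -1 (logging a
// warning, or an info message for a plain reservation miss), an error thrown during the
// upload itself is caught and logged as a warning, and only path validation rejects.
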
@@ -0,0 +1,482 @@
import * as exec from '@actions/exec'
import * as io from '@actions/io'
import * as path from 'path'
import {
  CacheFilename,
  CompressionMethod,
  GnuTarPathOnWindows,
  ManifestFilename,
  SystemTarPathOnWindows,
  TarFilename
} from '../src/internal/constants'
import * as tar from '../src/internal/tar'
import * as utils from '../src/internal/cacheUtils'
// eslint-disable-next-line @typescript-eslint/no-require-imports
import fs = require('fs')

jest.mock('@actions/exec')
jest.mock('@actions/io')

const IS_WINDOWS = process.platform === 'win32'
const IS_MAC = process.platform === 'darwin'

const defaultTarPath = IS_MAC ? 'gtar' : 'tar'

const defaultEnv = {MSYS: 'winsymlinks:nativestrict'}

function getTempDir(): string {
  return path.join(__dirname, '_temp', 'tar')
}

beforeAll(async () => {
  jest.spyOn(io, 'which').mockImplementation(async tool => {
    return tool
  })

  process.env['GITHUB_WORKSPACE'] = process.cwd()
  await jest.requireActual('@actions/io').rmRF(getTempDir())
})

beforeEach(async () => {
  jest.restoreAllMocks()
})

afterAll(async () => {
  delete process.env['GITHUB_WORKSPACE']
  await jest.requireActual('@actions/io').rmRF(getTempDir())
})

test('zstd extract tar', async () => {
  const mkdirMock = jest.spyOn(io, 'mkdirP')
  const execMock = jest.spyOn(exec, 'exec')

  const archivePath = IS_WINDOWS
    ? `${process.env['windir']}\\fakepath\\cache.tar`
    : 'cache.tar'
  const workspace = process.env['GITHUB_WORKSPACE']
  const tarPath = IS_WINDOWS ? GnuTarPathOnWindows : defaultTarPath

  await tar.extractTar(archivePath, CompressionMethod.Zstd)

  expect(mkdirMock).toHaveBeenCalledWith(workspace)
  expect(execMock).toHaveBeenCalledTimes(1)
  expect(execMock).toHaveBeenCalledWith(
    [
      `"${tarPath}"`,
      '-xf',
      IS_WINDOWS ? archivePath.replace(/\\/g, '/') : archivePath,
      '-P',
      '-C',
      IS_WINDOWS ? workspace?.replace(/\\/g, '/') : workspace
    ]
      .concat(IS_WINDOWS ? ['--force-local'] : [])
      .concat(IS_MAC ? ['--delay-directory-restore'] : [])
      .concat([
        '--use-compress-program',
        IS_WINDOWS ? '"zstd -d --long=30"' : 'unzstd --long=30'
      ])
      .join(' '),
    undefined,
    {
      cwd: undefined,
      env: expect.objectContaining(defaultEnv)
    }
  )
})

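// On a non-Windows runner the pieces above join into a single command string along
// the lines of:
//   "tar" -xf cache.tar -P -C <GITHUB_WORKSPACE> --use-compress-program unzstd --long=30
// (the tar binary, path rewriting and extra flags vary by platform, as the branches show).
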
test('zstd extract tar with windows BSDtar', async () => {
  if (IS_WINDOWS) {
    const mkdirMock = jest.spyOn(io, 'mkdirP')
    const execMock = jest.spyOn(exec, 'exec')
    jest
      .spyOn(utils, 'getGnuTarPathOnWindows')
      .mockReturnValue(Promise.resolve(''))

    const archivePath = `${process.env['windir']}\\fakepath\\cache.tar`
    const workspace = process.env['GITHUB_WORKSPACE']
    const tarPath = SystemTarPathOnWindows

    await tar.extractTar(archivePath, CompressionMethod.Zstd)

    expect(mkdirMock).toHaveBeenCalledWith(workspace)
    expect(execMock).toHaveBeenCalledTimes(2)

    expect(execMock).toHaveBeenNthCalledWith(
      1,
      [
        'zstd -d --long=30 --force -o',
        TarFilename.replace(new RegExp(`\\${path.sep}`, 'g'), '/'),
        archivePath.replace(new RegExp(`\\${path.sep}`, 'g'), '/')
      ].join(' '),
      undefined,
      {
        cwd: undefined,
        env: expect.objectContaining(defaultEnv)
      }
    )

    expect(execMock).toHaveBeenNthCalledWith(
      2,
      [
        `"${tarPath}"`,
        '-xf',
        TarFilename.replace(new RegExp(`\\${path.sep}`, 'g'), '/'),
        '-P',
        '-C',
        workspace?.replace(/\\/g, '/')
      ].join(' '),
      undefined,
      {
        cwd: undefined,
        env: expect.objectContaining(defaultEnv)
      }
    )
  }
})

test('gzip extract tar', async () => {
  const mkdirMock = jest.spyOn(io, 'mkdirP')
  const execMock = jest.spyOn(exec, 'exec')
  const archivePath = IS_WINDOWS
    ? `${process.env['windir']}\\fakepath\\cache.tar`
    : 'cache.tar'
  const workspace = process.env['GITHUB_WORKSPACE']

  await tar.extractTar(archivePath, CompressionMethod.Gzip)

  expect(mkdirMock).toHaveBeenCalledWith(workspace)
  const tarPath = IS_WINDOWS ? GnuTarPathOnWindows : defaultTarPath
  expect(execMock).toHaveBeenCalledTimes(1)
  expect(execMock).toHaveBeenCalledWith(
    [
      `"${tarPath}"`,
      '-xf',
      IS_WINDOWS ? archivePath.replace(/\\/g, '/') : archivePath,
      '-P',
      '-C',
      IS_WINDOWS ? workspace?.replace(/\\/g, '/') : workspace
    ]
      .concat(IS_WINDOWS ? ['--force-local'] : [])
      .concat(IS_MAC ? ['--delay-directory-restore'] : [])
      .concat(['-z'])
      .join(' '),
    undefined,
    {
      cwd: undefined,
      env: expect.objectContaining(defaultEnv)
    }
  )
})

test('gzip extract GNU tar on windows with GNUtar in path', async () => {
  if (IS_WINDOWS) {
    // GNU tar present in path but not at default location
    jest
      .spyOn(utils, 'getGnuTarPathOnWindows')
      .mockReturnValue(Promise.resolve('tar'))
    const execMock = jest.spyOn(exec, 'exec')
    const archivePath = `${process.env['windir']}\\fakepath\\cache.tar`
    const workspace = process.env['GITHUB_WORKSPACE']

    await tar.extractTar(archivePath, CompressionMethod.Gzip)

    expect(execMock).toHaveBeenCalledTimes(1)
    expect(execMock).toHaveBeenCalledWith(
      [
        `"tar"`,
        '-xf',
        archivePath.replace(/\\/g, '/'),
        '-P',
        '-C',
        workspace?.replace(/\\/g, '/'),
        '--force-local',
        '-z'
      ].join(' '),
      undefined,
      {
        cwd: undefined,
        env: expect.objectContaining(defaultEnv)
      }
    )
  }
})

test('zstd create tar', async () => {
  const execMock = jest.spyOn(exec, 'exec')

  const archiveFolder = getTempDir()
  const workspace = process.env['GITHUB_WORKSPACE']
  const sourceDirectories = ['~/.npm/cache', `${workspace}/dist`]

  await fs.promises.mkdir(archiveFolder, {recursive: true})

  await tar.createTar(archiveFolder, sourceDirectories, CompressionMethod.Zstd)

  const tarPath = IS_WINDOWS ? GnuTarPathOnWindows : defaultTarPath

  expect(execMock).toHaveBeenCalledTimes(1)
  expect(execMock).toHaveBeenCalledWith(
    [
      `"${tarPath}"`,
      '--posix',
      '-cf',
      IS_WINDOWS ? CacheFilename.Zstd.replace(/\\/g, '/') : CacheFilename.Zstd,
      '--exclude',
      IS_WINDOWS ? CacheFilename.Zstd.replace(/\\/g, '/') : CacheFilename.Zstd,
      '-P',
      '-C',
      IS_WINDOWS ? workspace?.replace(/\\/g, '/') : workspace,
      '--files-from',
      ManifestFilename
    ]
      .concat(IS_WINDOWS ? ['--force-local'] : [])
      .concat(IS_MAC ? ['--delay-directory-restore'] : [])
      .concat([
        '--use-compress-program',
        IS_WINDOWS ? '"zstd -T0 --long=30"' : 'zstdmt --long=30'
      ])
      .join(' '),
    undefined, // args
    {
      cwd: archiveFolder,
      env: expect.objectContaining(defaultEnv)
    }
  )
})

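// For the non-Windows zstd case the expected invocation therefore joins into roughly:
//   "tar" --posix -cf <CacheFilename.Zstd> --exclude <CacheFilename.Zstd> -P
//     -C <GITHUB_WORKSPACE> --files-from <ManifestFilename>
//     --use-compress-program zstdmt --long=30
// executed with cwd set to the archive folder, so the archive and manifest live there.
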
test('zstd create tar with windows BSDtar', async () => {
  if (IS_WINDOWS) {
    const execMock = jest.spyOn(exec, 'exec')
    jest
      .spyOn(utils, 'getGnuTarPathOnWindows')
      .mockReturnValue(Promise.resolve(''))

    const archiveFolder = getTempDir()
    const workspace = process.env['GITHUB_WORKSPACE']
    const sourceDirectories = ['~/.npm/cache', `${workspace}/dist`]

    await fs.promises.mkdir(archiveFolder, {recursive: true})

    await tar.createTar(
      archiveFolder,
      sourceDirectories,
      CompressionMethod.Zstd
    )

    const tarPath = SystemTarPathOnWindows

    expect(execMock).toHaveBeenCalledTimes(2)

    expect(execMock).toHaveBeenNthCalledWith(
      1,
      [
        `"${tarPath}"`,
        '--posix',
        '-cf',
        TarFilename.replace(/\\/g, '/'),
        '--exclude',
        TarFilename.replace(/\\/g, '/'),
        '-P',
        '-C',
        workspace?.replace(/\\/g, '/'),
        '--files-from',
        ManifestFilename
      ].join(' '),
      undefined, // args
      {
        cwd: archiveFolder,
        env: expect.objectContaining(defaultEnv)
      }
    )

    expect(execMock).toHaveBeenNthCalledWith(
      2,
      [
        'zstd -T0 --long=30 --force -o',
        CacheFilename.Zstd.replace(/\\/g, '/'),
        TarFilename.replace(/\\/g, '/')
      ].join(' '),
      undefined, // args
      {
        cwd: archiveFolder,
        env: expect.objectContaining(defaultEnv)
      }
    )
  }
})

test('gzip create tar', async () => {
  const execMock = jest.spyOn(exec, 'exec')

  const archiveFolder = getTempDir()
  const workspace = process.env['GITHUB_WORKSPACE']
  const sourceDirectories = ['~/.npm/cache', `${workspace}/dist`]

  await fs.promises.mkdir(archiveFolder, {recursive: true})

  await tar.createTar(archiveFolder, sourceDirectories, CompressionMethod.Gzip)

  const tarPath = IS_WINDOWS ? GnuTarPathOnWindows : defaultTarPath

  expect(execMock).toHaveBeenCalledTimes(1)
  expect(execMock).toHaveBeenCalledWith(
    [
      `"${tarPath}"`,
      '--posix',
      '-cf',
      IS_WINDOWS ? CacheFilename.Gzip.replace(/\\/g, '/') : CacheFilename.Gzip,
      '--exclude',
      IS_WINDOWS ? CacheFilename.Gzip.replace(/\\/g, '/') : CacheFilename.Gzip,
      '-P',
      '-C',
      IS_WINDOWS ? workspace?.replace(/\\/g, '/') : workspace,
      '--files-from',
      ManifestFilename
    ]
      .concat(IS_WINDOWS ? ['--force-local'] : [])
      .concat(IS_MAC ? ['--delay-directory-restore'] : [])
      .concat(['-z'])
      .join(' '),
    undefined, // args
    {
      cwd: archiveFolder,
      env: expect.objectContaining(defaultEnv)
    }
  )
})

test('zstd list tar', async () => {
  const execMock = jest.spyOn(exec, 'exec')

  const archivePath = IS_WINDOWS
    ? `${process.env['windir']}\\fakepath\\cache.tar`
    : 'cache.tar'

  await tar.listTar(archivePath, CompressionMethod.Zstd)

  const tarPath = IS_WINDOWS ? GnuTarPathOnWindows : defaultTarPath
  expect(execMock).toHaveBeenCalledTimes(1)
  expect(execMock).toHaveBeenCalledWith(
    [
      `"${tarPath}"`,
      '-tf',
      IS_WINDOWS ? archivePath.replace(/\\/g, '/') : archivePath,
      '-P'
    ]
      .concat(IS_WINDOWS ? ['--force-local'] : [])
      .concat(IS_MAC ? ['--delay-directory-restore'] : [])
      .concat([
        '--use-compress-program',
        IS_WINDOWS ? '"zstd -d --long=30"' : 'unzstd --long=30'
      ])
      .join(' '),
    undefined,
    {
      cwd: undefined,
      env: expect.objectContaining(defaultEnv)
    }
  )
})

test('zstd list tar with windows BSDtar', async () => {
  if (IS_WINDOWS) {
    const execMock = jest.spyOn(exec, 'exec')
    jest
      .spyOn(utils, 'getGnuTarPathOnWindows')
      .mockReturnValue(Promise.resolve(''))
    const archivePath = `${process.env['windir']}\\fakepath\\cache.tar`

    await tar.listTar(archivePath, CompressionMethod.Zstd)

    const tarPath = SystemTarPathOnWindows
    expect(execMock).toHaveBeenCalledTimes(2)

    expect(execMock).toHaveBeenNthCalledWith(
      1,
      [
        'zstd -d --long=30 --force -o',
        TarFilename.replace(new RegExp(`\\${path.sep}`, 'g'), '/'),
        archivePath.replace(new RegExp(`\\${path.sep}`, 'g'), '/')
      ].join(' '),
      undefined,
      {
        cwd: undefined,
        env: expect.objectContaining(defaultEnv)
      }
    )

    expect(execMock).toHaveBeenNthCalledWith(
      2,
      [
        `"${tarPath}"`,
        '-tf',
        TarFilename.replace(new RegExp(`\\${path.sep}`, 'g'), '/'),
        '-P'
      ].join(' '),
      undefined,
      {
        cwd: undefined,
        env: expect.objectContaining(defaultEnv)
      }
    )
  }
})

test('zstdWithoutLong list tar', async () => {
  const execMock = jest.spyOn(exec, 'exec')

  const archivePath = IS_WINDOWS
    ? `${process.env['windir']}\\fakepath\\cache.tar`
    : 'cache.tar'

  await tar.listTar(archivePath, CompressionMethod.ZstdWithoutLong)

  const tarPath = IS_WINDOWS ? GnuTarPathOnWindows : defaultTarPath
  expect(execMock).toHaveBeenCalledTimes(1)
  expect(execMock).toHaveBeenCalledWith(
    [
      `"${tarPath}"`,
      '-tf',
      IS_WINDOWS ? archivePath.replace(/\\/g, '/') : archivePath,
      '-P'
    ]
      .concat(IS_WINDOWS ? ['--force-local'] : [])
      .concat(IS_MAC ? ['--delay-directory-restore'] : [])
      .concat(['--use-compress-program', IS_WINDOWS ? '"zstd -d"' : 'unzstd'])
      .join(' '),
    undefined,
    {
      cwd: undefined,
      env: expect.objectContaining(defaultEnv)
    }
  )
})

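// CompressionMethod.ZstdWithoutLong drops the --long=30 window flag and calls plain
// unzstd / "zstd -d"; this appears to be the fallback for environments whose zstd
// build cannot handle long-distance matching with a 2^30 window.
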
test('gzip list tar', async () => {
  const execMock = jest.spyOn(exec, 'exec')
  const archivePath = IS_WINDOWS
    ? `${process.env['windir']}\\fakepath\\cache.tar`
    : 'cache.tar'

  await tar.listTar(archivePath, CompressionMethod.Gzip)

  const tarPath = IS_WINDOWS ? GnuTarPathOnWindows : defaultTarPath
  expect(execMock).toHaveBeenCalledTimes(1)
  expect(execMock).toHaveBeenCalledWith(
    [
      `"${tarPath}"`,
      '-tf',
      IS_WINDOWS ? archivePath.replace(/\\/g, '/') : archivePath,
      '-P'
    ]
      .concat(IS_WINDOWS ? ['--force-local'] : [])
      .concat(IS_MAC ? ['--delay-directory-restore'] : [])
      .concat(['-z'])
      .join(' '),
    undefined,
    {
      cwd: undefined,
      env: expect.objectContaining(defaultEnv)
    }
  )
})
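
// Note the pattern in the Windows BSDtar variants above: (de)compression is expected
// as a separate zstd exec step against <TarFilename>, with the system tar only ever
// reading or writing the plain .tar, instead of a single tar invocation driven through
// --use-compress-program as in the GNU tar cases.
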
@@ -0,0 +1,36 @@
#!/bin/sh

# Validate args
prefix="$1"
if [ -z "$prefix" ]; then
    echo "Must supply prefix argument"
    exit 1
fi

path="$2"
if [ -z "$path" ]; then
    echo "Must specify path argument"
    exit 1
fi

# Sanity check GITHUB_RUN_ID defined
if [ -z "$GITHUB_RUN_ID" ]; then
    echo "GITHUB_RUN_ID not defined"
    exit 1
fi

# Verify file exists
file="$path/test-file.txt"
echo "Checking for $file"
if [ ! -e $file ]; then
    echo "File does not exist"
    exit 1
fi

# Verify file content
content="$(cat $file)"
echo "File content:\n$content"
if [ -z "$(echo $content | grep --fixed-strings "$prefix $GITHUB_RUN_ID")" ]; then
    echo "Unexpected file content"
    exit 1
fi
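
# Invocation sketch (script name assumed; the checks above define the contract):
#   ./verify-cache-file.sh "<key prefix>" "<path that was restored>"
# with GITHUB_RUN_ID set, and <path>/test-file.txt containing "<prefix> $GITHUB_RUN_ID".
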
@@ -0,0 +1,937 @@
{
  "name": "github-actions.warp-cache",
  "version": "0.1.0",
  "lockfileVersion": 2,
  "requires": true,
  "packages": {
    "": {
      "name": "github-actions.warp-cache",
      "version": "0.1.0",
      "license": "MIT",
      "dependencies": {
        "@actions/core": "^1.10.0",
        "@actions/exec": "^1.0.1",
        "@actions/glob": "^0.1.0",
        "@actions/http-client": "^2.1.1",
        "@actions/io": "^1.0.1",
        "@azure/abort-controller": "^1.1.0",
        "@azure/ms-rest-js": "^2.6.0",
        "@azure/storage-blob": "^12.13.0",
        "semver": "^6.3.1",
        "uuid": "^3.3.3"
      },
      "devDependencies": {
        "@types/semver": "^6.0.0",
        "@types/uuid": "^3.4.5",
        "typescript": "^5.2.2"
      }
    },
"node_modules/@actions/core": {
|
||||||
|
"version": "1.10.0",
|
||||||
|
"resolved": "https://registry.npmjs.org/@actions/core/-/core-1.10.0.tgz",
|
||||||
|
"integrity": "sha512-2aZDDa3zrrZbP5ZYg159sNoLRb61nQ7awl5pSvIq5Qpj81vwDzdMRKzkWJGJuwVvWpvZKx7vspJALyvaaIQyug==",
|
||||||
|
"dependencies": {
|
||||||
|
"@actions/http-client": "^2.0.1",
|
||||||
|
"uuid": "^8.3.2"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/@actions/core/node_modules/uuid": {
|
||||||
|
"version": "8.3.2",
|
||||||
|
"resolved": "https://registry.npmjs.org/uuid/-/uuid-8.3.2.tgz",
|
||||||
|
"integrity": "sha512-+NYs2QeMWy+GWFOEm9xnn6HCDp0l7QBD7ml8zLUmJ+93Q5NF0NocErnwkTkXVFNiX3/fpC6afS8Dhb/gz7R7eg==",
|
||||||
|
"bin": {
|
||||||
|
"uuid": "dist/bin/uuid"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/@actions/exec": {
|
||||||
|
"version": "1.1.1",
|
||||||
|
"resolved": "https://registry.npmjs.org/@actions/exec/-/exec-1.1.1.tgz",
|
||||||
|
"integrity": "sha512-+sCcHHbVdk93a0XT19ECtO/gIXoxvdsgQLzb2fE2/5sIZmWQuluYyjPQtrtTHdU1YzTZ7bAPN4sITq2xi1679w==",
|
||||||
|
"dependencies": {
|
||||||
|
"@actions/io": "^1.0.1"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/@actions/glob": {
|
||||||
|
"version": "0.1.2",
|
||||||
|
"resolved": "https://registry.npmjs.org/@actions/glob/-/glob-0.1.2.tgz",
|
||||||
|
"integrity": "sha512-SclLR7Ia5sEqjkJTPs7Sd86maMDw43p769YxBOxvPvEWuPEhpAnBsQfENOpXjFYMmhCqd127bmf+YdvJqVqR4A==",
|
||||||
|
"dependencies": {
|
||||||
|
"@actions/core": "^1.2.6",
|
||||||
|
"minimatch": "^3.0.4"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/@actions/http-client": {
|
||||||
|
"version": "2.1.1",
|
||||||
|
"resolved": "https://registry.npmjs.org/@actions/http-client/-/http-client-2.1.1.tgz",
|
||||||
|
"integrity": "sha512-qhrkRMB40bbbLo7gF+0vu+X+UawOvQQqNAA/5Unx774RS8poaOhThDOG6BGmxvAnxhQnDp2BG/ZUm65xZILTpw==",
|
||||||
|
"dependencies": {
|
||||||
|
"tunnel": "^0.0.6"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/@actions/io": {
|
||||||
|
"version": "1.1.3",
|
||||||
|
"resolved": "https://registry.npmjs.org/@actions/io/-/io-1.1.3.tgz",
|
||||||
|
"integrity": "sha512-wi9JjgKLYS7U/z8PPbco+PvTb/nRWjeoFlJ1Qer83k/3C5PHQi28hiVdeE2kHXmIL99mQFawx8qt/JPjZilJ8Q=="
|
||||||
|
},
|
||||||
|
"node_modules/@azure/abort-controller": {
|
||||||
|
"version": "1.1.0",
|
||||||
|
"resolved": "https://registry.npmjs.org/@azure/abort-controller/-/abort-controller-1.1.0.tgz",
|
||||||
|
"integrity": "sha512-TrRLIoSQVzfAJX9H1JeFjzAoDGcoK1IYX1UImfceTZpsyYfWr09Ss1aHW1y5TrrR3iq6RZLBwJ3E24uwPhwahw==",
|
||||||
|
"dependencies": {
|
||||||
|
"tslib": "^2.2.0"
|
||||||
|
},
|
||||||
|
"engines": {
|
||||||
|
"node": ">=12.0.0"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/@azure/core-auth": {
|
||||||
|
"version": "1.4.0",
|
||||||
|
"resolved": "https://registry.npmjs.org/@azure/core-auth/-/core-auth-1.4.0.tgz",
|
||||||
|
"integrity": "sha512-HFrcTgmuSuukRf/EdPmqBrc5l6Q5Uu+2TbuhaKbgaCpP2TfAeiNaQPAadxO+CYBRHGUzIDteMAjFspFLDLnKVQ==",
|
||||||
|
"dependencies": {
|
||||||
|
"@azure/abort-controller": "^1.0.0",
|
||||||
|
"tslib": "^2.2.0"
|
||||||
|
},
|
||||||
|
"engines": {
|
||||||
|
"node": ">=12.0.0"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/@azure/core-http": {
|
||||||
|
"version": "3.0.2",
|
||||||
|
"resolved": "https://registry.npmjs.org/@azure/core-http/-/core-http-3.0.2.tgz",
|
||||||
|
"integrity": "sha512-o1wR9JrmoM0xEAa0Ue7Sp8j+uJvmqYaGoHOCT5qaVYmvgmnZDC0OvQimPA/JR3u77Sz6D1y3Xmk1y69cDU9q9A==",
|
||||||
|
"dependencies": {
|
||||||
|
"@azure/abort-controller": "^1.0.0",
|
||||||
|
"@azure/core-auth": "^1.3.0",
|
||||||
|
"@azure/core-tracing": "1.0.0-preview.13",
|
||||||
|
"@azure/core-util": "^1.1.1",
|
||||||
|
"@azure/logger": "^1.0.0",
|
||||||
|
"@types/node-fetch": "^2.5.0",
|
||||||
|
"@types/tunnel": "^0.0.3",
|
||||||
|
"form-data": "^4.0.0",
|
||||||
|
"node-fetch": "^2.6.7",
|
||||||
|
"process": "^0.11.10",
|
||||||
|
"tslib": "^2.2.0",
|
||||||
|
"tunnel": "^0.0.6",
|
||||||
|
"uuid": "^8.3.0",
|
||||||
|
"xml2js": "^0.5.0"
|
||||||
|
},
|
||||||
|
"engines": {
|
||||||
|
"node": ">=14.0.0"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/@azure/core-http/node_modules/form-data": {
|
||||||
|
"version": "4.0.0",
|
||||||
|
"resolved": "https://registry.npmjs.org/form-data/-/form-data-4.0.0.tgz",
|
||||||
|
"integrity": "sha512-ETEklSGi5t0QMZuiXoA/Q6vcnxcLQP5vdugSpuAyi6SVGi2clPPp+xgEhuMaHC+zGgn31Kd235W35f7Hykkaww==",
|
||||||
|
"dependencies": {
|
||||||
|
"asynckit": "^0.4.0",
|
||||||
|
"combined-stream": "^1.0.8",
|
||||||
|
"mime-types": "^2.1.12"
|
||||||
|
},
|
||||||
|
"engines": {
|
||||||
|
"node": ">= 6"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/@azure/core-http/node_modules/uuid": {
|
||||||
|
"version": "8.3.2",
|
||||||
|
"resolved": "https://registry.npmjs.org/uuid/-/uuid-8.3.2.tgz",
|
||||||
|
"integrity": "sha512-+NYs2QeMWy+GWFOEm9xnn6HCDp0l7QBD7ml8zLUmJ+93Q5NF0NocErnwkTkXVFNiX3/fpC6afS8Dhb/gz7R7eg==",
|
||||||
|
"bin": {
|
||||||
|
"uuid": "dist/bin/uuid"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/@azure/core-lro": {
|
||||||
|
"version": "2.5.4",
|
||||||
|
"resolved": "https://registry.npmjs.org/@azure/core-lro/-/core-lro-2.5.4.tgz",
|
||||||
|
"integrity": "sha512-3GJiMVH7/10bulzOKGrrLeG/uCBH/9VtxqaMcB9lIqAeamI/xYQSHJL/KcsLDuH+yTjYpro/u6D/MuRe4dN70Q==",
|
||||||
|
"dependencies": {
|
||||||
|
"@azure/abort-controller": "^1.0.0",
|
||||||
|
"@azure/core-util": "^1.2.0",
|
||||||
|
"@azure/logger": "^1.0.0",
|
||||||
|
"tslib": "^2.2.0"
|
||||||
|
},
|
||||||
|
"engines": {
|
||||||
|
"node": ">=14.0.0"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/@azure/core-paging": {
|
||||||
|
"version": "1.5.0",
|
||||||
|
"resolved": "https://registry.npmjs.org/@azure/core-paging/-/core-paging-1.5.0.tgz",
|
||||||
|
"integrity": "sha512-zqWdVIt+2Z+3wqxEOGzR5hXFZ8MGKK52x4vFLw8n58pR6ZfKRx3EXYTxTaYxYHc/PexPUTyimcTWFJbji9Z6Iw==",
|
||||||
|
"dependencies": {
|
||||||
|
"tslib": "^2.2.0"
|
||||||
|
},
|
||||||
|
"engines": {
|
||||||
|
"node": ">=14.0.0"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/@azure/core-tracing": {
|
||||||
|
"version": "1.0.0-preview.13",
|
||||||
|
"resolved": "https://registry.npmjs.org/@azure/core-tracing/-/core-tracing-1.0.0-preview.13.tgz",
|
||||||
|
"integrity": "sha512-KxDlhXyMlh2Jhj2ykX6vNEU0Vou4nHr025KoSEiz7cS3BNiHNaZcdECk/DmLkEB0as5T7b/TpRcehJ5yV6NeXQ==",
|
||||||
|
"dependencies": {
|
||||||
|
"@opentelemetry/api": "^1.0.1",
|
||||||
|
"tslib": "^2.2.0"
|
||||||
|
},
|
||||||
|
"engines": {
|
||||||
|
"node": ">=12.0.0"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/@azure/core-util": {
|
||||||
|
"version": "1.3.2",
|
||||||
|
"resolved": "https://registry.npmjs.org/@azure/core-util/-/core-util-1.3.2.tgz",
|
||||||
|
"integrity": "sha512-2bECOUh88RvL1pMZTcc6OzfobBeWDBf5oBbhjIhT1MV9otMVWCzpOJkkiKtrnO88y5GGBelgY8At73KGAdbkeQ==",
|
||||||
|
"dependencies": {
|
||||||
|
"@azure/abort-controller": "^1.0.0",
|
||||||
|
"tslib": "^2.2.0"
|
||||||
|
},
|
||||||
|
"engines": {
|
||||||
|
"node": ">=14.0.0"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/@azure/logger": {
|
||||||
|
"version": "1.0.4",
|
||||||
|
"resolved": "https://registry.npmjs.org/@azure/logger/-/logger-1.0.4.tgz",
|
||||||
|
"integrity": "sha512-ustrPY8MryhloQj7OWGe+HrYx+aoiOxzbXTtgblbV3xwCqpzUK36phH3XNHQKj3EPonyFUuDTfR3qFhTEAuZEg==",
|
||||||
|
"dependencies": {
|
||||||
|
"tslib": "^2.2.0"
|
||||||
|
},
|
||||||
|
"engines": {
|
||||||
|
"node": ">=14.0.0"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/@azure/ms-rest-js": {
|
||||||
|
"version": "2.7.0",
|
||||||
|
"resolved": "https://registry.npmjs.org/@azure/ms-rest-js/-/ms-rest-js-2.7.0.tgz",
|
||||||
|
"integrity": "sha512-ngbzWbqF+NmztDOpLBVDxYM+XLcUj7nKhxGbSU9WtIsXfRB//cf2ZbAG5HkOrhU9/wd/ORRB6lM/d69RKVjiyA==",
|
||||||
|
"dependencies": {
|
||||||
|
"@azure/core-auth": "^1.1.4",
|
||||||
|
"abort-controller": "^3.0.0",
|
||||||
|
"form-data": "^2.5.0",
|
||||||
|
"node-fetch": "^2.6.7",
|
||||||
|
"tslib": "^1.10.0",
|
||||||
|
"tunnel": "0.0.6",
|
||||||
|
"uuid": "^8.3.2",
|
||||||
|
"xml2js": "^0.5.0"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/@azure/ms-rest-js/node_modules/tslib": {
|
||||||
|
"version": "1.14.1",
|
||||||
|
"resolved": "https://registry.npmjs.org/tslib/-/tslib-1.14.1.tgz",
|
||||||
|
"integrity": "sha512-Xni35NKzjgMrwevysHTCArtLDpPvye8zV/0E4EyYn43P7/7qvQwPh9BGkHewbMulVntbigmcT7rdX3BNo9wRJg=="
|
||||||
|
},
|
||||||
|
"node_modules/@azure/ms-rest-js/node_modules/uuid": {
|
||||||
|
"version": "8.3.2",
|
||||||
|
"resolved": "https://registry.npmjs.org/uuid/-/uuid-8.3.2.tgz",
|
||||||
|
"integrity": "sha512-+NYs2QeMWy+GWFOEm9xnn6HCDp0l7QBD7ml8zLUmJ+93Q5NF0NocErnwkTkXVFNiX3/fpC6afS8Dhb/gz7R7eg==",
|
||||||
|
"bin": {
|
||||||
|
"uuid": "dist/bin/uuid"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/@azure/storage-blob": {
|
||||||
|
"version": "12.15.0",
|
||||||
|
"resolved": "https://registry.npmjs.org/@azure/storage-blob/-/storage-blob-12.15.0.tgz",
|
||||||
|
"integrity": "sha512-e7JBKLOFi0QVJqqLzrjx1eL3je3/Ug2IQj24cTM9b85CsnnFjLGeGjJVIjbGGZaytewiCEG7r3lRwQX7fKj0/w==",
|
||||||
|
"dependencies": {
|
||||||
|
"@azure/abort-controller": "^1.0.0",
|
||||||
|
"@azure/core-http": "^3.0.0",
|
||||||
|
"@azure/core-lro": "^2.2.0",
|
||||||
|
"@azure/core-paging": "^1.1.1",
|
||||||
|
"@azure/core-tracing": "1.0.0-preview.13",
|
||||||
|
"@azure/logger": "^1.0.0",
|
||||||
|
"events": "^3.0.0",
|
||||||
|
"tslib": "^2.2.0"
|
||||||
|
},
|
||||||
|
"engines": {
|
||||||
|
"node": ">=14.0.0"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/@opentelemetry/api": {
|
||||||
|
"version": "1.4.1",
|
||||||
|
"resolved": "https://registry.npmjs.org/@opentelemetry/api/-/api-1.4.1.tgz",
|
||||||
|
"integrity": "sha512-O2yRJce1GOc6PAy3QxFM4NzFiWzvScDC1/5ihYBL6BUEVdq0XMWN01sppE+H6bBXbaFYipjwFLEWLg5PaSOThA==",
|
||||||
|
"engines": {
|
||||||
|
"node": ">=8.0.0"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/@types/node": {
|
||||||
|
"version": "20.4.6",
|
||||||
|
"resolved": "https://registry.npmjs.org/@types/node/-/node-20.4.6.tgz",
|
||||||
|
"integrity": "sha512-q0RkvNgMweWWIvSMDiXhflGUKMdIxBo2M2tYM/0kEGDueQByFzK4KZAgu5YHGFNxziTlppNpTIBcqHQAxlfHdA=="
|
||||||
|
},
|
||||||
|
"node_modules/@types/node-fetch": {
|
||||||
|
"version": "2.6.4",
|
||||||
|
"resolved": "https://registry.npmjs.org/@types/node-fetch/-/node-fetch-2.6.4.tgz",
|
||||||
|
"integrity": "sha512-1ZX9fcN4Rvkvgv4E6PAY5WXUFWFcRWxZa3EW83UjycOB9ljJCedb2CupIP4RZMEwF/M3eTcCihbBRgwtGbg5Rg==",
|
||||||
|
"dependencies": {
|
||||||
|
"@types/node": "*",
|
||||||
|
"form-data": "^3.0.0"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/@types/node-fetch/node_modules/form-data": {
|
||||||
|
"version": "3.0.1",
|
||||||
|
"resolved": "https://registry.npmjs.org/form-data/-/form-data-3.0.1.tgz",
|
||||||
|
"integrity": "sha512-RHkBKtLWUVwd7SqRIvCZMEvAMoGUp0XU+seQiZejj0COz3RI3hWP4sCv3gZWWLjJTd7rGwcsF5eKZGii0r/hbg==",
|
||||||
|
"dependencies": {
|
||||||
|
"asynckit": "^0.4.0",
|
||||||
|
"combined-stream": "^1.0.8",
|
||||||
|
"mime-types": "^2.1.12"
|
||||||
|
},
|
||||||
|
"engines": {
|
||||||
|
"node": ">= 6"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/@types/semver": {
|
||||||
|
"version": "6.2.3",
|
||||||
|
"resolved": "https://registry.npmjs.org/@types/semver/-/semver-6.2.3.tgz",
|
||||||
|
"integrity": "sha512-KQf+QAMWKMrtBMsB8/24w53tEsxllMj6TuA80TT/5igJalLI/zm0L3oXRbIAl4Ohfc85gyHX/jhMwsVkmhLU4A==",
|
||||||
|
"dev": true
|
||||||
|
},
|
||||||
|
"node_modules/@types/tunnel": {
|
||||||
|
"version": "0.0.3",
|
||||||
|
"resolved": "https://registry.npmjs.org/@types/tunnel/-/tunnel-0.0.3.tgz",
|
||||||
|
"integrity": "sha512-sOUTGn6h1SfQ+gbgqC364jLFBw2lnFqkgF3q0WovEHRLMrVD1sd5aufqi/aJObLekJO+Aq5z646U4Oxy6shXMA==",
|
||||||
|
"dependencies": {
|
||||||
|
"@types/node": "*"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/@types/uuid": {
|
||||||
|
"version": "3.4.10",
|
||||||
|
"resolved": "https://registry.npmjs.org/@types/uuid/-/uuid-3.4.10.tgz",
|
||||||
|
"integrity": "sha512-BgeaZuElf7DEYZhWYDTc/XcLZXdVgFkVSTa13BqKvbnmUrxr3TJFKofUxCtDO9UQOdhnV+HPOESdHiHKZOJV1A==",
|
||||||
|
"dev": true
|
||||||
|
},
|
||||||
|
"node_modules/abort-controller": {
|
||||||
|
"version": "3.0.0",
|
||||||
|
"resolved": "https://registry.npmjs.org/abort-controller/-/abort-controller-3.0.0.tgz",
|
||||||
|
"integrity": "sha512-h8lQ8tacZYnR3vNQTgibj+tODHI5/+l06Au2Pcriv/Gmet0eaj4TwWH41sO9wnHDiQsEj19q0drzdWdeAHtweg==",
|
||||||
|
"dependencies": {
|
||||||
|
"event-target-shim": "^5.0.0"
|
||||||
|
},
|
||||||
|
"engines": {
|
||||||
|
"node": ">=6.5"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/asynckit": {
|
||||||
|
"version": "0.4.0",
|
||||||
|
"resolved": "https://registry.npmjs.org/asynckit/-/asynckit-0.4.0.tgz",
|
||||||
|
"integrity": "sha512-Oei9OH4tRh0YqU3GxhX79dM/mwVgvbZJaSNaRk+bshkj0S5cfHcgYakreBjrHwatXKbz+IoIdYLxrKim2MjW0Q=="
|
||||||
|
},
|
||||||
|
"node_modules/balanced-match": {
|
||||||
|
"version": "1.0.2",
|
||||||
|
"resolved": "https://registry.npmjs.org/balanced-match/-/balanced-match-1.0.2.tgz",
|
||||||
|
"integrity": "sha512-3oSeUO0TMV67hN1AmbXsK4yaqU7tjiHlbxRDZOpH0KW9+CeX4bRAaX0Anxt0tx2MrpRpWwQaPwIlISEJhYU5Pw=="
|
||||||
|
},
|
||||||
|
"node_modules/brace-expansion": {
|
||||||
|
"version": "1.1.11",
|
||||||
|
"resolved": "https://registry.npmjs.org/brace-expansion/-/brace-expansion-1.1.11.tgz",
|
||||||
|
"integrity": "sha512-iCuPHDFgrHX7H2vEI/5xpz07zSHB00TpugqhmYtVmMO6518mCuRMoOYFldEBl0g187ufozdaHgWKcYFb61qGiA==",
|
||||||
|
"dependencies": {
|
||||||
|
"balanced-match": "^1.0.0",
|
||||||
|
"concat-map": "0.0.1"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/combined-stream": {
|
||||||
|
"version": "1.0.8",
|
||||||
|
"resolved": "https://registry.npmjs.org/combined-stream/-/combined-stream-1.0.8.tgz",
|
||||||
|
"integrity": "sha512-FQN4MRfuJeHf7cBbBMJFXhKSDq+2kAArBlmRBvcvFE5BB1HZKXtSFASDhdlz9zOYwxh8lDdnvmMOe/+5cdoEdg==",
|
||||||
|
"dependencies": {
|
||||||
|
"delayed-stream": "~1.0.0"
|
||||||
|
},
|
||||||
|
"engines": {
|
||||||
|
"node": ">= 0.8"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/concat-map": {
|
||||||
|
"version": "0.0.1",
|
||||||
|
"resolved": "https://registry.npmjs.org/concat-map/-/concat-map-0.0.1.tgz",
|
||||||
|
"integrity": "sha512-/Srv4dswyQNBfohGpz9o6Yb3Gz3SrUDqBH5rTuhGR7ahtlbYKnVxw2bCFMRljaA7EXHaXZ8wsHdodFvbkhKmqg=="
|
||||||
|
},
|
||||||
|
"node_modules/delayed-stream": {
|
||||||
|
"version": "1.0.0",
|
||||||
|
"resolved": "https://registry.npmjs.org/delayed-stream/-/delayed-stream-1.0.0.tgz",
|
||||||
|
"integrity": "sha512-ZySD7Nf91aLB0RxL4KGrKHBXl7Eds1DAmEdcoVawXnLD7SDhpNgtuII2aAkg7a7QS41jxPSZ17p4VdGnMHk3MQ==",
|
||||||
|
"engines": {
|
||||||
|
"node": ">=0.4.0"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/event-target-shim": {
|
||||||
|
"version": "5.0.1",
|
||||||
|
"resolved": "https://registry.npmjs.org/event-target-shim/-/event-target-shim-5.0.1.tgz",
|
||||||
|
"integrity": "sha512-i/2XbnSz/uxRCU6+NdVJgKWDTM427+MqYbkQzD321DuCQJUqOuJKIA0IM2+W2xtYHdKOmZ4dR6fExsd4SXL+WQ==",
|
||||||
|
"engines": {
|
||||||
|
"node": ">=6"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/events": {
|
||||||
|
"version": "3.3.0",
|
||||||
|
"resolved": "https://registry.npmjs.org/events/-/events-3.3.0.tgz",
|
||||||
|
"integrity": "sha512-mQw+2fkQbALzQ7V0MY0IqdnXNOeTtP4r0lN9z7AAawCXgqea7bDii20AYrIBrFd/Hx0M2Ocz6S111CaFkUcb0Q==",
|
||||||
|
"engines": {
|
||||||
|
"node": ">=0.8.x"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/form-data": {
|
||||||
|
"version": "2.5.1",
|
||||||
|
"resolved": "https://registry.npmjs.org/form-data/-/form-data-2.5.1.tgz",
|
||||||
|
"integrity": "sha512-m21N3WOmEEURgk6B9GLOE4RuWOFf28Lhh9qGYeNlGq4VDXUlJy2th2slBNU8Gp8EzloYZOibZJ7t5ecIrFSjVA==",
|
||||||
|
"dependencies": {
|
||||||
|
"asynckit": "^0.4.0",
|
||||||
|
"combined-stream": "^1.0.6",
|
||||||
|
"mime-types": "^2.1.12"
|
||||||
|
},
|
||||||
|
"engines": {
|
||||||
|
"node": ">= 0.12"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/mime-db": {
|
||||||
|
"version": "1.52.0",
|
||||||
|
"resolved": "https://registry.npmjs.org/mime-db/-/mime-db-1.52.0.tgz",
|
||||||
|
"integrity": "sha512-sPU4uV7dYlvtWJxwwxHD0PuihVNiE7TyAbQ5SWxDCB9mUYvOgroQOwYQQOKPJ8CIbE+1ETVlOoK1UC2nU3gYvg==",
|
||||||
|
"engines": {
|
||||||
|
"node": ">= 0.6"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/mime-types": {
|
||||||
|
"version": "2.1.35",
|
||||||
|
"resolved": "https://registry.npmjs.org/mime-types/-/mime-types-2.1.35.tgz",
|
||||||
|
"integrity": "sha512-ZDY+bPm5zTTF+YpCrAU9nK0UgICYPT0QtT1NZWFv4s++TNkcgVaT0g6+4R2uI4MjQjzysHB1zxuWL50hzaeXiw==",
|
||||||
|
"dependencies": {
|
||||||
|
"mime-db": "1.52.0"
|
||||||
|
},
|
||||||
|
"engines": {
|
||||||
|
"node": ">= 0.6"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/minimatch": {
|
||||||
|
"version": "3.1.2",
|
||||||
|
"resolved": "https://registry.npmjs.org/minimatch/-/minimatch-3.1.2.tgz",
|
||||||
|
"integrity": "sha512-J7p63hRiAjw1NDEww1W7i37+ByIrOWO5XQQAzZ3VOcL0PNybwpfmV/N05zFAzwQ9USyEcX6t3UO+K5aqBQOIHw==",
|
||||||
|
"dependencies": {
|
||||||
|
"brace-expansion": "^1.1.7"
|
||||||
|
},
|
||||||
|
"engines": {
|
||||||
|
"node": "*"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/node-fetch": {
|
||||||
|
"version": "2.6.12",
|
||||||
|
"resolved": "https://registry.npmjs.org/node-fetch/-/node-fetch-2.6.12.tgz",
|
||||||
|
"integrity": "sha512-C/fGU2E8ToujUivIO0H+tpQ6HWo4eEmchoPIoXtxCrVghxdKq+QOHqEZW7tuP3KlV3bC8FRMO5nMCC7Zm1VP6g==",
|
||||||
|
"dependencies": {
|
||||||
|
"whatwg-url": "^5.0.0"
|
||||||
|
},
|
||||||
|
"engines": {
|
||||||
|
"node": "4.x || >=6.0.0"
|
||||||
|
},
|
||||||
|
"peerDependencies": {
|
||||||
|
"encoding": "^0.1.0"
|
||||||
|
},
|
||||||
|
"peerDependenciesMeta": {
|
||||||
|
"encoding": {
|
||||||
|
"optional": true
|
||||||
|
}
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/process": {
|
||||||
|
"version": "0.11.10",
|
||||||
|
"resolved": "https://registry.npmjs.org/process/-/process-0.11.10.tgz",
|
||||||
|
"integrity": "sha512-cdGef/drWFoydD1JsMzuFf8100nZl+GT+yacc2bEced5f9Rjk4z+WtFUTBu9PhOi9j/jfmBPu0mMEY4wIdAF8A==",
|
||||||
|
"engines": {
|
||||||
|
"node": ">= 0.6.0"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/sax": {
|
||||||
|
"version": "1.2.4",
|
||||||
|
"resolved": "https://registry.npmjs.org/sax/-/sax-1.2.4.tgz",
|
||||||
|
"integrity": "sha512-NqVDv9TpANUjFm0N8uM5GxL36UgKi9/atZw+x7YFnQ8ckwFGKrl4xX4yWtrey3UJm5nP1kUbnYgLopqWNSRhWw=="
|
||||||
|
},
|
||||||
|
"node_modules/semver": {
|
||||||
|
"version": "6.3.1",
|
||||||
|
"resolved": "https://registry.npmjs.org/semver/-/semver-6.3.1.tgz",
|
||||||
|
"integrity": "sha512-BR7VvDCVHO+q2xBEWskxS6DJE1qRnb7DxzUrogb71CWoSficBxYsiAGd+Kl0mmq/MprG9yArRkyrQxTO6XjMzA==",
|
||||||
|
"bin": {
|
||||||
|
"semver": "bin/semver.js"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/tr46": {
|
||||||
|
"version": "0.0.3",
|
||||||
|
"resolved": "https://registry.npmjs.org/tr46/-/tr46-0.0.3.tgz",
|
||||||
|
"integrity": "sha512-N3WMsuqV66lT30CrXNbEjx4GEwlow3v6rr4mCcv6prnfwhS01rkgyFdjPNBYd9br7LpXV1+Emh01fHnq2Gdgrw=="
|
||||||
|
},
|
||||||
|
"node_modules/tslib": {
|
||||||
|
"version": "2.6.1",
|
||||||
|
"resolved": "https://registry.npmjs.org/tslib/-/tslib-2.6.1.tgz",
|
||||||
|
"integrity": "sha512-t0hLfiEKfMUoqhG+U1oid7Pva4bbDPHYfJNiB7BiIjRkj1pyC++4N3huJfqY6aRH6VTB0rvtzQwjM4K6qpfOig=="
|
||||||
|
},
|
||||||
|
"node_modules/tunnel": {
|
||||||
|
"version": "0.0.6",
|
||||||
|
"resolved": "https://registry.npmjs.org/tunnel/-/tunnel-0.0.6.tgz",
|
||||||
|
"integrity": "sha512-1h/Lnq9yajKY2PEbBadPXj3VxsDDu844OnaAo52UVmIzIvwwtBPIuNvkjuzBlTWpfJyUbG3ez0KSBibQkj4ojg==",
|
||||||
|
"engines": {
|
||||||
|
"node": ">=0.6.11 <=0.7.0 || >=0.7.3"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/typescript": {
|
||||||
|
"version": "5.2.2",
|
||||||
|
"resolved": "https://registry.npmjs.org/typescript/-/typescript-5.2.2.tgz",
|
||||||
|
"integrity": "sha512-mI4WrpHsbCIcwT9cF4FZvr80QUeKvsUsUvKDoR+X/7XHQH98xYD8YHZg7ANtz2GtZt/CBq2QJ0thkGJMHfqc1w==",
|
||||||
|
"dev": true,
|
||||||
|
"bin": {
|
||||||
|
"tsc": "bin/tsc",
|
||||||
|
"tsserver": "bin/tsserver"
|
||||||
|
},
|
||||||
|
"engines": {
|
||||||
|
"node": ">=14.17"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/uuid": {
|
||||||
|
"version": "3.4.0",
|
||||||
|
"resolved": "https://registry.npmjs.org/uuid/-/uuid-3.4.0.tgz",
|
||||||
|
"integrity": "sha512-HjSDRw6gZE5JMggctHBcjVak08+KEVhSIiDzFnT9S9aegmp85S/bReBVTb4QTFaRNptJ9kuYaNhnbNEOkbKb/A==",
|
||||||
|
"deprecated": "Please upgrade to version 7 or higher. Older versions may use Math.random() in certain circumstances, which is known to be problematic. See https://v8.dev/blog/math-random for details.",
|
||||||
|
"bin": {
|
||||||
|
"uuid": "bin/uuid"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/webidl-conversions": {
|
||||||
|
"version": "3.0.1",
|
||||||
|
"resolved": "https://registry.npmjs.org/webidl-conversions/-/webidl-conversions-3.0.1.tgz",
|
||||||
|
"integrity": "sha512-2JAn3z8AR6rjK8Sm8orRC0h/bcl/DqL7tRPdGZ4I1CjdF+EaMLmYxBHyXuKL849eucPFhvBoxMsflfOb8kxaeQ=="
|
||||||
|
},
|
||||||
|
"node_modules/whatwg-url": {
|
||||||
|
"version": "5.0.0",
|
||||||
|
"resolved": "https://registry.npmjs.org/whatwg-url/-/whatwg-url-5.0.0.tgz",
|
||||||
|
"integrity": "sha512-saE57nupxk6v3HY35+jzBwYa0rKSy0XR8JSxZPwgLr7ys0IBzhGviA1/TUGJLmSVqs8pb9AnvICXEuOHLprYTw==",
|
||||||
|
"dependencies": {
|
||||||
|
"tr46": "~0.0.3",
|
||||||
|
"webidl-conversions": "^3.0.0"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/xml2js": {
|
||||||
|
"version": "0.5.0",
|
||||||
|
"resolved": "https://registry.npmjs.org/xml2js/-/xml2js-0.5.0.tgz",
|
||||||
|
"integrity": "sha512-drPFnkQJik/O+uPKpqSgr22mpuFHqKdbS835iAQrUC73L2F5WkboIRd63ai/2Yg6I1jzifPFKH2NTK+cfglkIA==",
|
||||||
|
"dependencies": {
|
||||||
|
"sax": ">=0.6.0",
|
||||||
|
"xmlbuilder": "~11.0.0"
|
||||||
|
},
|
||||||
|
"engines": {
|
||||||
|
"node": ">=4.0.0"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/xmlbuilder": {
|
||||||
|
"version": "11.0.1",
|
||||||
|
"resolved": "https://registry.npmjs.org/xmlbuilder/-/xmlbuilder-11.0.1.tgz",
|
||||||
|
"integrity": "sha512-fDlsI/kFEx7gLvbecc0/ohLG50fugQp8ryHzMTuW9vSa1GJ0XYWKnhsUx7oie3G98+r56aTQIUB4kht42R3JvA==",
|
||||||
|
"engines": {
|
||||||
|
"node": ">=4.0"
|
||||||
|
}
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"dependencies": {
|
||||||
|
"@actions/core": {
|
||||||
|
"version": "1.10.0",
|
||||||
|
"resolved": "https://registry.npmjs.org/@actions/core/-/core-1.10.0.tgz",
|
||||||
|
"integrity": "sha512-2aZDDa3zrrZbP5ZYg159sNoLRb61nQ7awl5pSvIq5Qpj81vwDzdMRKzkWJGJuwVvWpvZKx7vspJALyvaaIQyug==",
|
||||||
|
"requires": {
|
||||||
|
"@actions/http-client": "^2.0.1",
|
||||||
|
"uuid": "^8.3.2"
|
||||||
|
},
|
||||||
|
"dependencies": {
|
||||||
|
"uuid": {
|
||||||
|
"version": "8.3.2",
|
||||||
|
"resolved": "https://registry.npmjs.org/uuid/-/uuid-8.3.2.tgz",
|
||||||
|
"integrity": "sha512-+NYs2QeMWy+GWFOEm9xnn6HCDp0l7QBD7ml8zLUmJ+93Q5NF0NocErnwkTkXVFNiX3/fpC6afS8Dhb/gz7R7eg=="
|
||||||
|
}
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"@actions/exec": {
|
||||||
|
"version": "1.1.1",
|
||||||
|
"resolved": "https://registry.npmjs.org/@actions/exec/-/exec-1.1.1.tgz",
|
||||||
|
"integrity": "sha512-+sCcHHbVdk93a0XT19ECtO/gIXoxvdsgQLzb2fE2/5sIZmWQuluYyjPQtrtTHdU1YzTZ7bAPN4sITq2xi1679w==",
|
||||||
|
"requires": {
|
||||||
|
"@actions/io": "^1.0.1"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"@actions/glob": {
|
||||||
|
"version": "0.1.2",
|
||||||
|
"resolved": "https://registry.npmjs.org/@actions/glob/-/glob-0.1.2.tgz",
|
||||||
|
"integrity": "sha512-SclLR7Ia5sEqjkJTPs7Sd86maMDw43p769YxBOxvPvEWuPEhpAnBsQfENOpXjFYMmhCqd127bmf+YdvJqVqR4A==",
|
||||||
|
"requires": {
|
||||||
|
"@actions/core": "^1.2.6",
|
||||||
|
"minimatch": "^3.0.4"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"@actions/http-client": {
|
||||||
|
"version": "2.1.1",
|
||||||
|
"resolved": "https://registry.npmjs.org/@actions/http-client/-/http-client-2.1.1.tgz",
|
||||||
|
"integrity": "sha512-qhrkRMB40bbbLo7gF+0vu+X+UawOvQQqNAA/5Unx774RS8poaOhThDOG6BGmxvAnxhQnDp2BG/ZUm65xZILTpw==",
|
||||||
|
"requires": {
|
||||||
|
"tunnel": "^0.0.6"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"@actions/io": {
|
||||||
|
"version": "1.1.3",
|
||||||
|
"resolved": "https://registry.npmjs.org/@actions/io/-/io-1.1.3.tgz",
|
||||||
|
"integrity": "sha512-wi9JjgKLYS7U/z8PPbco+PvTb/nRWjeoFlJ1Qer83k/3C5PHQi28hiVdeE2kHXmIL99mQFawx8qt/JPjZilJ8Q=="
|
||||||
|
},
|
||||||
|
"@azure/abort-controller": {
|
||||||
|
"version": "1.1.0",
|
||||||
|
"resolved": "https://registry.npmjs.org/@azure/abort-controller/-/abort-controller-1.1.0.tgz",
|
||||||
|
"integrity": "sha512-TrRLIoSQVzfAJX9H1JeFjzAoDGcoK1IYX1UImfceTZpsyYfWr09Ss1aHW1y5TrrR3iq6RZLBwJ3E24uwPhwahw==",
|
||||||
|
"requires": {
|
||||||
|
"tslib": "^2.2.0"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"@azure/core-auth": {
|
||||||
|
"version": "1.4.0",
|
||||||
|
"resolved": "https://registry.npmjs.org/@azure/core-auth/-/core-auth-1.4.0.tgz",
|
||||||
|
"integrity": "sha512-HFrcTgmuSuukRf/EdPmqBrc5l6Q5Uu+2TbuhaKbgaCpP2TfAeiNaQPAadxO+CYBRHGUzIDteMAjFspFLDLnKVQ==",
|
||||||
|
"requires": {
|
||||||
|
"@azure/abort-controller": "^1.0.0",
|
||||||
|
"tslib": "^2.2.0"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"@azure/core-http": {
|
||||||
|
"version": "3.0.2",
|
||||||
|
"resolved": "https://registry.npmjs.org/@azure/core-http/-/core-http-3.0.2.tgz",
|
||||||
|
"integrity": "sha512-o1wR9JrmoM0xEAa0Ue7Sp8j+uJvmqYaGoHOCT5qaVYmvgmnZDC0OvQimPA/JR3u77Sz6D1y3Xmk1y69cDU9q9A==",
|
||||||
|
"requires": {
|
||||||
|
"@azure/abort-controller": "^1.0.0",
|
||||||
|
"@azure/core-auth": "^1.3.0",
|
||||||
|
"@azure/core-tracing": "1.0.0-preview.13",
|
||||||
|
"@azure/core-util": "^1.1.1",
|
||||||
|
"@azure/logger": "^1.0.0",
|
||||||
|
"@types/node-fetch": "^2.5.0",
|
||||||
|
"@types/tunnel": "^0.0.3",
|
||||||
|
"form-data": "^4.0.0",
|
||||||
|
"node-fetch": "^2.6.7",
|
||||||
|
"process": "^0.11.10",
|
||||||
|
"tslib": "^2.2.0",
|
||||||
|
"tunnel": "^0.0.6",
|
||||||
|
"uuid": "^8.3.0",
|
||||||
|
"xml2js": "^0.5.0"
|
||||||
|
},
|
||||||
|
"dependencies": {
|
||||||
|
"form-data": {
|
||||||
|
"version": "4.0.0",
|
||||||
|
"resolved": "https://registry.npmjs.org/form-data/-/form-data-4.0.0.tgz",
|
||||||
|
"integrity": "sha512-ETEklSGi5t0QMZuiXoA/Q6vcnxcLQP5vdugSpuAyi6SVGi2clPPp+xgEhuMaHC+zGgn31Kd235W35f7Hykkaww==",
|
||||||
|
"requires": {
|
||||||
|
"asynckit": "^0.4.0",
|
||||||
|
"combined-stream": "^1.0.8",
|
||||||
|
"mime-types": "^2.1.12"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"uuid": {
|
||||||
|
"version": "8.3.2",
|
||||||
|
"resolved": "https://registry.npmjs.org/uuid/-/uuid-8.3.2.tgz",
|
||||||
|
"integrity": "sha512-+NYs2QeMWy+GWFOEm9xnn6HCDp0l7QBD7ml8zLUmJ+93Q5NF0NocErnwkTkXVFNiX3/fpC6afS8Dhb/gz7R7eg=="
|
||||||
|
}
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"@azure/core-lro": {
|
||||||
|
"version": "2.5.4",
|
||||||
|
"resolved": "https://registry.npmjs.org/@azure/core-lro/-/core-lro-2.5.4.tgz",
|
||||||
|
"integrity": "sha512-3GJiMVH7/10bulzOKGrrLeG/uCBH/9VtxqaMcB9lIqAeamI/xYQSHJL/KcsLDuH+yTjYpro/u6D/MuRe4dN70Q==",
|
||||||
|
"requires": {
|
||||||
|
"@azure/abort-controller": "^1.0.0",
|
||||||
|
"@azure/core-util": "^1.2.0",
|
||||||
|
"@azure/logger": "^1.0.0",
|
||||||
|
"tslib": "^2.2.0"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"@azure/core-paging": {
|
||||||
|
"version": "1.5.0",
|
||||||
|
"resolved": "https://registry.npmjs.org/@azure/core-paging/-/core-paging-1.5.0.tgz",
|
||||||
|
"integrity": "sha512-zqWdVIt+2Z+3wqxEOGzR5hXFZ8MGKK52x4vFLw8n58pR6ZfKRx3EXYTxTaYxYHc/PexPUTyimcTWFJbji9Z6Iw==",
|
||||||
|
"requires": {
|
||||||
|
"tslib": "^2.2.0"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"@azure/core-tracing": {
|
||||||
|
"version": "1.0.0-preview.13",
|
||||||
|
"resolved": "https://registry.npmjs.org/@azure/core-tracing/-/core-tracing-1.0.0-preview.13.tgz",
|
||||||
|
"integrity": "sha512-KxDlhXyMlh2Jhj2ykX6vNEU0Vou4nHr025KoSEiz7cS3BNiHNaZcdECk/DmLkEB0as5T7b/TpRcehJ5yV6NeXQ==",
|
||||||
|
"requires": {
|
||||||
|
"@opentelemetry/api": "^1.0.1",
|
||||||
|
"tslib": "^2.2.0"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"@azure/core-util": {
|
||||||
|
"version": "1.3.2",
|
||||||
|
"resolved": "https://registry.npmjs.org/@azure/core-util/-/core-util-1.3.2.tgz",
|
||||||
|
"integrity": "sha512-2bECOUh88RvL1pMZTcc6OzfobBeWDBf5oBbhjIhT1MV9otMVWCzpOJkkiKtrnO88y5GGBelgY8At73KGAdbkeQ==",
|
||||||
|
"requires": {
|
||||||
|
"@azure/abort-controller": "^1.0.0",
|
||||||
|
"tslib": "^2.2.0"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"@azure/logger": {
|
||||||
|
"version": "1.0.4",
|
||||||
|
"resolved": "https://registry.npmjs.org/@azure/logger/-/logger-1.0.4.tgz",
|
||||||
|
"integrity": "sha512-ustrPY8MryhloQj7OWGe+HrYx+aoiOxzbXTtgblbV3xwCqpzUK36phH3XNHQKj3EPonyFUuDTfR3qFhTEAuZEg==",
|
||||||
|
"requires": {
|
||||||
|
"tslib": "^2.2.0"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"@azure/ms-rest-js": {
|
||||||
|
"version": "2.7.0",
|
||||||
|
"resolved": "https://registry.npmjs.org/@azure/ms-rest-js/-/ms-rest-js-2.7.0.tgz",
|
||||||
|
"integrity": "sha512-ngbzWbqF+NmztDOpLBVDxYM+XLcUj7nKhxGbSU9WtIsXfRB//cf2ZbAG5HkOrhU9/wd/ORRB6lM/d69RKVjiyA==",
|
||||||
|
"requires": {
|
||||||
|
"@azure/core-auth": "^1.1.4",
|
||||||
|
"abort-controller": "^3.0.0",
|
||||||
|
"form-data": "^2.5.0",
|
||||||
|
"node-fetch": "^2.6.7",
|
||||||
|
"tslib": "^1.10.0",
|
||||||
|
"tunnel": "0.0.6",
|
||||||
|
"uuid": "^8.3.2",
|
||||||
|
"xml2js": "^0.5.0"
|
||||||
|
},
|
||||||
|
"dependencies": {
|
||||||
|
"tslib": {
|
||||||
|
"version": "1.14.1",
|
||||||
|
"resolved": "https://registry.npmjs.org/tslib/-/tslib-1.14.1.tgz",
|
||||||
|
"integrity": "sha512-Xni35NKzjgMrwevysHTCArtLDpPvye8zV/0E4EyYn43P7/7qvQwPh9BGkHewbMulVntbigmcT7rdX3BNo9wRJg=="
|
||||||
|
},
|
||||||
|
"uuid": {
|
||||||
|
"version": "8.3.2",
|
||||||
|
"resolved": "https://registry.npmjs.org/uuid/-/uuid-8.3.2.tgz",
|
||||||
|
"integrity": "sha512-+NYs2QeMWy+GWFOEm9xnn6HCDp0l7QBD7ml8zLUmJ+93Q5NF0NocErnwkTkXVFNiX3/fpC6afS8Dhb/gz7R7eg=="
|
||||||
|
}
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"@azure/storage-blob": {
|
||||||
|
"version": "12.15.0",
|
||||||
|
"resolved": "https://registry.npmjs.org/@azure/storage-blob/-/storage-blob-12.15.0.tgz",
|
||||||
|
"integrity": "sha512-e7JBKLOFi0QVJqqLzrjx1eL3je3/Ug2IQj24cTM9b85CsnnFjLGeGjJVIjbGGZaytewiCEG7r3lRwQX7fKj0/w==",
|
||||||
|
"requires": {
|
||||||
|
"@azure/abort-controller": "^1.0.0",
|
||||||
|
"@azure/core-http": "^3.0.0",
|
||||||
|
"@azure/core-lro": "^2.2.0",
|
||||||
|
"@azure/core-paging": "^1.1.1",
|
||||||
|
"@azure/core-tracing": "1.0.0-preview.13",
|
||||||
|
"@azure/logger": "^1.0.0",
|
||||||
|
"events": "^3.0.0",
|
||||||
|
"tslib": "^2.2.0"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"@opentelemetry/api": {
|
||||||
|
"version": "1.4.1",
|
||||||
|
"resolved": "https://registry.npmjs.org/@opentelemetry/api/-/api-1.4.1.tgz",
|
||||||
|
"integrity": "sha512-O2yRJce1GOc6PAy3QxFM4NzFiWzvScDC1/5ihYBL6BUEVdq0XMWN01sppE+H6bBXbaFYipjwFLEWLg5PaSOThA=="
|
||||||
|
},
|
||||||
|
"@types/node": {
|
||||||
|
"version": "20.4.6",
|
||||||
|
"resolved": "https://registry.npmjs.org/@types/node/-/node-20.4.6.tgz",
|
||||||
|
"integrity": "sha512-q0RkvNgMweWWIvSMDiXhflGUKMdIxBo2M2tYM/0kEGDueQByFzK4KZAgu5YHGFNxziTlppNpTIBcqHQAxlfHdA=="
|
||||||
|
},
|
||||||
|
"@types/node-fetch": {
|
||||||
|
"version": "2.6.4",
|
||||||
|
"resolved": "https://registry.npmjs.org/@types/node-fetch/-/node-fetch-2.6.4.tgz",
|
||||||
|
"integrity": "sha512-1ZX9fcN4Rvkvgv4E6PAY5WXUFWFcRWxZa3EW83UjycOB9ljJCedb2CupIP4RZMEwF/M3eTcCihbBRgwtGbg5Rg==",
|
||||||
|
"requires": {
|
||||||
|
"@types/node": "*",
|
||||||
|
"form-data": "^3.0.0"
|
||||||
|
},
|
||||||
|
"dependencies": {
|
||||||
|
"form-data": {
|
||||||
|
"version": "3.0.1",
|
||||||
|
"resolved": "https://registry.npmjs.org/form-data/-/form-data-3.0.1.tgz",
|
||||||
|
"integrity": "sha512-RHkBKtLWUVwd7SqRIvCZMEvAMoGUp0XU+seQiZejj0COz3RI3hWP4sCv3gZWWLjJTd7rGwcsF5eKZGii0r/hbg==",
|
||||||
|
"requires": {
|
||||||
|
"asynckit": "^0.4.0",
|
||||||
|
"combined-stream": "^1.0.8",
|
||||||
|
"mime-types": "^2.1.12"
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"@types/semver": {
|
||||||
|
"version": "6.2.3",
|
||||||
|
"resolved": "https://registry.npmjs.org/@types/semver/-/semver-6.2.3.tgz",
|
||||||
|
"integrity": "sha512-KQf+QAMWKMrtBMsB8/24w53tEsxllMj6TuA80TT/5igJalLI/zm0L3oXRbIAl4Ohfc85gyHX/jhMwsVkmhLU4A==",
|
||||||
|
"dev": true
|
||||||
|
},
|
||||||
|
"@types/tunnel": {
|
||||||
|
"version": "0.0.3",
|
||||||
|
"resolved": "https://registry.npmjs.org/@types/tunnel/-/tunnel-0.0.3.tgz",
|
||||||
|
"integrity": "sha512-sOUTGn6h1SfQ+gbgqC364jLFBw2lnFqkgF3q0WovEHRLMrVD1sd5aufqi/aJObLekJO+Aq5z646U4Oxy6shXMA==",
|
||||||
|
"requires": {
|
||||||
|
"@types/node": "*"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"@types/uuid": {
|
||||||
|
"version": "3.4.10",
|
||||||
|
"resolved": "https://registry.npmjs.org/@types/uuid/-/uuid-3.4.10.tgz",
|
||||||
|
"integrity": "sha512-BgeaZuElf7DEYZhWYDTc/XcLZXdVgFkVSTa13BqKvbnmUrxr3TJFKofUxCtDO9UQOdhnV+HPOESdHiHKZOJV1A==",
|
||||||
|
"dev": true
|
||||||
|
},
|
||||||
|
"abort-controller": {
|
||||||
|
"version": "3.0.0",
|
||||||
|
"resolved": "https://registry.npmjs.org/abort-controller/-/abort-controller-3.0.0.tgz",
|
||||||
|
"integrity": "sha512-h8lQ8tacZYnR3vNQTgibj+tODHI5/+l06Au2Pcriv/Gmet0eaj4TwWH41sO9wnHDiQsEj19q0drzdWdeAHtweg==",
|
||||||
|
"requires": {
|
||||||
|
"event-target-shim": "^5.0.0"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"asynckit": {
|
||||||
|
"version": "0.4.0",
|
||||||
|
"resolved": "https://registry.npmjs.org/asynckit/-/asynckit-0.4.0.tgz",
|
||||||
|
"integrity": "sha512-Oei9OH4tRh0YqU3GxhX79dM/mwVgvbZJaSNaRk+bshkj0S5cfHcgYakreBjrHwatXKbz+IoIdYLxrKim2MjW0Q=="
|
||||||
|
},
|
||||||
|
"balanced-match": {
|
||||||
|
"version": "1.0.2",
|
||||||
|
"resolved": "https://registry.npmjs.org/balanced-match/-/balanced-match-1.0.2.tgz",
|
||||||
|
"integrity": "sha512-3oSeUO0TMV67hN1AmbXsK4yaqU7tjiHlbxRDZOpH0KW9+CeX4bRAaX0Anxt0tx2MrpRpWwQaPwIlISEJhYU5Pw=="
|
||||||
|
},
|
||||||
|
"brace-expansion": {
|
||||||
|
"version": "1.1.11",
|
||||||
|
"resolved": "https://registry.npmjs.org/brace-expansion/-/brace-expansion-1.1.11.tgz",
|
||||||
|
"integrity": "sha512-iCuPHDFgrHX7H2vEI/5xpz07zSHB00TpugqhmYtVmMO6518mCuRMoOYFldEBl0g187ufozdaHgWKcYFb61qGiA==",
|
||||||
|
"requires": {
|
||||||
|
"balanced-match": "^1.0.0",
|
||||||
|
"concat-map": "0.0.1"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"combined-stream": {
|
||||||
|
"version": "1.0.8",
|
||||||
|
"resolved": "https://registry.npmjs.org/combined-stream/-/combined-stream-1.0.8.tgz",
|
||||||
|
"integrity": "sha512-FQN4MRfuJeHf7cBbBMJFXhKSDq+2kAArBlmRBvcvFE5BB1HZKXtSFASDhdlz9zOYwxh8lDdnvmMOe/+5cdoEdg==",
|
||||||
|
"requires": {
|
||||||
|
"delayed-stream": "~1.0.0"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"concat-map": {
|
||||||
|
"version": "0.0.1",
|
||||||
|
"resolved": "https://registry.npmjs.org/concat-map/-/concat-map-0.0.1.tgz",
|
||||||
|
"integrity": "sha512-/Srv4dswyQNBfohGpz9o6Yb3Gz3SrUDqBH5rTuhGR7ahtlbYKnVxw2bCFMRljaA7EXHaXZ8wsHdodFvbkhKmqg=="
|
||||||
|
},
|
||||||
|
"delayed-stream": {
|
||||||
|
"version": "1.0.0",
|
||||||
|
"resolved": "https://registry.npmjs.org/delayed-stream/-/delayed-stream-1.0.0.tgz",
|
||||||
|
"integrity": "sha512-ZySD7Nf91aLB0RxL4KGrKHBXl7Eds1DAmEdcoVawXnLD7SDhpNgtuII2aAkg7a7QS41jxPSZ17p4VdGnMHk3MQ=="
|
||||||
|
},
|
||||||
|
"event-target-shim": {
|
||||||
|
"version": "5.0.1",
|
||||||
|
"resolved": "https://registry.npmjs.org/event-target-shim/-/event-target-shim-5.0.1.tgz",
|
||||||
|
"integrity": "sha512-i/2XbnSz/uxRCU6+NdVJgKWDTM427+MqYbkQzD321DuCQJUqOuJKIA0IM2+W2xtYHdKOmZ4dR6fExsd4SXL+WQ=="
|
||||||
|
},
|
||||||
|
"events": {
|
||||||
|
"version": "3.3.0",
|
||||||
|
"resolved": "https://registry.npmjs.org/events/-/events-3.3.0.tgz",
|
||||||
|
"integrity": "sha512-mQw+2fkQbALzQ7V0MY0IqdnXNOeTtP4r0lN9z7AAawCXgqea7bDii20AYrIBrFd/Hx0M2Ocz6S111CaFkUcb0Q=="
|
||||||
|
},
|
||||||
|
"form-data": {
|
||||||
|
"version": "2.5.1",
|
||||||
|
"resolved": "https://registry.npmjs.org/form-data/-/form-data-2.5.1.tgz",
|
||||||
|
"integrity": "sha512-m21N3WOmEEURgk6B9GLOE4RuWOFf28Lhh9qGYeNlGq4VDXUlJy2th2slBNU8Gp8EzloYZOibZJ7t5ecIrFSjVA==",
|
||||||
|
"requires": {
|
||||||
|
"asynckit": "^0.4.0",
|
||||||
|
"combined-stream": "^1.0.6",
|
||||||
|
"mime-types": "^2.1.12"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"mime-db": {
|
||||||
|
"version": "1.52.0",
|
||||||
|
"resolved": "https://registry.npmjs.org/mime-db/-/mime-db-1.52.0.tgz",
|
||||||
|
"integrity": "sha512-sPU4uV7dYlvtWJxwwxHD0PuihVNiE7TyAbQ5SWxDCB9mUYvOgroQOwYQQOKPJ8CIbE+1ETVlOoK1UC2nU3gYvg=="
|
||||||
|
},
|
||||||
|
"mime-types": {
|
||||||
|
"version": "2.1.35",
|
||||||
|
"resolved": "https://registry.npmjs.org/mime-types/-/mime-types-2.1.35.tgz",
|
||||||
|
"integrity": "sha512-ZDY+bPm5zTTF+YpCrAU9nK0UgICYPT0QtT1NZWFv4s++TNkcgVaT0g6+4R2uI4MjQjzysHB1zxuWL50hzaeXiw==",
|
||||||
|
"requires": {
|
||||||
|
"mime-db": "1.52.0"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"minimatch": {
|
||||||
|
"version": "3.1.2",
|
||||||
|
"resolved": "https://registry.npmjs.org/minimatch/-/minimatch-3.1.2.tgz",
|
||||||
|
"integrity": "sha512-J7p63hRiAjw1NDEww1W7i37+ByIrOWO5XQQAzZ3VOcL0PNybwpfmV/N05zFAzwQ9USyEcX6t3UO+K5aqBQOIHw==",
|
||||||
|
"requires": {
|
||||||
|
"brace-expansion": "^1.1.7"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node-fetch": {
|
||||||
|
"version": "2.6.12",
|
||||||
|
"resolved": "https://registry.npmjs.org/node-fetch/-/node-fetch-2.6.12.tgz",
|
||||||
|
"integrity": "sha512-C/fGU2E8ToujUivIO0H+tpQ6HWo4eEmchoPIoXtxCrVghxdKq+QOHqEZW7tuP3KlV3bC8FRMO5nMCC7Zm1VP6g==",
|
||||||
|
"requires": {
|
||||||
|
"whatwg-url": "^5.0.0"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"process": {
|
||||||
|
"version": "0.11.10",
|
||||||
|
"resolved": "https://registry.npmjs.org/process/-/process-0.11.10.tgz",
|
||||||
|
"integrity": "sha512-cdGef/drWFoydD1JsMzuFf8100nZl+GT+yacc2bEced5f9Rjk4z+WtFUTBu9PhOi9j/jfmBPu0mMEY4wIdAF8A=="
|
||||||
|
},
|
||||||
|
"sax": {
|
||||||
|
"version": "1.2.4",
|
||||||
|
"resolved": "https://registry.npmjs.org/sax/-/sax-1.2.4.tgz",
|
||||||
|
"integrity": "sha512-NqVDv9TpANUjFm0N8uM5GxL36UgKi9/atZw+x7YFnQ8ckwFGKrl4xX4yWtrey3UJm5nP1kUbnYgLopqWNSRhWw=="
|
||||||
|
},
|
||||||
|
"semver": {
|
||||||
|
"version": "6.3.1",
|
||||||
|
"resolved": "https://registry.npmjs.org/semver/-/semver-6.3.1.tgz",
|
||||||
|
"integrity": "sha512-BR7VvDCVHO+q2xBEWskxS6DJE1qRnb7DxzUrogb71CWoSficBxYsiAGd+Kl0mmq/MprG9yArRkyrQxTO6XjMzA=="
|
||||||
|
},
|
||||||
|
"tr46": {
|
||||||
|
"version": "0.0.3",
|
||||||
|
"resolved": "https://registry.npmjs.org/tr46/-/tr46-0.0.3.tgz",
|
||||||
|
"integrity": "sha512-N3WMsuqV66lT30CrXNbEjx4GEwlow3v6rr4mCcv6prnfwhS01rkgyFdjPNBYd9br7LpXV1+Emh01fHnq2Gdgrw=="
|
||||||
|
},
|
||||||
|
"tslib": {
|
||||||
|
"version": "2.6.1",
|
||||||
|
"resolved": "https://registry.npmjs.org/tslib/-/tslib-2.6.1.tgz",
|
||||||
|
"integrity": "sha512-t0hLfiEKfMUoqhG+U1oid7Pva4bbDPHYfJNiB7BiIjRkj1pyC++4N3huJfqY6aRH6VTB0rvtzQwjM4K6qpfOig=="
|
||||||
|
},
|
||||||
|
"tunnel": {
|
||||||
|
"version": "0.0.6",
|
||||||
|
"resolved": "https://registry.npmjs.org/tunnel/-/tunnel-0.0.6.tgz",
|
||||||
|
"integrity": "sha512-1h/Lnq9yajKY2PEbBadPXj3VxsDDu844OnaAo52UVmIzIvwwtBPIuNvkjuzBlTWpfJyUbG3ez0KSBibQkj4ojg=="
|
||||||
|
},
|
||||||
|
"typescript": {
|
||||||
|
"version": "5.2.2",
|
||||||
|
"resolved": "https://registry.npmjs.org/typescript/-/typescript-5.2.2.tgz",
|
||||||
|
"integrity": "sha512-mI4WrpHsbCIcwT9cF4FZvr80QUeKvsUsUvKDoR+X/7XHQH98xYD8YHZg7ANtz2GtZt/CBq2QJ0thkGJMHfqc1w==",
|
||||||
|
"dev": true
|
||||||
|
},
|
||||||
|
"uuid": {
|
||||||
|
"version": "3.4.0",
|
||||||
|
"resolved": "https://registry.npmjs.org/uuid/-/uuid-3.4.0.tgz",
|
||||||
|
"integrity": "sha512-HjSDRw6gZE5JMggctHBcjVak08+KEVhSIiDzFnT9S9aegmp85S/bReBVTb4QTFaRNptJ9kuYaNhnbNEOkbKb/A=="
|
||||||
|
},
|
||||||
|
"webidl-conversions": {
|
||||||
|
"version": "3.0.1",
|
||||||
|
"resolved": "https://registry.npmjs.org/webidl-conversions/-/webidl-conversions-3.0.1.tgz",
|
||||||
|
"integrity": "sha512-2JAn3z8AR6rjK8Sm8orRC0h/bcl/DqL7tRPdGZ4I1CjdF+EaMLmYxBHyXuKL849eucPFhvBoxMsflfOb8kxaeQ=="
|
||||||
|
},
|
||||||
|
"whatwg-url": {
|
||||||
|
"version": "5.0.0",
|
||||||
|
"resolved": "https://registry.npmjs.org/whatwg-url/-/whatwg-url-5.0.0.tgz",
|
||||||
|
"integrity": "sha512-saE57nupxk6v3HY35+jzBwYa0rKSy0XR8JSxZPwgLr7ys0IBzhGviA1/TUGJLmSVqs8pb9AnvICXEuOHLprYTw==",
|
||||||
|
"requires": {
|
||||||
|
"tr46": "~0.0.3",
|
||||||
|
"webidl-conversions": "^3.0.0"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"xml2js": {
|
||||||
|
"version": "0.5.0",
|
||||||
|
"resolved": "https://registry.npmjs.org/xml2js/-/xml2js-0.5.0.tgz",
|
||||||
|
"integrity": "sha512-drPFnkQJik/O+uPKpqSgr22mpuFHqKdbS835iAQrUC73L2F5WkboIRd63ai/2Yg6I1jzifPFKH2NTK+cfglkIA==",
|
||||||
|
"requires": {
|
||||||
|
"sax": ">=0.6.0",
|
||||||
|
"xmlbuilder": "~11.0.0"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"xmlbuilder": {
|
||||||
|
"version": "11.0.1",
|
||||||
|
"resolved": "https://registry.npmjs.org/xmlbuilder/-/xmlbuilder-11.0.1.tgz",
|
||||||
|
"integrity": "sha512-fDlsI/kFEx7gLvbecc0/ohLG50fugQp8ryHzMTuW9vSa1GJ0XYWKnhsUx7oie3G98+r56aTQIUB4kht42R3JvA=="
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
|
@ -0,0 +1,57 @@
|
||||||
|
{
|
||||||
|
"name": "github-actions.warp-cache",
|
||||||
|
"version": "0.1.0",
|
||||||
|
"preview": true,
|
||||||
|
"description": "Github action to use WarpBuild's in-house cache offering",
|
||||||
|
"keywords": [
|
||||||
|
"github",
|
||||||
|
"actions",
|
||||||
|
"cache",
|
||||||
|
"warpbuild"
|
||||||
|
],
|
||||||
|
"homepage": "https://github.com/actions/toolkit/tree/main/packages/cache",
|
||||||
|
"license": "MIT",
|
||||||
|
"main": "lib/cache.js",
|
||||||
|
"types": "lib/cache.d.ts",
|
||||||
|
"directories": {
|
||||||
|
"lib": "lib",
|
||||||
|
"test": "__tests__"
|
||||||
|
},
|
||||||
|
"files": [
|
||||||
|
"lib",
|
||||||
|
"!.DS_Store"
|
||||||
|
],
|
||||||
|
"publishConfig": {
|
||||||
|
"access": "public"
|
||||||
|
},
|
||||||
|
"repository": {
|
||||||
|
"type": "git",
|
||||||
|
"url": "git+https://github.com/actions/toolkit.git",
|
||||||
|
"directory": "packages/cache"
|
||||||
|
},
|
||||||
|
"scripts": {
|
||||||
|
"audit-moderate": "npm install && npm audit --json --audit-level=moderate > audit.json",
|
||||||
|
"test": "echo \"Error: run tests from root\" && exit 1",
|
||||||
|
"tsc": "tsc"
|
||||||
|
},
|
||||||
|
"bugs": {
|
||||||
|
"url": "https://github.com/actions/toolkit/issues"
|
||||||
|
},
|
||||||
|
"dependencies": {
|
||||||
|
"@actions/core": "^1.10.0",
|
||||||
|
"@actions/exec": "^1.0.1",
|
||||||
|
"@actions/glob": "^0.1.0",
|
||||||
|
"@actions/http-client": "^2.1.1",
|
||||||
|
"@actions/io": "^1.0.1",
|
||||||
|
"@azure/abort-controller": "^1.1.0",
|
||||||
|
"@azure/ms-rest-js": "^2.6.0",
|
||||||
|
"@azure/storage-blob": "^12.13.0",
|
||||||
|
"semver": "^6.3.1",
|
||||||
|
"uuid": "^3.3.3"
|
||||||
|
},
|
||||||
|
"devDependencies": {
|
||||||
|
"@types/semver": "^6.0.0",
|
||||||
|
"@types/uuid": "^3.4.5",
|
||||||
|
"typescript": "^5.2.2"
|
||||||
|
}
|
||||||
|
}
|
|
@ -0,0 +1,260 @@
|
||||||
|
import * as core from '@actions/core'
|
||||||
|
import * as path from 'path'
|
||||||
|
import * as utils from './internal/cacheUtils'
|
||||||
|
import * as cacheHttpClient from './internal/cacheHttpClient'
|
||||||
|
import {createTar, extractTar, listTar} from './internal/tar'
|
||||||
|
import {DownloadOptions, UploadOptions} from './options'
|
||||||
|
|
||||||
|
export class ValidationError extends Error {
|
||||||
|
constructor(message: string) {
|
||||||
|
super(message)
|
||||||
|
this.name = 'ValidationError'
|
||||||
|
Object.setPrototypeOf(this, ValidationError.prototype)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
export class ReserveCacheError extends Error {
|
||||||
|
constructor(message: string) {
|
||||||
|
super(message)
|
||||||
|
this.name = 'ReserveCacheError'
|
||||||
|
Object.setPrototypeOf(this, ReserveCacheError.prototype)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
function checkPaths(paths: string[]): void {
|
||||||
|
if (!paths || paths.length === 0) {
|
||||||
|
throw new ValidationError(
|
||||||
|
`Path Validation Error: At least one directory or file path is required`
|
||||||
|
)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
function checkKey(key: string): void {
|
||||||
|
if (key.length > 512) {
|
||||||
|
throw new ValidationError(
|
||||||
|
`Key Validation Error: ${key} cannot be larger than 512 characters.`
|
||||||
|
)
|
||||||
|
}
|
||||||
|
const regex = /^[^,]*$/
|
||||||
|
if (!regex.test(key)) {
|
||||||
|
throw new ValidationError(
|
||||||
|
`Key Validation Error: ${key} cannot contain commas.`
|
||||||
|
)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* isFeatureAvailable checks for the presence of the Actions cache service
|
||||||
|
*
|
||||||
|
* @returns boolean returns true if the Actions cache service is available, otherwise false
|
||||||
|
*/
|
||||||
|
|
||||||
|
export function isFeatureAvailable(): boolean {
|
||||||
|
return !!process.env['ACTIONS_CACHE_URL']
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Restores cache from keys
|
||||||
|
*
|
||||||
|
* @param paths a list of file paths to restore from the cache
|
||||||
|
* @param primaryKey an explicit key for restoring the cache
|
||||||
|
* @param restoreKeys an optional ordered list of keys to use for restoring the cache if no cache hit occurred for key
|
||||||
|
* @param downloadOptions cache download options
|
||||||
|
* @param enableCrossOsArchive an optional boolean to enable restoring, on Windows, a cache created on any platform
|
||||||
|
* @returns string returns the key for the cache hit, otherwise returns undefined
|
||||||
|
*/
|
||||||
|
export async function restoreCache(
|
||||||
|
paths: string[],
|
||||||
|
primaryKey: string,
|
||||||
|
restoreKeys?: string[],
|
||||||
|
options?: DownloadOptions,
|
||||||
|
enableCrossOsArchive = false
|
||||||
|
): Promise<string | undefined> {
|
||||||
|
checkPaths(paths)
|
||||||
|
|
||||||
|
restoreKeys = restoreKeys ?? []
|
||||||
|
const keys = [primaryKey, ...restoreKeys]
|
||||||
|
|
||||||
|
core.debug('Resolved Keys:')
|
||||||
|
core.debug(JSON.stringify(keys))
|
||||||
|
|
||||||
|
if (keys.length > 10) {
|
||||||
|
throw new ValidationError(
|
||||||
|
`Key Validation Error: Keys are limited to a maximum of 10.`
|
||||||
|
)
|
||||||
|
}
|
||||||
|
for (const key of keys) {
|
||||||
|
checkKey(key)
|
||||||
|
}
|
||||||
|
|
||||||
|
const compressionMethod = await utils.getCompressionMethod()
|
||||||
|
let archivePath = ''
|
||||||
|
try {
|
||||||
|
// paths are needed to compute the version
|
||||||
|
const cacheEntry = await cacheHttpClient.getCacheEntry(keys, paths, {
|
||||||
|
compressionMethod,
|
||||||
|
enableCrossOsArchive
|
||||||
|
})
|
||||||
|
if (!cacheEntry?.archiveLocation) {
|
||||||
|
// Cache not found
|
||||||
|
return undefined
|
||||||
|
}
|
||||||
|
|
||||||
|
if (options?.lookupOnly) {
|
||||||
|
core.info('Lookup only - skipping download')
|
||||||
|
return cacheEntry.cacheKey
|
||||||
|
}
|
||||||
|
|
||||||
|
archivePath = path.join(
|
||||||
|
await utils.createTempDirectory(),
|
||||||
|
utils.getCacheFileName(compressionMethod)
|
||||||
|
)
|
||||||
|
core.debug(`Archive Path: ${archivePath}`)
|
||||||
|
|
||||||
|
// Download the cache from the cache entry
|
||||||
|
await cacheHttpClient.downloadCache(
|
||||||
|
cacheEntry.archiveLocation,
|
||||||
|
archivePath,
|
||||||
|
options
|
||||||
|
)
|
||||||
|
|
||||||
|
if (core.isDebug()) {
|
||||||
|
await listTar(archivePath, compressionMethod)
|
||||||
|
}
|
||||||
|
|
||||||
|
const archiveFileSize = utils.getArchiveFileSizeInBytes(archivePath)
|
||||||
|
core.info(
|
||||||
|
`Cache Size: ~${Math.round(
|
||||||
|
archiveFileSize / (1024 * 1024)
|
||||||
|
)} MB (${archiveFileSize} B)`
|
||||||
|
)
|
||||||
|
|
||||||
|
await extractTar(archivePath, compressionMethod)
|
||||||
|
core.info('Cache restored successfully')
|
||||||
|
|
||||||
|
return cacheEntry.cacheKey
|
||||||
|
} catch (error) {
|
||||||
|
const typedError = error as Error
|
||||||
|
if (typedError.name === ValidationError.name) {
|
||||||
|
throw error
|
||||||
|
} else {
|
||||||
|
// Suppress all non-validation cache-related errors because caching should be optional
|
||||||
|
core.warning(`Failed to restore: ${(error as Error).message}`)
|
||||||
|
}
|
||||||
|
} finally {
|
||||||
|
// Try to delete the archive to save space
|
||||||
|
try {
|
||||||
|
await utils.unlinkFile(archivePath)
|
||||||
|
} catch (error) {
|
||||||
|
core.debug(`Failed to delete archive: ${error}`)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
return undefined
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Saves a list of files with the specified key
|
||||||
|
*
|
||||||
|
* @param paths a list of file paths to be cached
|
||||||
|
* @param key an explicit key for restoring the cache
|
||||||
|
* @param enableCrossOsArchive an optional boolean to enable saving a cache on Windows that can be restored on any platform
|
||||||
|
* @param options cache upload options
|
||||||
|
* @returns number returns cacheId if the cache was saved successfully and throws an error if save fails
|
||||||
|
*/
|
||||||
|
export async function saveCache(
|
||||||
|
paths: string[],
|
||||||
|
key: string,
|
||||||
|
options?: UploadOptions,
|
||||||
|
enableCrossOsArchive = false
|
||||||
|
): Promise<number> {
|
||||||
|
checkPaths(paths)
|
||||||
|
checkKey(key)
|
||||||
|
|
||||||
|
const compressionMethod = await utils.getCompressionMethod()
|
||||||
|
let cacheId = -1
|
||||||
|
|
||||||
|
const cachePaths = await utils.resolvePaths(paths)
|
||||||
|
core.debug('Cache Paths:')
|
||||||
|
core.debug(`${JSON.stringify(cachePaths)}`)
|
||||||
|
|
||||||
|
if (cachePaths.length === 0) {
|
||||||
|
throw new Error(
|
||||||
|
`Path Validation Error: Path(s) specified in the action for caching do(es) not exist, hence no cache is being saved.`
|
||||||
|
)
|
||||||
|
}
|
||||||
|
|
||||||
|
const archiveFolder = await utils.createTempDirectory()
|
||||||
|
const archivePath = path.join(
|
||||||
|
archiveFolder,
|
||||||
|
utils.getCacheFileName(compressionMethod)
|
||||||
|
)
|
||||||
|
|
||||||
|
core.debug(`Archive Path: ${archivePath}`)
|
||||||
|
|
||||||
|
try {
|
||||||
|
await createTar(archiveFolder, cachePaths, compressionMethod)
|
||||||
|
if (core.isDebug()) {
|
||||||
|
await listTar(archivePath, compressionMethod)
|
||||||
|
}
|
||||||
|
const fileSizeLimit = 10 * 1024 * 1024 * 1024 // 10GB per repo limit
|
||||||
|
const archiveFileSize = utils.getArchiveFileSizeInBytes(archivePath)
|
||||||
|
core.debug(`File Size: ${archiveFileSize}`)
|
||||||
|
|
||||||
|
// For GHES, this check will take place in ReserveCache API with enterprise file size limit
|
||||||
|
if (archiveFileSize > fileSizeLimit && !utils.isGhes()) {
|
||||||
|
throw new Error(
|
||||||
|
`Cache size of ~${Math.round(
|
||||||
|
archiveFileSize / (1024 * 1024)
|
||||||
|
)} MB (${archiveFileSize} B) is over the 10GB limit, not saving cache.`
|
||||||
|
)
|
||||||
|
}
|
||||||
|
|
||||||
|
core.debug('Reserving Cache')
|
||||||
|
const reserveCacheResponse = await cacheHttpClient.reserveCache(
|
||||||
|
key,
|
||||||
|
paths,
|
||||||
|
{
|
||||||
|
compressionMethod,
|
||||||
|
enableCrossOsArchive,
|
||||||
|
cacheSize: archiveFileSize
|
||||||
|
}
|
||||||
|
)
|
||||||
|
|
||||||
|
if (reserveCacheResponse?.result?.cacheId) {
|
||||||
|
cacheId = reserveCacheResponse?.result?.cacheId
|
||||||
|
} else if (reserveCacheResponse?.statusCode === 400) {
|
||||||
|
throw new Error(
|
||||||
|
reserveCacheResponse?.error?.message ??
|
||||||
|
`Cache size of ~${Math.round(
|
||||||
|
archiveFileSize / (1024 * 1024)
|
||||||
|
)} MB (${archiveFileSize} B) is over the data cap limit, not saving cache.`
|
||||||
|
)
|
||||||
|
} else {
|
||||||
|
throw new ReserveCacheError(
|
||||||
|
`Unable to reserve cache with key ${key}, another job may be creating this cache. More details: ${reserveCacheResponse?.error?.message}`
|
||||||
|
)
|
||||||
|
}
|
||||||
|
|
||||||
|
core.debug(`Saving Cache (ID: ${cacheId})`)
|
||||||
|
await cacheHttpClient.saveCache(cacheId, archivePath, options)
|
||||||
|
} catch (error) {
|
||||||
|
const typedError = error as Error
|
||||||
|
if (typedError.name === ValidationError.name) {
|
||||||
|
throw error
|
||||||
|
} else if (typedError.name === ReserveCacheError.name) {
|
||||||
|
core.info(`Failed to save: ${typedError.message}`)
|
||||||
|
} else {
|
||||||
|
core.warning(`Failed to save: ${typedError.message}`)
|
||||||
|
}
|
||||||
|
} finally {
|
||||||
|
// Try to delete the archive to save space
|
||||||
|
try {
|
||||||
|
await utils.unlinkFile(archivePath)
|
||||||
|
} catch (error) {
|
||||||
|
core.debug(`Failed to delete archive: ${error}`)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
return cacheId
|
||||||
|
}
|
|
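A minimal standalone sketch of the key and path constraints enforced above (`checkPaths`, `checkKey`, and the 10-key cap in `restoreCache`); the helper and sample keys below are illustrative, not part of the package:

```ts
// Illustrative restatement of the validation rules in cache.ts above.
function isValidKey(key: string): boolean {
  // Keys are limited to 512 characters and may not contain commas,
  // since the restore request joins all keys with ',' as the delimiter.
  return key.length <= 512 && !key.includes(',')
}

const primaryKey = 'linux-build-1a2b3c'
const restoreKeys = ['linux-build-', 'linux-']
const keys = [primaryKey, ...restoreKeys]

// restoreCache additionally requires at least one path and at most 10 keys in total.
console.log(keys.length <= 10 && keys.every(isValidKey)) // true
```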
@ -0,0 +1,376 @@
|
||||||
|
import * as core from '@actions/core'
|
||||||
|
import {HttpClient} from '@actions/http-client'
|
||||||
|
import {BearerCredentialHandler} from '@actions/http-client/lib/auth'
|
||||||
|
import {
|
||||||
|
RequestOptions,
|
||||||
|
TypedResponse
|
||||||
|
} from '@actions/http-client/lib/interfaces'
|
||||||
|
import * as crypto from 'crypto'
|
||||||
|
import * as fs from 'fs'
|
||||||
|
import {URL} from 'url'
|
||||||
|
|
||||||
|
import * as utils from './cacheUtils'
|
||||||
|
import {CompressionMethod} from './constants'
|
||||||
|
import {
|
||||||
|
ArtifactCacheEntry,
|
||||||
|
InternalCacheOptions,
|
||||||
|
CommitCacheRequest,
|
||||||
|
ReserveCacheRequest,
|
||||||
|
ReserveCacheResponse,
|
||||||
|
ITypedResponseWithError,
|
||||||
|
ArtifactCacheList
|
||||||
|
} from './contracts'
|
||||||
|
import {
|
||||||
|
downloadCacheHttpClient,
|
||||||
|
downloadCacheHttpClientConcurrent,
|
||||||
|
downloadCacheStorageSDK
|
||||||
|
} from './downloadUtils'
|
||||||
|
import {
|
||||||
|
DownloadOptions,
|
||||||
|
UploadOptions,
|
||||||
|
getDownloadOptions,
|
||||||
|
getUploadOptions
|
||||||
|
} from '../options'
|
||||||
|
import {
|
||||||
|
isSuccessStatusCode,
|
||||||
|
retryHttpClientResponse,
|
||||||
|
retryTypedResponse
|
||||||
|
} from './requestUtils'
|
||||||
|
|
||||||
|
const versionSalt = '1.0'
|
||||||
|
|
||||||
|
function getCacheApiUrl(resource: string): string {
|
||||||
|
const baseUrl: string = process.env['ACTIONS_CACHE_URL'] ?? 'localhost:8000'
|
||||||
|
if (!baseUrl) {
|
||||||
|
throw new Error('Cache service URL not found, unable to restore cache.')
|
||||||
|
}
|
||||||
|
|
||||||
|
const url = `${baseUrl}/v1/cache/${resource}`
|
||||||
|
core.debug(`Resource Url: ${url}`)
|
||||||
|
return url
|
||||||
|
}
|
||||||
|
|
||||||
|
function createAcceptHeader(type: string, apiVersion: string): string {
|
||||||
|
return `${type};api-version=${apiVersion}`
|
||||||
|
}
|
||||||
|
|
||||||
|
function getRequestOptions(): RequestOptions {
|
||||||
|
const requestOptions: RequestOptions = {
|
||||||
|
headers: {
|
||||||
|
Accept: createAcceptHeader('application/json', 'v1')
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
return requestOptions
|
||||||
|
}
|
||||||
|
|
||||||
|
function createHttpClient(): HttpClient {
|
||||||
|
const token = process.env['WARP_ACTION_TOKEN'] ?? ''
|
||||||
|
const bearerCredentialHandler = new BearerCredentialHandler(token)
|
||||||
|
|
||||||
|
return new HttpClient(
|
||||||
|
'actions/cache',
|
||||||
|
[bearerCredentialHandler],
|
||||||
|
getRequestOptions()
|
||||||
|
)
|
||||||
|
}
|
||||||
|
|
||||||
|
export function getCacheVersion(
|
||||||
|
paths: string[],
|
||||||
|
compressionMethod?: CompressionMethod,
|
||||||
|
enableCrossOsArchive = false
|
||||||
|
): string {
|
||||||
|
const components = paths
|
||||||
|
|
||||||
|
// Add compression method to cache version to restore
|
||||||
|
// compressed cache as per compression method
|
||||||
|
if (compressionMethod) {
|
||||||
|
components.push(compressionMethod)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Only mark the version as Windows-only if enableCrossOsArchive is false
|
||||||
|
if (process.platform === 'win32' && !enableCrossOsArchive) {
|
||||||
|
components.push('windows-only')
|
||||||
|
}
|
||||||
|
|
||||||
|
// Add salt to cache version to support breaking changes in cache entry
|
||||||
|
components.push(versionSalt)
|
||||||
|
|
||||||
|
return crypto.createHash('sha256').update(components.join('|')).digest('hex')
|
||||||
|
}
|
||||||
|
|
||||||
|
export async function getCacheEntry(
|
||||||
|
keys: string[],
|
||||||
|
paths: string[],
|
||||||
|
options?: InternalCacheOptions
|
||||||
|
): Promise<ArtifactCacheEntry | null> {
|
||||||
|
const httpClient = createHttpClient()
|
||||||
|
const version = getCacheVersion(
|
||||||
|
paths,
|
||||||
|
options?.compressionMethod,
|
||||||
|
options?.enableCrossOsArchive
|
||||||
|
)
|
||||||
|
const resource = `cache?keys=${encodeURIComponent(
|
||||||
|
keys.join(',')
|
||||||
|
)}&version=${version}`
|
||||||
|
|
||||||
|
const response = await retryTypedResponse('getCacheEntry', async () =>
|
||||||
|
httpClient.getJson<ArtifactCacheEntry>(getCacheApiUrl(resource))
|
||||||
|
)
|
||||||
|
// Cache not found
|
||||||
|
if (response.statusCode === 204) {
|
||||||
|
// List cache for primary key only if cache miss occurs
|
||||||
|
if (core.isDebug()) {
|
||||||
|
await printCachesListForDiagnostics(keys[0], httpClient, version)
|
||||||
|
}
|
||||||
|
return null
|
||||||
|
}
|
||||||
|
if (!isSuccessStatusCode(response.statusCode)) {
|
||||||
|
throw new Error(`Cache service responded with ${response.statusCode}`)
|
||||||
|
}
|
||||||
|
|
||||||
|
const cacheResult = response.result
|
||||||
|
const cacheDownloadUrl = cacheResult?.archiveLocation
|
||||||
|
if (!cacheDownloadUrl) {
|
||||||
|
// Cache archiveLocation not found. This should never happen, so bail out.
|
||||||
|
throw new Error('Cache not found.')
|
||||||
|
}
|
||||||
|
core.setSecret(cacheDownloadUrl)
|
||||||
|
core.debug(`Cache Result:`)
|
||||||
|
core.debug(JSON.stringify(cacheResult))
|
||||||
|
|
||||||
|
return cacheResult
|
||||||
|
}
|
||||||
|
|
||||||
|
async function printCachesListForDiagnostics(
|
||||||
|
key: string,
|
||||||
|
httpClient: HttpClient,
|
||||||
|
version: string
|
||||||
|
): Promise<void> {
|
||||||
|
const resource = `caches?key=${encodeURIComponent(key)}`
|
||||||
|
const response = await retryTypedResponse('listCache', async () =>
|
||||||
|
httpClient.getJson<ArtifactCacheList>(getCacheApiUrl(resource))
|
||||||
|
)
|
||||||
|
if (response.statusCode === 200) {
|
||||||
|
const cacheListResult = response.result
|
||||||
|
const totalCount = cacheListResult?.totalCount
|
||||||
|
if (totalCount && totalCount > 0) {
|
||||||
|
core.debug(
|
||||||
|
`No matching cache found for cache key '${key}', version '${version}' and scope '${process.env['GITHUB_REF']}'. One or more caches with a similar key exist, but they have a different version or scope. See more info on cache matching here: https://docs.github.com/en/actions/using-workflows/caching-dependencies-to-speed-up-workflows#matching-a-cache-key \nOther caches with similar key:`
|
||||||
|
)
|
||||||
|
for (const cacheEntry of cacheListResult?.artifactCaches || []) {
|
||||||
|
core.debug(
|
||||||
|
`Cache Key: ${cacheEntry?.cacheKey}, Cache Version: ${cacheEntry?.cacheVersion}, Cache Scope: ${cacheEntry?.scope}, Cache Created: ${cacheEntry?.creationTime}`
|
||||||
|
)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
export async function downloadCache(
|
||||||
|
archiveLocation: string,
|
||||||
|
archivePath: string,
|
||||||
|
options?: DownloadOptions
|
||||||
|
): Promise<void> {
|
||||||
|
const archiveUrl = new URL(archiveLocation)
|
||||||
|
const downloadOptions = getDownloadOptions(options)
|
||||||
|
|
||||||
|
if (archiveUrl.hostname.endsWith('.blob.core.windows.net')) {
|
||||||
|
if (downloadOptions.useAzureSdk) {
|
||||||
|
// Use Azure storage SDK to download caches hosted on Azure to improve speed and reliability.
|
||||||
|
await downloadCacheStorageSDK(
|
||||||
|
archiveLocation,
|
||||||
|
archivePath,
|
||||||
|
downloadOptions
|
||||||
|
)
|
||||||
|
} else if (downloadOptions.concurrentBlobDownloads) {
|
||||||
|
// Use concurrent implementation with HttpClient to work around blob SDK issue
|
||||||
|
await downloadCacheHttpClientConcurrent(
|
||||||
|
archiveLocation,
|
||||||
|
archivePath,
|
||||||
|
downloadOptions
|
||||||
|
)
|
||||||
|
} else {
|
||||||
|
// Otherwise, download using the Actions http-client.
|
||||||
|
await downloadCacheHttpClient(archiveLocation, archivePath)
|
||||||
|
}
|
||||||
|
} else {
|
||||||
|
await downloadCacheHttpClient(archiveLocation, archivePath)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// Reserve Cache
|
||||||
|
export async function reserveCache(
|
||||||
|
key: string,
|
||||||
|
paths: string[],
|
||||||
|
options?: InternalCacheOptions
|
||||||
|
): Promise<ITypedResponseWithError<ReserveCacheResponse>> {
|
||||||
|
const httpClient = createHttpClient()
|
||||||
|
const version = getCacheVersion(
|
||||||
|
paths,
|
||||||
|
options?.compressionMethod,
|
||||||
|
options?.enableCrossOsArchive
|
||||||
|
)
|
||||||
|
|
||||||
|
const reserveCacheRequest: ReserveCacheRequest = {
|
||||||
|
key,
|
||||||
|
version,
|
||||||
|
cacheSize: options?.cacheSize
|
||||||
|
}
|
||||||
|
const response = await retryTypedResponse('reserveCache', async () =>
|
||||||
|
httpClient.postJson<ReserveCacheResponse>(
|
||||||
|
getCacheApiUrl('caches'),
|
||||||
|
reserveCacheRequest
|
||||||
|
)
|
||||||
|
)
|
||||||
|
return response
|
||||||
|
}
|
||||||
|
|
||||||
|
function getContentRange(start: number, end: number): string {
|
||||||
|
// Format: `bytes start-end/filesize`
|
||||||
|
// start and end are inclusive
|
||||||
|
// filesize can be *
|
||||||
|
// For a 200 byte chunk starting at byte 0:
|
||||||
|
// Content-Range: bytes 0-199/*
|
||||||
|
return `bytes ${start}-${end}/*`
|
||||||
|
}
|
||||||
|
|
||||||
|
async function uploadChunk(
|
||||||
|
httpClient: HttpClient,
|
||||||
|
resourceUrl: string,
|
||||||
|
openStream: () => NodeJS.ReadableStream,
|
||||||
|
start: number,
|
||||||
|
end: number
|
||||||
|
): Promise<void> {
|
||||||
|
core.debug(
|
||||||
|
`Uploading chunk of size ${
|
||||||
|
end - start + 1
|
||||||
|
} bytes at offset ${start} with content range: ${getContentRange(
|
||||||
|
start,
|
||||||
|
end
|
||||||
|
)}`
|
||||||
|
)
|
||||||
|
const additionalHeaders = {
|
||||||
|
'Content-Type': 'application/octet-stream',
|
||||||
|
'Content-Range': getContentRange(start, end)
|
||||||
|
}
|
||||||
|
|
||||||
|
const uploadChunkResponse = await retryHttpClientResponse(
|
||||||
|
`uploadChunk (start: ${start}, end: ${end})`,
|
||||||
|
async () =>
|
||||||
|
httpClient.sendStream(
|
||||||
|
'PATCH',
|
||||||
|
resourceUrl,
|
||||||
|
openStream(),
|
||||||
|
additionalHeaders
|
||||||
|
)
|
||||||
|
)
|
||||||
|
|
||||||
|
if (!isSuccessStatusCode(uploadChunkResponse.message.statusCode)) {
|
||||||
|
throw new Error(
|
||||||
|
`Cache service responded with ${uploadChunkResponse.message.statusCode} during upload chunk.`
|
||||||
|
)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
async function uploadFile(
|
||||||
|
httpClient: HttpClient,
|
||||||
|
cacheId: number,
|
||||||
|
archivePath: string,
|
||||||
|
options?: UploadOptions
|
||||||
|
): Promise<void> {
|
||||||
|
// Upload Chunks
|
||||||
|
const fileSize = utils.getArchiveFileSizeInBytes(archivePath)
|
||||||
|
const resourceUrl = getCacheApiUrl(`caches/${cacheId.toString()}`)
|
||||||
|
const fd = fs.openSync(archivePath, 'r')
|
||||||
|
const uploadOptions = getUploadOptions(options)
|
||||||
|
|
||||||
|
const concurrency = utils.assertDefined(
|
||||||
|
'uploadConcurrency',
|
||||||
|
uploadOptions.uploadConcurrency
|
||||||
|
)
|
||||||
|
const maxChunkSize = utils.assertDefined(
|
||||||
|
'uploadChunkSize',
|
||||||
|
uploadOptions.uploadChunkSize
|
||||||
|
)
|
||||||
|
|
||||||
|
const parallelUploads = [...new Array(concurrency).keys()]
|
||||||
|
core.debug('Awaiting all uploads')
|
||||||
|
let offset = 0
|
||||||
|
|
||||||
|
try {
|
||||||
|
await Promise.all(
|
||||||
|
parallelUploads.map(async () => {
|
||||||
|
while (offset < fileSize) {
|
||||||
|
const chunkSize = Math.min(fileSize - offset, maxChunkSize)
|
||||||
|
const start = offset
|
||||||
|
const end = offset + chunkSize - 1
|
||||||
|
offset += maxChunkSize
|
||||||
|
|
||||||
|
await uploadChunk(
|
||||||
|
httpClient,
|
||||||
|
resourceUrl,
|
||||||
|
() =>
|
||||||
|
fs
|
||||||
|
.createReadStream(archivePath, {
|
||||||
|
fd,
|
||||||
|
start,
|
||||||
|
end,
|
||||||
|
autoClose: false
|
||||||
|
})
|
||||||
|
.on('error', error => {
|
||||||
|
throw new Error(
|
||||||
|
`Cache upload failed because file read failed with ${error.message}`
|
||||||
|
)
|
||||||
|
}),
|
||||||
|
start,
|
||||||
|
end
|
||||||
|
)
|
||||||
|
}
|
||||||
|
})
|
||||||
|
)
|
||||||
|
} finally {
|
||||||
|
fs.closeSync(fd)
|
||||||
|
}
|
||||||
|
}
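// Worked example (illustrative): with a 10 MiB archive (10485760 bytes) and a
// 4 MiB uploadChunkSize, the loop above issues PATCH requests with the inclusive ranges
//   Content-Range: bytes 0-4194303/*
//   Content-Range: bytes 4194304-8388607/*
//   Content-Range: bytes 8388608-10485759/*
// where start = offset and end = offset + min(maxChunkSize, fileSize - offset) - 1,
// shared across `uploadConcurrency` workers that advance the same offset counter.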
|
||||||
|
|
||||||
|
async function commitCache(
|
||||||
|
httpClient: HttpClient,
|
||||||
|
cacheId: number,
|
||||||
|
filesize: number
|
||||||
|
): Promise<TypedResponse<null>> {
|
||||||
|
const commitCacheRequest: CommitCacheRequest = {size: filesize}
|
||||||
|
return await retryTypedResponse('commitCache', async () =>
|
||||||
|
httpClient.postJson<null>(
|
||||||
|
getCacheApiUrl(`caches/${cacheId.toString()}`),
|
||||||
|
commitCacheRequest
|
||||||
|
)
|
||||||
|
)
|
||||||
|
}
|
||||||
|
|
||||||
|
export async function saveCache(
|
||||||
|
cacheId: number,
|
||||||
|
archivePath: string,
|
||||||
|
options?: UploadOptions
|
||||||
|
): Promise<void> {
|
||||||
|
const httpClient = createHttpClient()
|
||||||
|
|
||||||
|
core.debug('Upload cache')
|
||||||
|
await uploadFile(httpClient, cacheId, archivePath, options)
|
||||||
|
|
||||||
|
// Commit Cache
|
||||||
|
core.debug('Committing cache')
|
||||||
|
const cacheSize = utils.getArchiveFileSizeInBytes(archivePath)
|
||||||
|
core.info(
|
||||||
|
`Cache Size: ~${Math.round(cacheSize / (1024 * 1024))} MB (${cacheSize} B)`
|
||||||
|
)
|
||||||
|
|
||||||
|
const commitCacheResponse = await commitCache(httpClient, cacheId, cacheSize)
|
||||||
|
if (!isSuccessStatusCode(commitCacheResponse.statusCode)) {
|
||||||
|
throw new Error(
|
||||||
|
`Cache service responded with ${commitCacheResponse.statusCode} during commit cache.`
|
||||||
|
)
|
||||||
|
}
|
||||||
|
|
||||||
|
core.info('Cache saved successfully')
|
||||||
|
}
|
|
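A minimal sketch of the version derivation performed by `getCacheVersion` above: the cached paths, the compression method, an optional Windows-only marker, and the version salt are joined with `'|'` and hashed with SHA-256. The literal values below are examples only:

```ts
import * as crypto from 'crypto'

// Components in the same order getCacheVersion assembles them.
const components = [
  'node_modules',        // one entry per cached path
  'zstd-without-long',   // compression method, when known
  // 'windows-only' is appended on win32 when enableCrossOsArchive is false
  '1.0'                  // versionSalt
]

const version = crypto
  .createHash('sha256')
  .update(components.join('|'))
  .digest('hex')

console.log(version) // 64-character hex string sent as &version=... on cache lookups
```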
@ -0,0 +1,139 @@
|
||||||
|
import * as core from '@actions/core'
|
||||||
|
import * as exec from '@actions/exec'
|
||||||
|
import * as glob from '@actions/glob'
|
||||||
|
import * as io from '@actions/io'
|
||||||
|
import * as fs from 'fs'
|
||||||
|
import * as path from 'path'
|
||||||
|
import * as semver from 'semver'
|
||||||
|
import * as util from 'util'
|
||||||
|
import {v4 as uuidV4} from 'uuid'
|
||||||
|
import {
|
||||||
|
CacheFilename,
|
||||||
|
CompressionMethod,
|
||||||
|
GnuTarPathOnWindows
|
||||||
|
} from './constants'
|
||||||
|
|
||||||
|
// From https://github.com/actions/toolkit/blob/main/packages/tool-cache/src/tool-cache.ts#L23
|
||||||
|
export async function createTempDirectory(): Promise<string> {
|
||||||
|
const IS_WINDOWS = process.platform === 'win32'
|
||||||
|
|
||||||
|
let tempDirectory: string = process.env['RUNNER_TEMP'] || ''
|
||||||
|
|
||||||
|
if (!tempDirectory) {
|
||||||
|
let baseLocation: string
|
||||||
|
if (IS_WINDOWS) {
|
||||||
|
// On Windows use the USERPROFILE env variable
|
||||||
|
baseLocation = process.env['USERPROFILE'] || 'C:\\'
|
||||||
|
} else {
|
||||||
|
if (process.platform === 'darwin') {
|
||||||
|
baseLocation = '/Users'
|
||||||
|
} else {
|
||||||
|
baseLocation = '/home'
|
||||||
|
}
|
||||||
|
}
|
||||||
|
tempDirectory = path.join(baseLocation, 'actions', 'temp')
|
||||||
|
}
|
||||||
|
|
||||||
|
const dest = path.join(tempDirectory, uuidV4())
|
||||||
|
await io.mkdirP(dest)
|
||||||
|
return dest
|
||||||
|
}
|
||||||
|
|
||||||
|
export function getArchiveFileSizeInBytes(filePath: string): number {
|
||||||
|
return fs.statSync(filePath).size
|
||||||
|
}
|
||||||
|
|
||||||
|
export async function resolvePaths(patterns: string[]): Promise<string[]> {
|
||||||
|
const paths: string[] = []
|
||||||
|
const workspace = process.env['GITHUB_WORKSPACE'] ?? process.cwd()
|
||||||
|
const globber = await glob.create(patterns.join('\n'), {
|
||||||
|
implicitDescendants: false
|
||||||
|
})
|
||||||
|
|
||||||
|
for await (const file of globber.globGenerator()) {
|
||||||
|
const relativeFile = path
|
||||||
|
.relative(workspace, file)
|
||||||
|
.replace(new RegExp(`\\${path.sep}`, 'g'), '/')
|
||||||
|
core.debug(`Matched: ${relativeFile}`)
|
||||||
|
// Paths are made relative so the tar entries are all relative to the root of the workspace.
|
||||||
|
if (relativeFile === '') {
|
||||||
|
// path.relative returns empty string if workspace and file are equal
|
||||||
|
paths.push('.')
|
||||||
|
} else {
|
||||||
|
paths.push(`${relativeFile}`)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
return paths
|
||||||
|
}
|
||||||
|
|
||||||
|
export async function unlinkFile(filePath: fs.PathLike): Promise<void> {
|
||||||
|
return util.promisify(fs.unlink)(filePath)
|
||||||
|
}
|
||||||
|
|
||||||
|
async function getVersion(
|
||||||
|
app: string,
|
||||||
|
additionalArgs: string[] = []
|
||||||
|
): Promise<string> {
|
||||||
|
let versionOutput = ''
|
||||||
|
additionalArgs.push('--version')
|
||||||
|
core.debug(`Checking ${app} ${additionalArgs.join(' ')}`)
|
||||||
|
try {
|
||||||
|
await exec.exec(`${app}`, additionalArgs, {
|
||||||
|
ignoreReturnCode: true,
|
||||||
|
silent: true,
|
||||||
|
listeners: {
|
||||||
|
stdout: (data: Buffer): string => (versionOutput += data.toString()),
|
||||||
|
stderr: (data: Buffer): string => (versionOutput += data.toString())
|
||||||
|
}
|
||||||
|
})
|
||||||
|
} catch (err) {
|
||||||
|
core.debug(err.message)
|
||||||
|
}
|
||||||
|
|
||||||
|
versionOutput = versionOutput.trim()
|
||||||
|
core.debug(versionOutput)
|
||||||
|
return versionOutput
|
||||||
|
}
|
||||||
|
|
||||||
|
// Use zstandard if possible to maximize cache performance
|
||||||
|
export async function getCompressionMethod(): Promise<CompressionMethod> {
|
||||||
|
const versionOutput = await getVersion('zstd', ['--quiet'])
|
||||||
|
const version = semver.clean(versionOutput)
|
||||||
|
core.debug(`zstd version: ${version}`)
|
||||||
|
|
||||||
|
if (versionOutput === '') {
|
||||||
|
return CompressionMethod.Gzip
|
||||||
|
} else {
|
||||||
|
return CompressionMethod.ZstdWithoutLong
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
export function getCacheFileName(compressionMethod: CompressionMethod): string {
|
||||||
|
return compressionMethod === CompressionMethod.Gzip
|
||||||
|
? CacheFilename.Gzip
|
||||||
|
: CacheFilename.Zstd
|
||||||
|
}
|
||||||
|
|
||||||
|
export async function getGnuTarPathOnWindows(): Promise<string> {
|
||||||
|
if (fs.existsSync(GnuTarPathOnWindows)) {
|
||||||
|
return GnuTarPathOnWindows
|
||||||
|
}
|
||||||
|
const versionOutput = await getVersion('tar')
|
||||||
|
return versionOutput.toLowerCase().includes('gnu tar') ? io.which('tar') : ''
|
||||||
|
}
|
||||||
|
|
||||||
|
export function assertDefined<T>(name: string, value?: T): T {
|
||||||
|
if (value === undefined) {
|
||||||
|
throw Error(`Expected ${name} but value was undefined`)
|
||||||
|
}
|
||||||
|
|
||||||
|
return value
|
||||||
|
}
|
||||||
|
|
||||||
|
export function isGhes(): boolean {
|
||||||
|
const ghUrl = new URL(
|
||||||
|
process.env['GITHUB_SERVER_URL'] || 'https://github.com'
|
||||||
|
)
|
||||||
|
return ghUrl.hostname.toUpperCase() !== 'GITHUB.COM'
|
||||||
|
}
|
|
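A small standalone sketch of the path normalization done by `resolvePaths` above: every match is rewritten relative to the workspace with forward slashes so the tar entries stay rooted at the workspace. The workspace and file values below are made up for illustration:

```ts
import * as path from 'path'

// Stand-ins for GITHUB_WORKSPACE and one globbed match.
const workspace = '/home/runner/work/repo/repo'
const file = path.join(workspace, 'packages', 'cache', 'lib')

const relativeFile = path
  .relative(workspace, file)
  .replace(new RegExp(`\\${path.sep}`, 'g'), '/')

// path.relative returns '' when the match is the workspace itself, hence the '.' fallback.
console.log(relativeFile === '' ? '.' : relativeFile) // "packages/cache/lib"
```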
@ -0,0 +1,38 @@
|
||||||
|
export enum CacheFilename {
|
||||||
|
Gzip = 'cache.tgz',
|
||||||
|
Zstd = 'cache.tzst'
|
||||||
|
}
|
||||||
|
|
||||||
|
export enum CompressionMethod {
|
||||||
|
Gzip = 'gzip',
|
||||||
|
// Long range mode was added to zstd in v1.3.2.
|
||||||
|
// This value is for earlier versions of zstd that do not have --long support
|
||||||
|
ZstdWithoutLong = 'zstd-without-long',
|
||||||
|
Zstd = 'zstd'
|
||||||
|
}
|
||||||
|
|
||||||
|
export enum ArchiveToolType {
|
||||||
|
GNU = 'gnu',
|
||||||
|
BSD = 'bsd'
|
||||||
|
}
|
||||||
|
|
||||||
|
// The default number of retry attempts.
|
||||||
|
export const DefaultRetryAttempts = 2
|
||||||
|
|
||||||
|
// The default delay in milliseconds between retry attempts.
|
||||||
|
export const DefaultRetryDelay = 5000
|
||||||
|
|
||||||
|
// Socket timeout in milliseconds during download. If no traffic is received
|
||||||
|
// over the socket during this period, the socket is destroyed and the download
|
||||||
|
// is aborted.
|
||||||
|
export const SocketTimeout = 5000
|
||||||
|
|
||||||
|
// The default path of GNU tar on hosted Windows runners
|
||||||
|
export const GnuTarPathOnWindows = `${process.env['PROGRAMFILES']}\\Git\\usr\\bin\\tar.exe`
|
||||||
|
|
||||||
|
// The default path of BSD tar on hosted Windows runners
|
||||||
|
export const SystemTarPathOnWindows = `${process.env['SYSTEMDRIVE']}\\Windows\\System32\\tar.exe`
|
||||||
|
|
||||||
|
export const TarFilename = 'cache.tar'
|
||||||
|
|
||||||
|
export const ManifestFilename = 'manifest.txt'
|
|
@ -0,0 +1,45 @@
|
||||||
|
import {CompressionMethod} from './constants'
|
||||||
|
import {TypedResponse} from '@actions/http-client/lib/interfaces'
|
||||||
|
import {HttpClientError} from '@actions/http-client'
|
||||||
|
|
||||||
|
export interface ITypedResponseWithError<T> extends TypedResponse<T> {
|
||||||
|
error?: HttpClientError
|
||||||
|
}
|
||||||
|
|
||||||
|
export interface ArtifactCacheEntry {
|
||||||
|
cacheKey?: string
|
||||||
|
scope?: string
|
||||||
|
cacheVersion?: string
|
||||||
|
creationTime?: string
|
||||||
|
archiveLocation?: string
|
||||||
|
}
|
||||||
|
|
||||||
|
export interface ArtifactCacheList {
|
||||||
|
totalCount: number
|
||||||
|
artifactCaches?: ArtifactCacheEntry[]
|
||||||
|
}
|
||||||
|
|
||||||
|
export interface CommitCacheRequest {
|
||||||
|
size: number
|
||||||
|
}
|
||||||
|
|
||||||
|
export interface ReserveCacheRequest {
|
||||||
|
key: string
|
||||||
|
version?: string
|
||||||
|
cacheSize?: number
|
||||||
|
}
|
||||||
|
|
||||||
|
export interface ReserveCacheResponse {
|
||||||
|
cacheId: number
|
||||||
|
}
|
||||||
|
|
||||||
|
export interface InternalCacheOptions {
|
||||||
|
compressionMethod?: CompressionMethod
|
||||||
|
enableCrossOsArchive?: boolean
|
||||||
|
cacheSize?: number
|
||||||
|
}
|
||||||
|
|
||||||
|
export interface ArchiveTool {
|
||||||
|
path: string
|
||||||
|
type: string
|
||||||
|
}
|
|
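The interfaces above describe the reserve/upload/commit round trip: the client reserves a slot with a key, version, and size, the service answers with a numeric `cacheId`, and the final commit reports the uploaded archive size. A hedged example of what those payloads could look like (all field values are invented for illustration):

```ts
// Object shapes follow ReserveCacheRequest, ReserveCacheResponse and CommitCacheRequest above.
const reserveRequest = {
  key: 'linux-build-1a2b3c',
  version: 'sha256 hex produced by getCacheVersion', // placeholder, not a real digest
  cacheSize: 52428800 // archive size in bytes (~50 MiB)
}

const reserveResponse = {cacheId: 42} // returned by the service when the reservation succeeds

const commitRequest = {size: 52428800} // must match the size of the uploaded archive
```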
@ -0,0 +1,463 @@
|
||||||
|
import * as core from '@actions/core'
|
||||||
|
import {HttpClient, HttpClientResponse} from '@actions/http-client'
|
||||||
|
import {BlockBlobClient} from '@azure/storage-blob'
|
||||||
|
import {TransferProgressEvent} from '@azure/ms-rest-js'
|
||||||
|
import * as buffer from 'buffer'
|
||||||
|
import * as fs from 'fs'
|
||||||
|
import * as stream from 'stream'
|
||||||
|
import * as util from 'util'
|
||||||
|
|
||||||
|
import * as utils from './cacheUtils'
|
||||||
|
import {SocketTimeout} from './constants'
|
||||||
|
import {DownloadOptions} from '../options'
|
||||||
|
import {retryHttpClientResponse} from './requestUtils'
|
||||||
|
|
||||||
|
import {AbortController} from '@azure/abort-controller'
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Pipes the body of an HTTP response to a stream
|
||||||
|
*
|
||||||
|
* @param response the HTTP response
|
||||||
|
* @param output the writable stream
|
||||||
|
*/
|
||||||
|
async function pipeResponseToStream(
|
||||||
|
response: HttpClientResponse,
|
||||||
|
output: NodeJS.WritableStream
|
||||||
|
): Promise<void> {
|
||||||
|
const pipeline = util.promisify(stream.pipeline)
|
||||||
|
await pipeline(response.message, output)
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Class for tracking the download state and displaying stats.
|
||||||
|
*/
|
||||||
|
export class DownloadProgress {
|
||||||
|
contentLength: number
|
||||||
|
segmentIndex: number
|
||||||
|
segmentSize: number
|
||||||
|
segmentOffset: number
|
||||||
|
receivedBytes: number
|
||||||
|
startTime: number
|
||||||
|
displayedComplete: boolean
|
||||||
|
timeoutHandle?: ReturnType<typeof setTimeout>
|
||||||
|
|
||||||
|
constructor(contentLength: number) {
|
||||||
|
this.contentLength = contentLength
|
||||||
|
this.segmentIndex = 0
|
||||||
|
this.segmentSize = 0
|
||||||
|
this.segmentOffset = 0
|
||||||
|
this.receivedBytes = 0
|
||||||
|
this.displayedComplete = false
|
||||||
|
this.startTime = Date.now()
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Progress to the next segment. Only call this method when the previous segment
|
||||||
|
* is complete.
|
||||||
|
*
|
||||||
|
* @param segmentSize the length of the next segment
|
||||||
|
*/
|
||||||
|
nextSegment(segmentSize: number): void {
|
||||||
|
this.segmentOffset = this.segmentOffset + this.segmentSize
|
||||||
|
this.segmentIndex = this.segmentIndex + 1
|
||||||
|
this.segmentSize = segmentSize
|
||||||
|
this.receivedBytes = 0
|
||||||
|
|
||||||
|
core.debug(
|
||||||
|
`Downloading segment at offset ${this.segmentOffset} with length ${this.segmentSize}...`
|
||||||
|
)
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Sets the number of bytes received for the current segment.
|
||||||
|
*
|
||||||
|
* @param receivedBytes the number of bytes received
|
||||||
|
*/
|
||||||
|
setReceivedBytes(receivedBytes: number): void {
|
||||||
|
this.receivedBytes = receivedBytes
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Returns the total number of bytes transferred.
|
||||||
|
*/
|
||||||
|
getTransferredBytes(): number {
|
||||||
|
return this.segmentOffset + this.receivedBytes
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Returns true if the download is complete.
|
||||||
|
*/
|
||||||
|
isDone(): boolean {
|
||||||
|
return this.getTransferredBytes() === this.contentLength
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Prints the current download stats. Once the download completes, this will print one
|
||||||
|
* last line and then stop.
|
||||||
|
*/
|
||||||
|
display(): void {
|
||||||
|
if (this.displayedComplete) {
|
||||||
|
return
|
||||||
|
}
|
||||||
|
|
||||||
|
const transferredBytes = this.segmentOffset + this.receivedBytes
|
||||||
|
const percentage = (100 * (transferredBytes / this.contentLength)).toFixed(
|
||||||
|
1
|
||||||
|
)
|
||||||
|
const elapsedTime = Date.now() - this.startTime
|
||||||
|
const downloadSpeed = (
|
||||||
|
transferredBytes /
|
||||||
|
(1024 * 1024) /
|
||||||
|
(elapsedTime / 1000)
|
||||||
|
).toFixed(1)
|
||||||
|
|
||||||
|
core.info(
|
||||||
|
`Received ${transferredBytes} of ${this.contentLength} (${percentage}%), ${downloadSpeed} MBs/sec`
|
||||||
|
)
|
||||||
|
|
||||||
|
if (this.isDone()) {
|
||||||
|
this.displayedComplete = true
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Returns a function used to handle TransferProgressEvents.
|
||||||
|
*/
|
||||||
|
onProgress(): (progress: TransferProgressEvent) => void {
|
||||||
|
return (progress: TransferProgressEvent) => {
|
||||||
|
this.setReceivedBytes(progress.loadedBytes)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Starts the timer that displays the stats.
|
||||||
|
*
|
||||||
|
* @param delayInMs the delay between each write
|
||||||
|
*/
|
||||||
|
startDisplayTimer(delayInMs = 1000): void {
|
||||||
|
const displayCallback = (): void => {
|
||||||
|
this.display()
|
||||||
|
|
||||||
|
if (!this.isDone()) {
|
||||||
|
this.timeoutHandle = setTimeout(displayCallback, delayInMs)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
this.timeoutHandle = setTimeout(displayCallback, delayInMs)
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Stops the timer that displays the stats. As this typically indicates the download
|
||||||
|
* is complete, this will display one last line, unless the last line has already
|
||||||
|
* been written.
|
||||||
|
*/
|
||||||
|
stopDisplayTimer(): void {
|
||||||
|
if (this.timeoutHandle) {
|
||||||
|
clearTimeout(this.timeoutHandle)
|
||||||
|
this.timeoutHandle = undefined
|
||||||
|
}
|
||||||
|
|
||||||
|
this.display()
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Download the cache using the Actions toolkit http-client
|
||||||
|
*
|
||||||
|
* @param archiveLocation the URL for the cache
|
||||||
|
* @param archivePath the local path where the cache is saved
|
||||||
|
*/
|
||||||
|
export async function downloadCacheHttpClient(
|
||||||
|
archiveLocation: string,
|
||||||
|
archivePath: string
|
||||||
|
): Promise<void> {
|
||||||
|
const writeStream = fs.createWriteStream(archivePath)
|
||||||
|
const httpClient = new HttpClient('actions/cache')
|
||||||
|
const downloadResponse = await retryHttpClientResponse(
|
||||||
|
'downloadCache',
|
||||||
|
async () => httpClient.get(archiveLocation)
|
||||||
|
)
|
||||||
|
|
||||||
|
// Abort download if no traffic received over the socket.
|
||||||
|
downloadResponse.message.socket.setTimeout(SocketTimeout, () => {
|
||||||
|
downloadResponse.message.destroy()
|
||||||
|
core.debug(`Aborting download, socket timed out after ${SocketTimeout} ms`)
|
||||||
|
})
|
||||||
|
|
||||||
|
await pipeResponseToStream(downloadResponse, writeStream)
|
||||||
|
|
||||||
|
// Validate download size.
|
||||||
|
const contentLengthHeader = downloadResponse.message.headers['content-length']
|
||||||
|
|
||||||
|
if (contentLengthHeader) {
|
||||||
|
const expectedLength = parseInt(contentLengthHeader)
|
||||||
|
const actualLength = utils.getArchiveFileSizeInBytes(archivePath)
|
||||||
|
|
||||||
|
if (actualLength !== expectedLength) {
|
||||||
|
throw new Error(
|
||||||
|
`Incomplete download. Expected file size: ${expectedLength}, actual file size: ${actualLength}`
|
||||||
|
)
|
||||||
|
}
|
||||||
|
} else {
|
||||||
|
core.debug('Unable to validate download, no Content-Length header')
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Download the cache using the Actions toolkit http-client concurrently
|
||||||
|
*
|
||||||
|
* @param archiveLocation the URL for the cache
|
||||||
|
* @param archivePath the local path where the cache is saved
|
||||||
|
*/
|
||||||
|
export async function downloadCacheHttpClientConcurrent(
|
||||||
|
archiveLocation: string,
|
||||||
|
archivePath: fs.PathLike,
|
||||||
|
options: DownloadOptions
|
||||||
|
): Promise<void> {
|
||||||
|
const archiveDescriptor = await fs.promises.open(archivePath, 'w')
|
||||||
|
const httpClient = new HttpClient('actions/cache', undefined, {
|
||||||
|
socketTimeout: options.timeoutInMs,
|
||||||
|
keepAlive: true
|
||||||
|
})
|
||||||
|
try {
|
||||||
|
const res = await retryHttpClientResponse(
|
||||||
|
'downloadCacheMetadata',
|
||||||
|
async () => await httpClient.request('HEAD', archiveLocation, null, {})
|
||||||
|
)
|
||||||
|
|
||||||
|
const lengthHeader = res.message.headers['content-length']
|
||||||
|
if (lengthHeader === undefined || lengthHeader === null) {
|
||||||
|
throw new Error('Content-Length not found on blob response')
|
||||||
|
}
|
||||||
|
|
||||||
|
const length = parseInt(lengthHeader)
|
||||||
|
if (Number.isNaN(length)) {
|
||||||
|
throw new Error(`Could not interpret Content-Length: ${length}`)
|
||||||
|
}
|
||||||
|
|
||||||
|
const downloads: {
|
||||||
|
offset: number
|
||||||
|
promiseGetter: () => Promise<DownloadSegment>
|
||||||
|
}[] = []
|
||||||
|
const blockSize = 4 * 1024 * 1024
|
||||||
|
|
||||||
|
for (let offset = 0; offset < length; offset += blockSize) {
|
||||||
|
const count = Math.min(blockSize, length - offset)
|
||||||
|
downloads.push({
|
||||||
|
offset,
|
||||||
|
promiseGetter: async () => {
|
||||||
|
return await downloadSegmentRetry(
|
||||||
|
httpClient,
|
||||||
|
archiveLocation,
|
||||||
|
offset,
|
||||||
|
count
|
||||||
|
)
|
||||||
|
}
|
||||||
|
})
|
||||||
|
}
|
||||||
|
|
||||||
|
// reverse to use .pop instead of .shift
|
||||||
|
downloads.reverse()
|
||||||
|
let actives = 0
|
||||||
|
let bytesDownloaded = 0
|
||||||
|
const progress = new DownloadProgress(length)
|
||||||
|
progress.startDisplayTimer()
|
||||||
|
const progressFn = progress.onProgress()
|
||||||
|
|
||||||
|
const activeDownloads: {[offset: number]: Promise<DownloadSegment>} = []
|
||||||
|
let nextDownload:
|
||||||
|
| {offset: number; promiseGetter: () => Promise<DownloadSegment>}
|
||||||
|
| undefined
|
||||||
|
|
||||||
|
const waitAndWrite: () => Promise<void> = async () => {
|
||||||
|
const segment = await Promise.race(Object.values(activeDownloads))
|
||||||
|
await archiveDescriptor.write(
|
||||||
|
segment.buffer,
|
||||||
|
0,
|
||||||
|
segment.count,
|
||||||
|
segment.offset
|
||||||
|
)
|
||||||
|
actives--
|
||||||
|
delete activeDownloads[segment.offset]
|
||||||
|
bytesDownloaded += segment.count
|
||||||
|
progressFn({loadedBytes: bytesDownloaded})
|
||||||
|
}
|
||||||
|
|
||||||
|
while ((nextDownload = downloads.pop())) {
|
||||||
|
activeDownloads[nextDownload.offset] = nextDownload.promiseGetter()
|
||||||
|
actives++
|
||||||
|
|
||||||
|
if (actives >= (options.downloadConcurrency ?? 10)) {
|
||||||
|
await waitAndWrite()
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
while (actives > 0) {
|
||||||
|
await waitAndWrite()
|
||||||
|
}
|
||||||
|
} finally {
|
||||||
|
httpClient.dispose()
|
||||||
|
await archiveDescriptor.close()
|
||||||
|
}
|
||||||
|
}
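// Worked example (illustrative): for a 10 MiB blob (10485760 bytes) and the 4 MiB
// blockSize above, downloadSegment is asked for the ranges
//   bytes=0-4194303, bytes=4194304-8388607, bytes=8388608-10485759
// with at most `downloadConcurrency` (10 by default) requests in flight; each segment
// is written back at its own offset, so the file can be assembled out of order.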
|
||||||
|
|
||||||
|
async function downloadSegmentRetry(
  httpClient: HttpClient,
  archiveLocation: string,
  offset: number,
  count: number
): Promise<DownloadSegment> {
  const retries = 5
  let failures = 0

  while (true) {
    try {
      const timeout = 30000
      const result = await promiseWithTimeout(
        timeout,
        downloadSegment(httpClient, archiveLocation, offset, count)
      )
      if (typeof result === 'string') {
        throw new Error('downloadSegmentRetry failed due to timeout')
      }

      return result
    } catch (err) {
      if (failures >= retries) {
        throw err
      }

      failures++
    }
  }
}

async function downloadSegment(
  httpClient: HttpClient,
  archiveLocation: string,
  offset: number,
  count: number
): Promise<DownloadSegment> {
  const partRes = await retryHttpClientResponse(
    'downloadCachePart',
    async () =>
      await httpClient.get(archiveLocation, {
        Range: `bytes=${offset}-${offset + count - 1}`
      })
  )

  if (!partRes.readBodyBuffer) {
    throw new Error('Expected HttpClientResponse to implement readBodyBuffer')
  }

  return {
    offset,
    count,
    buffer: await partRes.readBodyBuffer()
  }
}

declare class DownloadSegment {
  offset: number
  count: number
  buffer: Buffer
}

/**
 * Download the cache using the Azure Storage SDK. Only call this method if the
 * URL points to an Azure Storage endpoint.
 *
 * @param archiveLocation the URL for the cache
 * @param archivePath the local path where the cache is saved
 * @param options the download options with the defaults set
 */
export async function downloadCacheStorageSDK(
  archiveLocation: string,
  archivePath: string,
  options: DownloadOptions
): Promise<void> {
  const client = new BlockBlobClient(archiveLocation, undefined, {
    retryOptions: {
      // Override the timeout used when downloading each 4 MB chunk
      // The default is 2 min / MB, which is way too slow
      tryTimeoutInMs: options.timeoutInMs
    }
  })

  const properties = await client.getProperties()
  const contentLength = properties.contentLength ?? -1

  if (contentLength < 0) {
    // We should never hit this condition, but just in case fall back to downloading the
    // file as one large stream
    core.debug(
      'Unable to determine content length, downloading file with http-client...'
    )

    await downloadCacheHttpClient(archiveLocation, archivePath)
  } else {
    // Use downloadToBuffer for faster downloads, since internally it splits the
    // file into 4 MB chunks which can then be parallelized and retried independently
    //
    // If the file exceeds the buffer maximum length (~1 GB on 32-bit systems and ~2 GB
    // on 64-bit systems), split the download into multiple segments
    // ~2 GB = 2147483647 bytes; beyond this we start getting out-of-range errors, so cap accordingly

    // Segment size is 128 MB = 134217728 bytes, so each segment completes faster and fails fast
    const maxSegmentSize = Math.min(134217728, buffer.constants.MAX_LENGTH)
    const downloadProgress = new DownloadProgress(contentLength)

    const fd = fs.openSync(archivePath, 'w')

    try {
      downloadProgress.startDisplayTimer()
      const controller = new AbortController()
      const abortSignal = controller.signal
      while (!downloadProgress.isDone()) {
        const segmentStart =
          downloadProgress.segmentOffset + downloadProgress.segmentSize

        const segmentSize = Math.min(
          maxSegmentSize,
          contentLength - segmentStart
        )

        downloadProgress.nextSegment(segmentSize)
        const result = await promiseWithTimeout(
          options.segmentTimeoutInMs || 3600000,
          client.downloadToBuffer(segmentStart, segmentSize, {
            abortSignal,
            concurrency: options.downloadConcurrency,
            onProgress: downloadProgress.onProgress()
          })
        )
        if (result === 'timeout') {
          controller.abort()
          throw new Error(
            'Aborting cache download as the download time exceeded the timeout.'
          )
        } else if (Buffer.isBuffer(result)) {
          fs.writeFileSync(fd, result)
        }
      }
    } finally {
      downloadProgress.stopDisplayTimer()
      fs.closeSync(fd)
    }
  }
}

const promiseWithTimeout = async <T>(
  timeoutMs: number,
  promise: Promise<T>
): Promise<T | string> => {
  let timeoutHandle: NodeJS.Timeout
  const timeoutPromise = new Promise<string>(resolve => {
    timeoutHandle = setTimeout(() => resolve('timeout'), timeoutMs)
  })

  return Promise.race([promise, timeoutPromise]).then(result => {
    clearTimeout(timeoutHandle)
    return result
  })
}
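For illustration, a minimal sketch of how the `promiseWithTimeout` helper above is meant to be consumed: the caller races its own promise against the timer and treats the literal string `'timeout'` as the timeout signal. The `fetchArchive` function below is a hypothetical stand-in, not part of this diff.

```ts
// Hedged sketch: consuming promiseWithTimeout as defined above.
// `fetchArchive` is a placeholder async operation standing in for a real download call.
async function fetchArchive(): Promise<Buffer> {
  return Buffer.from('archive bytes')
}

async function fetchWithDeadline(): Promise<Buffer> {
  // Race the real work against a 30 second timer; the helper resolves
  // with the string 'timeout' if the timer wins.
  const result = await promiseWithTimeout(30000, fetchArchive())
  if (typeof result === 'string') {
    throw new Error('fetchArchive timed out')
  }
  return result
}
```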
@ -0,0 +1,138 @@
import * as core from '@actions/core'
import {
  HttpCodes,
  HttpClientError,
  HttpClientResponse
} from '@actions/http-client'
import {DefaultRetryDelay, DefaultRetryAttempts} from './constants'
import {ITypedResponseWithError} from './contracts'

export function isSuccessStatusCode(statusCode?: number): boolean {
  if (!statusCode) {
    return false
  }
  return statusCode >= 200 && statusCode < 300
}

export function isServerErrorStatusCode(statusCode?: number): boolean {
  if (!statusCode) {
    return true
  }
  return statusCode >= 500
}

export function isRetryableStatusCode(statusCode?: number): boolean {
  if (!statusCode) {
    return false
  }
  const retryableStatusCodes = [
    HttpCodes.BadGateway,
    HttpCodes.ServiceUnavailable,
    HttpCodes.GatewayTimeout
  ]
  return retryableStatusCodes.includes(statusCode)
}

async function sleep(milliseconds: number): Promise<void> {
  return new Promise(resolve => setTimeout(resolve, milliseconds))
}

export async function retry<T>(
  name: string,
  method: () => Promise<T>,
  getStatusCode: (arg0: T) => number | undefined,
  maxAttempts = DefaultRetryAttempts,
  delay = DefaultRetryDelay,
  onError: ((arg0: Error) => T | undefined) | undefined = undefined
): Promise<T> {
  let errorMessage = ''
  let attempt = 1

  while (attempt <= maxAttempts) {
    let response: T | undefined = undefined
    let statusCode: number | undefined = undefined
    let isRetryable = false

    try {
      response = await method()
    } catch (error) {
      if (onError) {
        response = onError(error)
      }

      isRetryable = true
      errorMessage = error.message
    }

    if (response) {
      statusCode = getStatusCode(response)

      if (!isServerErrorStatusCode(statusCode)) {
        return response
      }
    }

    if (statusCode) {
      isRetryable = isRetryableStatusCode(statusCode)
      errorMessage = `Cache service responded with ${statusCode}`
    }

    core.debug(
      `${name} - Attempt ${attempt} of ${maxAttempts} failed with error: ${errorMessage}`
    )

    if (!isRetryable) {
      core.debug(`${name} - Error is not retryable`)
      break
    }

    await sleep(delay)
    attempt++
  }

  throw Error(`${name} failed: ${errorMessage}`)
}

export async function retryTypedResponse<T>(
  name: string,
  method: () => Promise<ITypedResponseWithError<T>>,
  maxAttempts = DefaultRetryAttempts,
  delay = DefaultRetryDelay
): Promise<ITypedResponseWithError<T>> {
  return await retry(
    name,
    method,
    (response: ITypedResponseWithError<T>) => response.statusCode,
    maxAttempts,
    delay,
    // If the error object contains the statusCode property, extract it and return
    // a TypedResponse<T> so it can be processed by the retry logic.
    (error: Error) => {
      if (error instanceof HttpClientError) {
        return {
          statusCode: error.statusCode,
          result: null,
          headers: {},
          error
        }
      } else {
        return undefined
      }
    }
  )
}

export async function retryHttpClientResponse(
  name: string,
  method: () => Promise<HttpClientResponse>,
  maxAttempts = DefaultRetryAttempts,
  delay = DefaultRetryDelay
): Promise<HttpClientResponse> {
  return await retry(
    name,
    method,
    (response: HttpClientResponse) => response.message.statusCode,
    maxAttempts,
    delay
  )
}
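As an illustration of how the generic `retry` wrapper above is typically driven, here is a minimal hedged sketch. The `CacheEntry` shape, the `getCacheEntry` name, and the structurally-typed client parameter are hypothetical stand-ins and are not part of this diff; only the retry wiring mirrors the code in this file.

```ts
// Hedged sketch: wrapping a typed HTTP call with retryTypedResponse as defined above.
interface CacheEntry {
  cacheKey?: string
  archiveLocation?: string
}

async function getCacheEntry(
  httpClient: {getJson<T>(url: string): Promise<ITypedResponseWithError<T>>},
  url: string
): Promise<CacheEntry | null> {
  const response = await retryTypedResponse('getCacheEntry', async () =>
    httpClient.getJson<CacheEntry>(url)
  )
  // Non-retryable failures surface as a thrown error; retryable ones are retried
  // up to DefaultRetryAttempts with DefaultRetryDelay between attempts.
  return isSuccessStatusCode(response.statusCode) ? response.result : null
}
```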
@ -0,0 +1,297 @@
import {exec} from '@actions/exec'
import * as io from '@actions/io'
import {existsSync, writeFileSync} from 'fs'
import * as path from 'path'
import * as utils from './cacheUtils'
import {ArchiveTool} from './contracts'
import {
  CompressionMethod,
  SystemTarPathOnWindows,
  ArchiveToolType,
  TarFilename,
  ManifestFilename
} from './constants'

const IS_WINDOWS = process.platform === 'win32'

// Returns tar path and type: BSD or GNU
async function getTarPath(): Promise<ArchiveTool> {
  switch (process.platform) {
    case 'win32': {
      const gnuTar = await utils.getGnuTarPathOnWindows()
      const systemTar = SystemTarPathOnWindows
      if (gnuTar) {
        // Use GNU tar as the default on Windows
        return <ArchiveTool>{path: gnuTar, type: ArchiveToolType.GNU}
      } else if (existsSync(systemTar)) {
        return <ArchiveTool>{path: systemTar, type: ArchiveToolType.BSD}
      }
      break
    }
    case 'darwin': {
      const gnuTar = await io.which('gtar', false)
      if (gnuTar) {
        // fix permission denied errors when extracting BSD tar archives with GNU tar - https://github.com/actions/cache/issues/527
        return <ArchiveTool>{path: gnuTar, type: ArchiveToolType.GNU}
      } else {
        return <ArchiveTool>{
          path: await io.which('tar', true),
          type: ArchiveToolType.BSD
        }
      }
    }
    default:
      break
  }
  // Default assumption is that GNU tar is present in the path
  return <ArchiveTool>{
    path: await io.which('tar', true),
    type: ArchiveToolType.GNU
  }
}

// Return arguments for tar as per tarPath, compressionMethod, method type and OS
async function getTarArgs(
  tarPath: ArchiveTool,
  compressionMethod: CompressionMethod,
  type: string,
  archivePath = ''
): Promise<string[]> {
  const args = [`"${tarPath.path}"`]
  const cacheFileName = utils.getCacheFileName(compressionMethod)
  const tarFile = 'cache.tar'
  const workingDirectory = getWorkingDirectory()
  // Specific args for BSD tar on Windows, used as a workaround
  const BSD_TAR_ZSTD =
    tarPath.type === ArchiveToolType.BSD &&
    compressionMethod !== CompressionMethod.Gzip &&
    IS_WINDOWS

  // Method specific args
  switch (type) {
    case 'create':
      args.push(
        '--posix',
        '-cf',
        BSD_TAR_ZSTD
          ? tarFile
          : cacheFileName.replace(new RegExp(`\\${path.sep}`, 'g'), '/'),
        '--exclude',
        BSD_TAR_ZSTD
          ? tarFile
          : cacheFileName.replace(new RegExp(`\\${path.sep}`, 'g'), '/'),
        '-P',
        '-C',
        workingDirectory.replace(new RegExp(`\\${path.sep}`, 'g'), '/'),
        '--files-from',
        ManifestFilename
      )
      break
    case 'extract':
      args.push(
        '-xf',
        BSD_TAR_ZSTD
          ? tarFile
          : archivePath.replace(new RegExp(`\\${path.sep}`, 'g'), '/'),
        '-P',
        '-C',
        workingDirectory.replace(new RegExp(`\\${path.sep}`, 'g'), '/')
      )
      break
    case 'list':
      args.push(
        '-tf',
        BSD_TAR_ZSTD
          ? tarFile
          : archivePath.replace(new RegExp(`\\${path.sep}`, 'g'), '/'),
        '-P'
      )
      break
  }

  // Platform specific args
  if (tarPath.type === ArchiveToolType.GNU) {
    switch (process.platform) {
      case 'win32':
        args.push('--force-local')
        break
      case 'darwin':
        args.push('--delay-directory-restore')
        break
    }
  }

  return args
}

// Returns commands to run tar and compression program
async function getCommands(
  compressionMethod: CompressionMethod,
  type: string,
  archivePath = ''
): Promise<string[]> {
  let args

  const tarPath = await getTarPath()
  const tarArgs = await getTarArgs(
    tarPath,
    compressionMethod,
    type,
    archivePath
  )
  const compressionArgs =
    type !== 'create'
      ? await getDecompressionProgram(tarPath, compressionMethod, archivePath)
      : await getCompressionProgram(tarPath, compressionMethod)
  const BSD_TAR_ZSTD =
    tarPath.type === ArchiveToolType.BSD &&
    compressionMethod !== CompressionMethod.Gzip &&
    IS_WINDOWS

  if (BSD_TAR_ZSTD && type !== 'create') {
    args = [[...compressionArgs].join(' '), [...tarArgs].join(' ')]
  } else {
    args = [[...tarArgs].join(' '), [...compressionArgs].join(' ')]
  }

  if (BSD_TAR_ZSTD) {
    return args
  }

  return [args.join(' ')]
}

function getWorkingDirectory(): string {
  return process.env['GITHUB_WORKSPACE'] ?? process.cwd()
}

// Common function for extractTar and listTar to get the compression method
async function getDecompressionProgram(
  tarPath: ArchiveTool,
  compressionMethod: CompressionMethod,
  archivePath: string
): Promise<string[]> {
  // -d: Decompress.
  // unzstd is equivalent to 'zstd -d'
  // --long=#: Enables long distance matching with # bits. Maximum is 30 (1GB) on 32-bit OS and 31 (2GB) on 64-bit.
  // Using 30 here because we also support 32-bit self-hosted runners.
  const BSD_TAR_ZSTD =
    tarPath.type === ArchiveToolType.BSD &&
    compressionMethod !== CompressionMethod.Gzip &&
    IS_WINDOWS
  switch (compressionMethod) {
    case CompressionMethod.Zstd:
      return BSD_TAR_ZSTD
        ? [
            'zstd -d --long=30 --force -o',
            TarFilename,
            archivePath.replace(new RegExp(`\\${path.sep}`, 'g'), '/')
          ]
        : [
            '--use-compress-program',
            IS_WINDOWS ? '"zstd -d --long=30"' : 'unzstd --long=30'
          ]
    case CompressionMethod.ZstdWithoutLong:
      return BSD_TAR_ZSTD
        ? [
            'zstd -d --force -o',
            TarFilename,
            archivePath.replace(new RegExp(`\\${path.sep}`, 'g'), '/')
          ]
        : ['--use-compress-program', IS_WINDOWS ? '"zstd -d"' : 'unzstd']
    default:
      return ['-z']
  }
}

// Used for creating the archive
// -T#: Compress using # working threads. If # is 0, attempt to detect and use the number of physical CPU cores.
// zstdmt is equivalent to 'zstd -T0'
// --long=#: Enables long distance matching with # bits. Maximum is 30 (1GB) on 32-bit OS and 31 (2GB) on 64-bit.
// Using 30 here because we also support 32-bit self-hosted runners.
// Long range mode was added to zstd in the v1.3.2 release, so we do not pass --long to older versions of zstd.
async function getCompressionProgram(
  tarPath: ArchiveTool,
  compressionMethod: CompressionMethod
): Promise<string[]> {
  const cacheFileName = utils.getCacheFileName(compressionMethod)
  const BSD_TAR_ZSTD =
    tarPath.type === ArchiveToolType.BSD &&
    compressionMethod !== CompressionMethod.Gzip &&
    IS_WINDOWS
  switch (compressionMethod) {
    case CompressionMethod.Zstd:
      return BSD_TAR_ZSTD
        ? [
            'zstd -T0 --long=30 --force -o',
            cacheFileName.replace(new RegExp(`\\${path.sep}`, 'g'), '/'),
            TarFilename
          ]
        : [
            '--use-compress-program',
            IS_WINDOWS ? '"zstd -T0 --long=30"' : 'zstdmt --long=30'
          ]
    case CompressionMethod.ZstdWithoutLong:
      return BSD_TAR_ZSTD
        ? [
            'zstd -T0 --force -o',
            cacheFileName.replace(new RegExp(`\\${path.sep}`, 'g'), '/'),
            TarFilename
          ]
        : ['--use-compress-program', IS_WINDOWS ? '"zstd -T0"' : 'zstdmt']
    default:
      return ['-z']
  }
}

// Executes all commands as separate processes
async function execCommands(commands: string[], cwd?: string): Promise<void> {
  for (const command of commands) {
    try {
      await exec(command, undefined, {
        cwd,
        env: {...(process.env as object), MSYS: 'winsymlinks:nativestrict'}
      })
    } catch (error) {
      throw new Error(
        `${command.split(' ')[0]} failed with error: ${error?.message}`
      )
    }
  }
}

// List the contents of a tar
export async function listTar(
  archivePath: string,
  compressionMethod: CompressionMethod
): Promise<void> {
  const commands = await getCommands(compressionMethod, 'list', archivePath)
  await execCommands(commands)
}

// Extract a tar
export async function extractTar(
  archivePath: string,
  compressionMethod: CompressionMethod
): Promise<void> {
  // Create directory to extract tar into
  const workingDirectory = getWorkingDirectory()
  await io.mkdirP(workingDirectory)
  const commands = await getCommands(compressionMethod, 'extract', archivePath)
  await execCommands(commands)
}

// Create a tar
export async function createTar(
  archiveFolder: string,
  sourceDirectories: string[],
  compressionMethod: CompressionMethod
): Promise<void> {
  // Write source directories to manifest.txt to avoid command length limits
  writeFileSync(
    path.join(archiveFolder, ManifestFilename),
    sourceDirectories.join('\n')
  )
  const commands = await getCommands(compressionMethod, 'create')
  await execCommands(commands, archiveFolder)
}
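A minimal sketch of how these tar helpers are expected to be driven during a save/restore round trip. The folder paths and the choice of `CompressionMethod.Zstd` below are illustrative assumptions, not taken from this diff.

```ts
// Hedged sketch: driving createTar / listTar / extractTar as exported above.
// The paths are placeholders; CompressionMethod.Zstd is assumed to be available on the runner.
async function archiveRoundTrip(): Promise<void> {
  const archiveFolder = '/tmp/cache-archive' // where the archive and manifest.txt are written
  const cachePaths = ['node_modules', 'dist'] // resolved paths to include in the archive

  // Save side: writes the manifest and shells out to tar + zstd.
  await createTar(archiveFolder, cachePaths, CompressionMethod.Zstd)

  // Optional debug step: list the archive contents.
  const archivePath = path.join(
    archiveFolder,
    utils.getCacheFileName(CompressionMethod.Zstd)
  )
  await listTar(archivePath, CompressionMethod.Zstd)

  // Restore side: extracts into GITHUB_WORKSPACE (or the current directory).
  await extractTar(archivePath, CompressionMethod.Zstd)
}
```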
@ -0,0 +1,160 @@
import * as core from '@actions/core'

/**
 * Options to control cache upload
 */
export interface UploadOptions {
  /**
   * Number of parallel cache uploads
   *
   * @default 4
   */
  uploadConcurrency?: number
  /**
   * Maximum chunk size in bytes for cache upload
   *
   * @default 32MB
   */
  uploadChunkSize?: number
}

/**
 * Options to control cache download
 */
export interface DownloadOptions {
  /**
   * Indicates whether to use the Azure Blob SDK to download caches
   * that are stored on Azure Blob Storage to improve reliability and
   * performance
   *
   * @default true
   */
  useAzureSdk?: boolean

  /**
   * Number of parallel downloads (this option only applies when using
   * the Azure SDK)
   *
   * @default 8
   */
  downloadConcurrency?: number

  /**
   * Indicates whether to use Actions HttpClient with concurrency
   * for Azure Blob Storage
   */
  concurrentBlobDownloads?: boolean

  /**
   * Maximum time for each download request, in milliseconds (this
   * option only applies when using the Azure SDK)
   *
   * @default 30000
   */
  timeoutInMs?: number

  /**
   * Time after which a segment download should be aborted if stuck
   *
   * @default 3600000
   */
  segmentTimeoutInMs?: number

  /**
   * Whether to skip downloading the cache entry.
   * If lookupOnly is set to true, the restore function will only check if
   * a matching cache entry exists and return the cache key if it does.
   *
   * @default false
   */
  lookupOnly?: boolean
}

/**
 * Returns a copy of the upload options with defaults filled in.
 *
 * @param copy the original upload options
 */
export function getUploadOptions(copy?: UploadOptions): UploadOptions {
  const result: UploadOptions = {
    uploadConcurrency: 4,
    uploadChunkSize: 32 * 1024 * 1024
  }

  if (copy) {
    if (typeof copy.uploadConcurrency === 'number') {
      result.uploadConcurrency = copy.uploadConcurrency
    }

    if (typeof copy.uploadChunkSize === 'number') {
      result.uploadChunkSize = copy.uploadChunkSize
    }
  }

  core.debug(`Upload concurrency: ${result.uploadConcurrency}`)
  core.debug(`Upload chunk size: ${result.uploadChunkSize}`)

  return result
}

/**
 * Returns a copy of the download options with defaults filled in.
 *
 * @param copy the original download options
 */
export function getDownloadOptions(copy?: DownloadOptions): DownloadOptions {
  const result: DownloadOptions = {
    useAzureSdk: false,
    concurrentBlobDownloads: true,
    downloadConcurrency: 8,
    timeoutInMs: 30000,
    segmentTimeoutInMs: 600000,
    lookupOnly: false
  }

  if (copy) {
    if (typeof copy.useAzureSdk === 'boolean') {
      result.useAzureSdk = copy.useAzureSdk
    }

    if (typeof copy.concurrentBlobDownloads === 'boolean') {
      result.concurrentBlobDownloads = copy.concurrentBlobDownloads
    }

    if (typeof copy.downloadConcurrency === 'number') {
      result.downloadConcurrency = copy.downloadConcurrency
    }

    if (typeof copy.timeoutInMs === 'number') {
      result.timeoutInMs = copy.timeoutInMs
    }

    if (typeof copy.segmentTimeoutInMs === 'number') {
      result.segmentTimeoutInMs = copy.segmentTimeoutInMs
    }

    if (typeof copy.lookupOnly === 'boolean') {
      result.lookupOnly = copy.lookupOnly
    }
  }
  const segmentDownloadTimeoutMins =
    process.env['SEGMENT_DOWNLOAD_TIMEOUT_MINS']

  if (
    segmentDownloadTimeoutMins &&
    !isNaN(Number(segmentDownloadTimeoutMins)) &&
    isFinite(Number(segmentDownloadTimeoutMins))
  ) {
    result.segmentTimeoutInMs = Number(segmentDownloadTimeoutMins) * 60 * 1000
  }
  core.debug(`Use Azure SDK: ${result.useAzureSdk}`)
  core.debug(`Download concurrency: ${result.downloadConcurrency}`)
  core.debug(`Request timeout (ms): ${result.timeoutInMs}`)
  core.debug(
    `Cache segment download timeout mins env var: ${process.env['SEGMENT_DOWNLOAD_TIMEOUT_MINS']}`
  )
  core.debug(`Segment download timeout (ms): ${result.segmentTimeoutInMs}`)
  core.debug(`Lookup only: ${result.lookupOnly}`)

  return result
}
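To illustrate how the defaults above interact with caller overrides and the `SEGMENT_DOWNLOAD_TIMEOUT_MINS` environment variable, a small hedged sketch; the 10-minute value and the override of `downloadConcurrency` are only examples.

```ts
// Hedged sketch: resolving download options via getDownloadOptions as defined above.
// Setting the env var here is purely illustrative.
process.env['SEGMENT_DOWNLOAD_TIMEOUT_MINS'] = '10'

const resolved = getDownloadOptions({downloadConcurrency: 4})
// resolved.downloadConcurrency === 4      (caller override wins over the default of 8)
// resolved.segmentTimeoutInMs === 600000  (10 minutes, taken from the env var)
// resolved.useAzureSdk === false          (default from this file)
```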
@ -0,0 +1,6 @@
import {restoreCache} from './cache'

restoreCache(['/home/runner/work/_temp'], 'cache-key', [
  'cache-key-1',
  'cache-key-2'
])
@ -0,0 +1,17 @@
{
  "extends": "../../tsconfig.json",
  "compilerOptions": {
    "baseUrl": "./",
    "outDir": "./lib",
    "rootDir": "./src",
    "lib": [
      "es6",
      "dom"
    ],
    "useUnknownInCatchVariables": false,
    "sourceMap": true
  },
  "include": [
    "./src"
  ]
}