# @actions/artifact

Interact programmatically with Actions Artifacts.

This is the core library that powers the [@actions/upload-artifact](https://github.com/actions/upload-artifact) and [@actions/download-artifact](https://github.com/actions/download-artifact) actions.
## v2 - What's New

> [!IMPORTANT]
> @actions/artifact v2+, upload-artifact@v4+ and download-artifact@v4+ are not currently supported on GHES. The previous version of this package can be found at this tag and on npm.
The release of @actions/artifact@v2 (including upload-artifact@v4 and download-artifact@v4) is a major change to the backend architecture of Artifacts, bringing numerous performance and behavioral improvements.
### Improvements
- All upload and download operations are much quicker: up to 80% faster downloads and 96% faster uploads in worst-case scenarios.
- Once uploaded, Artifacts are immediately available in the UI and REST API. Previously, you would have to wait for the run to be completed.
- Artifacts are immutable once they are uploaded. They cannot be altered by subsequent jobs. (Digest/integrity hash coming soon in the API!)
- This library (and actions/download-artifact) now supports downloading Artifacts from other repositories and runs, if a `GITHUB_TOKEN` with sufficient `actions:read` permissions is provided.
### Breaking changes
- **Firewall rules required for self-hosted runners.**

  If you are using self-hosted runners behind a firewall, you must have flows open to Actions endpoints. If you cannot use wildcard rules for your firewall, see the GitHub meta endpoint for the specific endpoints, e.g.:

  ```shell
  curl https://api.github.com/meta | jq .domains.actions
  ```
- **Uploading to the same named Artifact multiple times.**

  Due to how Artifacts are created in this new version, it is no longer possible to upload to the same named Artifact multiple times. You must either split the uploads into multiple Artifacts with different names, or only upload once.
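A common workaround for the one-upload rule is to parameterize the Artifact name per job, e.g. with a matrix value, so each job uploads to its own distinct Artifact. A minimal sketch — the `shardedArtifactName` helper and the matrix value are hypothetical, not part of this library:

```javascript
// Hypothetical helper: build a unique Artifact name per matrix job so each
// job uploads once to its own Artifact instead of re-uploading the same name.
function shardedArtifactName(base, shard) {
  return `${base}-${shard}`
}

// e.g. in a job running with matrix.os = 'ubuntu-latest':
console.log(shardedArtifactName('test-results', 'ubuntu-latest'))
// → test-results-ubuntu-latest
```

The resulting per-job Artifacts (`test-results-ubuntu-latest`, `test-results-windows-latest`, …) can then be listed or downloaded individually after the run.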
## Quick Start

Install the package:

```shell
npm i @actions/artifact
```
Import the module:

```js
// ES6 module
import artifact from '@actions/artifact'

// CommonJS
const {default: artifact} = require('@actions/artifact')
```
ℹ️ For a comprehensive list of classes, interfaces, functions and more, see the generated documentation.
## Examples

### Upload and Download
The most basic scenario is uploading one or more files to an Artifact, then downloading that Artifact. Downloads are based on the Artifact ID, which can be obtained in the response of `uploadArtifact`, `getArtifact`, `listArtifacts` or via the REST API.
```js
const {id, size} = await artifact.uploadArtifact(
  // name of the artifact
  'my-artifact',
  // files to include (supports absolute and relative paths)
  ['/absolute/path/file1.txt', './relative/file2.txt'],
  {
    // optional: how long to retain the artifact
    // if unspecified, defaults to repository/org retention settings (the limit of this value)
    retentionDays: 10
  }
)

console.log(`Created artifact with id: ${id} (bytes: ${size})`)

const {downloadPath} = await artifact.downloadArtifact(id, {
  // optional: download destination path. otherwise defaults to $GITHUB_WORKSPACE
  path: '/tmp/dst/path',
})

console.log(`Downloaded artifact ${id} to: ${downloadPath}`)
```
### Downloading from other workflow runs or repos

It may be useful to download Artifacts from other workflow runs, or even other repositories. By default, the permissions are scoped so they can only download Artifacts within the current workflow run. To elevate permissions for this scenario, you must specify `options.findBy` to `downloadArtifact`.
```js
const findBy = {
  // must have actions:read permission on target repository
  token: process.env['GITHUB_TOKEN'],
  workflowRunId: 123,
  repositoryOwner: 'actions',
  repositoryName: 'toolkit'
}

await artifact.downloadArtifact(1337, {
  findBy
})

// can also be used in other methods
await artifact.getArtifact('my-artifact', {
  findBy
})

await artifact.listArtifacts({
  findBy
})
```
### Speeding up large uploads

If you have large files that need to be uploaded (or file types that don't compress well), you may benefit from changing the compression level of the Artifact archive. NOTE: This is a tradeoff between artifact upload time and stored data size.
```js
await artifact.uploadArtifact('my-massive-artifact', ['big_file.bin'], {
  // The level of compression for Zlib to be applied to the artifact archive.
  // - 0: No compression
  // - 1: Best speed
  // - 6: Default compression (same as GNU Gzip)
  // - 9: Best compression
  compressionLevel: 0
})
```