Compare commits

...

34 Commits

Author SHA1 Message Date
Nathan Rajlich
45bd855250 Publish Stable
- @vercel/build-utils@5.5.4
 - vercel@28.4.7
 - @vercel/client@12.2.12
 - @vercel/go@2.2.12
 - @vercel/hydrogen@0.0.25
 - @vercel/next@3.2.3
 - @vercel/node@2.5.22
 - @vercel/python@3.1.21
 - @vercel/redwood@1.0.30
 - @vercel/remix@1.0.31
 - @vercel/ruby@1.3.38
 - @vercel/static-build@1.0.30
2022-10-05 12:41:42 -07:00
Nathan Rajlich
49de8ad9a0 [node] Update edge-runtime to v1.1.0-beta.37 (#8687)
Fixes error:

```
ENOENT: no such file or directory, open 'querystring'
```

Unfortunately, this issue only manifests when the package is installed externally, i.e. our tests didn't catch it since the `querystring` module is presumably present in the monorepo.
2022-10-05 19:33:53 +00:00
JJ Kasper
a1ea56fd67 [next] Update max route src check for generated route (#8689)
### Related Issues

This reduces the maximum length we check for when generating routes to ensure we stay under the 4096 limit after normalizing.

x-ref: https://github.com/vercel/customer-issues/issues/779
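
A minimal, hypothetical illustration of the intent (the 4096 figure comes from the description above; the constants and helper below are made up):

```ts
// Cap the generated route `src` below the hard limit so that normalization,
// which may lengthen the pattern, still leaves it under 4096 characters.
const HARD_LIMIT = 4096;
const SAFETY_MARGIN = 64; // hypothetical headroom for normalization
const MAX_GENERATED_SRC = HARD_LIMIT - SAFETY_MARGIN;

function isRouteSrcSafe(generatedSrc: string): boolean {
  return generatedSrc.length <= MAX_GENERATED_SRC;
}
```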

### 📋 Checklist

#### Tests

- [ ] The code changed/added as part of this PR has been covered with tests
- [ ] All tests pass locally with `yarn test-unit`

#### Code Review

- [ ] This PR has a concise title and thorough description useful to a reviewer
- [ ] Issue from task tracker has a link to this PR
2022-10-05 19:18:41 +00:00
JJ Kasper
e88addc9ed [next] Fix legacy pages/404 case (#8682)
### Related Issues

This ensures we handle the case where a lambda isn't present for `pages/404.js` with `getStaticProps`, which can occur in older Next.js versions, e.g. `v9.5.5`. This also adds a regression test for this specific version to ensure it is working as expected.

x-ref: https://github.com/vercel/vercel/pull/8663
Fixes: [slack thread](https://vercel.slack.com/archives/C03DQ3QFV7C/p1664945825621409)

### 📋 Checklist

#### Tests

- [ ] The code changed/added as part of this PR has been covered with tests
- [ ] All tests pass locally with `yarn test-unit`

#### Code Review

- [ ] This PR has a concise title and thorough description useful to a reviewer
- [ ] Issue from task tracker has a link to this PR
2022-10-05 14:21:10 +00:00
Peter van der Zee
5d50013f93 [build-utils] Allow file-ref sema to be controlled through env flag (#8681)
My IDE tells me `process` is unknown but mentions something about `package.json`, so that may just be a superficial issue. I guess CI/CD will tell me soon enough.

This adds an env flag to override the file ref semaphores so we can experiment with setting a higher limit.

One potential problem I'm seeing is that this is a generic semaphore for everything that uses this class. Not sure if that's going to work out as intended, but in that case we'll have to find a different way :)
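
For reference, a minimal sketch of the override pattern (mirroring the `FileRef`/`FileFsRef` diffs further below); note that current `async-sema` exposes `Sema` as a named export, and the env flag names are taken from the diff:

```ts
import { Sema } from 'async-sema';

const DEFAULT_SEMA = 5;

// VERCEL_INTERNAL_FILE_REF_SEMA overrides the default S3 download concurrency;
// non-numeric values fall back to the default. An analogous
// VERCEL_INTERNAL_FILE_FS_REF_SEMA flag covers filesystem refs.
const semaToDownloadFromS3 = new Sema(
  parseInt(
    process.env.VERCEL_INTERNAL_FILE_REF_SEMA || String(DEFAULT_SEMA),
    10
  ) || DEFAULT_SEMA
);

async function withDownloadSlot<T>(fn: () => Promise<T>): Promise<T> {
  await semaToDownloadFromS3.acquire();
  try {
    return await fn();
  } finally {
    semaToDownloadFromS3.release();
  }
}
```

So experimenting would presumably look like `VERCEL_INTERNAL_FILE_REF_SEMA=50 vercel build`.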
2022-10-05 13:21:27 +00:00
Lee Robinson
44e1eb3983 Update CLI README (#8675)
:another-one:
2022-10-04 04:48:10 +00:00
Lee Robinson
f8af013349 Update README (#8674)
Small changes 😄
2022-10-04 04:13:41 +00:00
Steven
972cc495ec [tests] Replace cancel-workflow-action with native cancel-in-progress (#8671)
This removes the `styfle/cancel-workflow-action` in favor of native GitHub Actions `cancel-in-progress`.

The cancel key is workflow+branch, but we don't want to cancel runs on the `main` branch.

https://docs.github.com/en/actions/using-jobs/using-concurrency
2022-10-03 23:49:28 +00:00
Steven
1c580da3d8 [cli] Fix vc build to error early when runtime is discontinued (#8669)
This moves an existing error from the build container to `vercel build`.

It's rare, but [Vercel Runtimes](https://vercel.com/docs/runtimes) might target a discontinued [AWS Lambda Runtime](https://docs.aws.amazon.com/lambda/latest/dg/lambda-runtimes.html), so we should fail fast when we know this has happened in `vercel build`.

A test has been added to demonstrate the failure using an old PHP version.
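
The added check boils down to something like the sketch below (condensed from the `build.ts` diff further down; I'm assuming both helpers are exported from `@vercel/build-utils`):

```ts
import { getDiscontinuedNodeVersions, NowBuildError } from '@vercel/build-utils';

// Fail fast in `vercel build` when a builder emits a Lambda whose runtime is
// already discontinued, instead of letting the build container reject it later.
function assertRuntimeSupported(use: string, lambdaRuntime: string): void {
  if (getDiscontinuedNodeVersions().some(o => o.runtime === lambdaRuntime)) {
    throw new NowBuildError({
      code: 'NODEJS_DISCONTINUED_VERSION',
      message: `The Runtime "${use}" is using "${lambdaRuntime}", which is discontinued.`,
      link: 'https://github.com/vercel/vercel/blob/main/DEVELOPING_A_RUNTIME.md#lambdaruntime',
    });
  }
}
```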
2022-10-03 22:07:02 +00:00
Steven
244554ab1b [tests] Remove nodejs12.x tests (#8667)
Now that `nodejs12.x` has passed the sunset date, new deployments will fail, so we need to update a few tests.

https://vercel.com/changelog/node-js-12-is-being-deprecated
2022-10-03 20:59:05 +00:00
Steven
053c185481 Publish Stable
- vercel@28.4.6
 - @vercel/client@12.2.11
 - @vercel/next@3.2.2
2022-10-03 10:07:07 -04:00
JJ Kasper
8805b586ea [next] Allow revalidating ISR 404 path itself (#8663)
### Related Issues

Fixes: https://vercel.slack.com/archives/C03S8ED1DKM/p1664521958768189

### 📋 Checklist

#### Tests

- [ ] The code changed/added as part of this PR has been covered with tests
- [ ] All tests pass locally with `yarn test-unit`

#### Code Review

- [ ] This PR has a concise title and thorough description useful to a reviewer
- [ ] Issue from task tracker has a link to this PR
2022-10-02 13:09:24 +00:00
Chris Barber
681070ffa0 [tests] Adding test for next builder OS path separator for serverless file refs (#8661)
Here's the test for https://github.com/vercel/vercel/pull/8657.

#### Tests

- [x] The code changed/added as part of this PR has been covered with tests
- [x] All tests pass locally with `yarn test-unit`

#### Code Review

- [ ] This PR has a concise title and thorough description useful to a reviewer
- [ ] Issue from task tracker has a link to this PR
2022-09-30 18:29:39 +00:00
Chris Barber
362b17d60a [next] Use OS path separator to match serverless file references (#8657)
When running `vc build` for a Next.js app, the Next builder executes the server build, which performs several steps. One of those steps traces each serverless function for any referenced files, and the raw list of files is then scrubbed and filtered. The filtering uses OS-specific file path comparisons to decide whether a file is of interest, so we need to use OS-specific path separators.

During testing on Windows, the traced serverless functions file list was always empty.
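
A minimal illustration of the failure mode (not the builder's actual code; the `.next/server` prefix is just an example):

```ts
import path from 'path';

// A traced file path is produced with the platform's separator...
const tracedFile = path.join('.next', 'server', 'pages', 'index.js');

// ...so comparing against a hardcoded '/' prefix never matches on Windows:
const brokenMatch = tracedFile.startsWith('.next/server/');

// Building the prefix with path.sep matches on every platform:
const fixedMatch = tracedFile.startsWith(`.next${path.sep}server${path.sep}`);

console.log({ brokenMatch, fixedMatch });
```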

### 📋 Checklist

#### Tests

- [ ] The code changed/added as part of this PR has been covered with tests
- [x] All tests pass locally with `yarn test-unit`

#### Code Review

- [ ] This PR has a concise title and thorough description useful to a reviewer
- [ ] Issue from task tracker has a link to this PR
2022-09-30 15:25:33 +00:00
JJ Kasper
c7c9b1a791 [next] Update RSC header in has routes (#8651)
### Related Issues

x-ref: https://github.com/vercel/next.js/pull/40979

### 📋 Checklist

#### Tests

- [ ] The code changed/added as part of this PR has been covered with tests
- [ ] All tests pass locally with `yarn test-unit`

#### Code Review

- [ ] This PR has a concise title and thorough description useful to a reviewer
- [ ] Issue from task tracker has a link to this PR
2022-09-28 13:24:01 -07:00
Nathan Rajlich
c42f309463 [cli] Print upload progress in increments of 25% when non-TTY (#8650)
When running `vc deploy` in a non-TTY context (i.e. CI), limit the number of progress updates to 25% increments (for a total of 5).

```
Uploading [--------------------] (0.0B/71.9MB)
Uploading [=====---------------] (18.0MB/71.9MB)
Uploading [==========----------] (36.0MB/71.9MB)
Uploading [===============-----] (54.0MB/71.9MB)
Uploading [====================] (71.9MB/71.9MB)
```

This avoids spamming the user logs with many progress updates.
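
Roughly, the throttling works like this (a simplified sketch of the `processDeployment` diff further below, with the bar rendering omitted):

```ts
// In a TTY every progress event repaints the bar (stepSize 0); otherwise only
// 25% steps are printed, giving at most five "Uploading" lines.
function makeProgressPrinter(totalBytes: number, isTTY: boolean) {
  const stepSize = isTTY ? 0 : 0.25;
  let nextStep = 0;
  return (uploadedBytes: number) => {
    const percent = uploadedBytes / totalBytes;
    if (percent >= nextStep) {
      console.log(`Uploading ${Math.round(percent * 100)}%`);
      nextStep += stepSize;
    }
  };
}

// Non-TTY example: prints at 0, 25, 50, 75 and 100 percent — five lines total.
const print = makeProgressPrinter(100, false);
[0, 10, 25, 50, 60, 75, 100].forEach(print);
```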
2022-09-28 19:33:33 +00:00
Sean Massa
a0ead28369 [tests] replace spinner messages with normal output during tests (#8634)
Convert spinner output to simple prints during test runs. This makes it easier to write tests against the output of commands.
2022-09-28 17:52:40 +00:00
chloetedder
8814fc1515 Publish Stable
- @vercel/build-utils@5.5.3
 - vercel@28.4.5
 - @vercel/client@12.2.10
 - @vercel/fs-detectors@3.4.1
 - @vercel/go@2.2.11
 - @vercel/hydrogen@0.0.24
 - @vercel/next@3.2.1
 - @vercel/node@2.5.21
 - @vercel/python@3.1.20
 - @vercel/redwood@1.0.29
 - @vercel/remix@1.0.30
 - @vercel/ruby@1.3.37
 - @vercel/static-build@1.0.29
2022-09-28 10:14:27 -05:00
chloetedder
0d044b4eac [fs-detectors] Use json5 parser for Rush to have valid json parsing (#8645)
### Related Issues

Parse `rush.json` files with `json5`, because it is very common for these files to have comments in them.

The [template that people clone for Rush](https://rushjs.io/pages/configs/rush_json/) has comments in it by default, and most people will clone it as-is.

The Rush docs also advise against using `JSON.parse`:

https://rushjs.io/pages/help/faq/#why-do-rushs-json-config-files-contain--comments-that-github-shows-in-red

Added tests with block comments and single-line comments.
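
For illustration, a minimal sketch of the difference (the config snippet is made up):

```ts
import JSON5 from 'json5';

const rushJson = `{
  // Rush's config template ships with comments like this one
  "rushVersion": "5.82.0",
  /* block comments appear as well */
  "projects": []
}`;

// JSON.parse(rushJson) throws a SyntaxError on the first comment,
// while JSON5.parse accepts it:
const parsed = JSON5.parse(rushJson);
console.log(parsed.rushVersion); // "5.82.0"
```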

### 📋 Checklist

#### Tests

- [ ] The code changed/added as part of this PR has been covered with tests
- [ ] All tests pass locally with `yarn test-unit`

#### Code Review

- [ ] This PR has a concise title and thorough description useful to a reviewer
- [ ] Issue from task tracker has a link to this PR

Co-authored-by: Sean Massa <EndangeredMassa@gmail.com>
2022-09-28 06:38:25 -05:00
Steven
f6bd1aa8c0 [tests] Remove console.log() from test (#8647)
Remove `console.log()` from test
2022-09-27 23:38:49 +00:00
Steven
8cd84ec066 Publish Stable
- @vercel/build-utils@5.5.2
 - vercel@28.4.4
 - @vercel/client@12.2.9
 - @vercel/go@2.2.10
 - @vercel/hydrogen@0.0.23
 - @vercel/next@3.2.0
 - @vercel/node@2.5.20
 - @vercel/python@3.1.19
 - @vercel/redwood@1.0.28
 - @vercel/remix@1.0.29
 - @vercel/ruby@1.3.36
 - @vercel/static-build@1.0.28
2022-09-27 16:35:00 -04:00
Steven
a8df231e4c [build-utils] Fix npm version detection for --legacy-peer-deps (#8646)
There was a case where the npm version wasn't decided based on the Node.js version but instead based on the lockfile.
This PR fixes the case where a newer npm version is detected based on the lockfile; the updated check is sketched below.

- Follow up to #8598
- Follow up to #8550
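
The updated condition, condensed from the `runNpmInstall` diff further below (the parameter object here is my own restating of it):

```ts
// npm 7+ is now detected either from the Node.js major version or from the
// npm 7 bin directory that lockfile-based selection puts on PATH, so the
// --legacy-peer-deps handling applies in both cases.
function isPotentiallyBrokenNpm(opts: {
  cliType: string;
  nodeMajor?: number;
  path?: string;
  args: string[];
  corepackEnabled: boolean;
}): boolean {
  return (
    opts.cliType === 'npm' &&
    (opts.nodeMajor === 16 || Boolean(opts.path?.includes('/node16/bin-npm7'))) &&
    !opts.args.includes('--legacy-peer-deps') &&
    !opts.corepackEnabled
  );
}
```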
2022-09-27 16:33:24 -04:00
Gal Schlezinger
f674842bed [next] read regions from middleware-manifest output (#8629)
### Related Issues

- Needs https://github.com/vercel/next.js/pull/40881 for this to be useful
- Resolves EC-238

### 📋 Checklist

#### Tests

- [ ] The code changed/added as part of this PR has been covered with tests
- [ ] All tests pass locally with `yarn test-unit`

#### Code Review

- [ ] This PR has a concise title and thorough description useful to a reviewer
- [ ] Issue from task tracker has a link to this PR

Co-authored-by: Steven <steven@ceriously.com>
Co-authored-by: JJ Kasper <jj@jjsweb.site>
2022-09-27 10:46:02 -07:00
Steven
bf5cfa9a41 Publish Stable
- @vercel/build-utils@5.5.1
 - vercel@28.4.3
 - @vercel/client@12.2.8
 - @vercel/go@2.2.9
 - @vercel/hydrogen@0.0.22
 - @vercel/next@3.1.30
 - @vercel/node@2.5.19
 - @vercel/python@3.1.18
 - @vercel/redwood@1.0.27
 - @vercel/remix@1.0.28
 - @vercel/ruby@1.3.35
 - @vercel/static-build@1.0.27
2022-09-27 09:39:04 -04:00
JJ Kasper
12121b7a71 [next] Fix build time 404 route with trailingSlash: true (#8627)
This ensures we properly handle trailing slashes in the `notFound: true` build time routes. 

Fixes: [slack thread](https://vercel.slack.com/archives/C03S8ED1DKM/p1663691719703509)
2022-09-27 09:21:49 -04:00
Steven
baa56aed2c [tests] Fix timeout for actions/setup-node (#8639)
Try fixing the timeout again. For example: https://github.com/vercel/vercel/actions/runs/3130757219/jobs/5081381465

- Follow up to #8613 
- Related to https://github.com/actions/cache/issues/810 
- Related to https://github.com/Azure/azure-sdk-for-js/issues/22321
2022-09-26 18:33:43 -04:00
Steven
6f767367e4 [build-utils] Adjust nodejs12.x discontinueDate to Monday (#8638)
The previous date was on a Saturday, so let's move it to the following Monday to ensure support tickets are answered quickly.
2022-09-26 22:22:32 +00:00
Sean Massa
0e4124f94c [cli] fix vc bisect off by one (#8399)
Co-authored-by: Nathan Rajlich <n@n8.io>
2022-09-26 14:47:15 -05:00
Sean Massa
30503d0a3f [go] remove unused, breaking watchlist from Go builder (#8633) 2022-09-26 14:31:36 -05:00
Steven
6c9164f67d [cli] Refactor doBuild to return void (#8626)
This PR refactors `doBuild()` to return `void`.

This will prevent accidental bugs like #8623 where an exit code number was returned instead of throwing on error.
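
The shape of the refactor, as a sketch (the helper declarations stand in for the real CLI internals):

```ts
declare function doBuild(): Promise<void>;
declare function prettyError(err: unknown): void;

// doBuild no longer returns a number, so a stray `return 1` deep inside the
// build can't be mistaken for a result — failures must be thrown and the exit
// code is decided in exactly one place.
async function main(): Promise<number> {
  try {
    await doBuild();
    return 0;
  } catch (err) {
    prettyError(err); // the real code also serializes the error into builds.json
    return 1;
  }
}
```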
2022-09-24 01:25:59 +00:00
Steven
906b7a8f2c [cli] Fix invalid vercel.json config error serialization (#8623)
Follow-up to #8622: we should be throwing errors so the error is correctly serialized to `builds.json`.
2022-09-23 22:08:53 +00:00
Steven
43499b13d8 [cli] Add vercel.json validation to vercel build (#8622)
We were doing this validation in `vercel dev` but not `vercel build`.

This PR adds `vercel.json` validation to `vercel build` too.

Note: I am calling this a patch because an invalid `vercel.json` was already failing when passed to the API, so this change just surfaces a nicer error message earlier in the process.
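
A sketch of the early validation (mirroring the `doBuild` diff further below; the signature of `validateConfig` here is my assumption):

```ts
declare function validateConfig(config: Record<string, unknown>): Error | null;

function assertValidLocalConfig(localConfig: Record<string, unknown>): void {
  const validateError = validateConfig(localConfig);
  if (validateError) {
    // thrown errors are pretty-printed and end up serialized in builds.json
    throw validateError;
  }
}
```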
2022-09-23 20:33:28 +00:00
Sean Massa
7d6e56670f [cli][dev] Add strict mode to vc dev edge function handlers (#8616)
Add strict mode to `vc dev` edge function handlers. This is behind a flag in production, but that flag has been at 100% for a while, so it seems safe to include it here unconditionally.

Also remove legal comments.

These changes bring `vc dev` edge function support closer to production.
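
An assumption-heavy sketch of what the two changes would look like if the edge handlers are bundled with esbuild (the entrypoint and output paths are made up; the real wiring may differ):

```ts
import { build } from 'esbuild';

async function bundleEdgeHandler(): Promise<void> {
  await build({
    entryPoints: ['api/edge-handler.ts'], // hypothetical entrypoint
    bundle: true,
    legalComments: 'none',           // drop @license/@preserve comments
    banner: { js: '"use strict";' }, // opt the bundled handler into strict mode
    outfile: '.output/edge-handler.js',
  });
}
```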
2022-09-23 14:23:03 +00:00
Sean Massa
dba337f148 [cli][dev] extract edge/serverless handler logic into separate files (#8615) 2022-09-22 16:34:32 -05:00
99 changed files with 1355 additions and 849 deletions

View File

@@ -1,17 +0,0 @@
name: Cancel
on:
push:
branches:
- '**'
- '!main'
jobs:
cancel:
name: 'Cancel Previous Runs'
runs-on: ubuntu-latest
timeout-minutes: 2
steps:
- uses: styfle/cancel-workflow-action@0.9.1
with:
workflow_id: test.yml, test-integration-cli.yml, test-unit.yml
access_token: ${{ github.token }}

View File

@@ -37,8 +37,7 @@ jobs:
- name: Setup Node
if: ${{ steps.check-release.outputs.IS_RELEASE == 'true' }}
uses: actions/setup-node@v3
env:
SEGMENT_DOWNLOAD_TIMEOUT_MIN: 5 # https://github.com/actions/cache/issues/810
timeout-minutes: 5 # See https://github.com/actions/cache/issues/810
with:
node-version: 14
cache: 'yarn'

View File

@@ -13,6 +13,10 @@ env:
TURBO_TEAM: 'vercel'
TURBO_TOKEN: ${{ secrets.TURBO_TOKEN }}
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: ${{ github.ref != 'refs/heads/main' }}
jobs:
test:
name: CLI
@@ -31,8 +35,7 @@ jobs:
with:
go-version: '1.13.15'
- uses: actions/setup-node@v3
env:
SEGMENT_DOWNLOAD_TIMEOUT_MIN: 5 # https://github.com/actions/cache/issues/810
timeout-minutes: 5 # See https://github.com/actions/cache/issues/810
with:
node-version: ${{ matrix.node }}
cache: 'yarn'

View File

@@ -13,6 +13,10 @@ env:
TURBO_TEAM: 'vercel'
TURBO_TOKEN: ${{ secrets.TURBO_TOKEN }}
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: ${{ github.ref != 'refs/heads/main' }}
jobs:
test:
name: Unit
@@ -31,8 +35,7 @@ jobs:
with:
fetch-depth: 2
- uses: actions/setup-node@v3
env:
SEGMENT_DOWNLOAD_TIMEOUT_MIN: 5 # https://github.com/actions/cache/issues/810
timeout-minutes: 5 # See https://github.com/actions/cache/issues/810
with:
node-version: ${{ matrix.node }}
cache: 'yarn'

View File

@@ -14,6 +14,10 @@ env:
TURBO_TEAM: 'vercel'
TURBO_TOKEN: ${{ secrets.TURBO_TOKEN }}
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: ${{ github.ref != 'refs/heads/main' }}
jobs:
setup:
name: Find Changes
@@ -29,8 +33,7 @@ jobs:
with:
go-version: '1.13.15'
- uses: actions/setup-node@v3
env:
SEGMENT_DOWNLOAD_TIMEOUT_MIN: 5 # https://github.com/actions/cache/issues/810
timeout-minutes: 5 # See https://github.com/actions/cache/issues/810
with:
node-version: ${{ env.NODE_VERSION }}
cache: 'yarn'
@@ -67,8 +70,7 @@ jobs:
with:
go-version: '1.13.15'
- uses: actions/setup-node@v3
env:
SEGMENT_DOWNLOAD_TIMEOUT_MIN: 5 # https://github.com/actions/cache/issues/810
timeout-minutes: 5 # See https://github.com/actions/cache/issues/810
with:
node-version: ${{ env.NODE_VERSION }}
cache: 'yarn'

View File

@@ -380,8 +380,8 @@ This is a [class](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Refere
This is an abstract enumeration type that is implemented by one of the following possible `String` values:
- `nodejs16.x`
- `nodejs14.x`
- `nodejs12.x`
- `go1.x`
- `java11`
- `python3.9`

View File

@@ -19,11 +19,9 @@
## Vercel
Vercel is a platform for **static sites and frontend frameworks**, built to integrate with your headless content, commerce, or database.
Vercel is the platform for frontend developers, providing the speed and reliability innovators need to create at the moment of inspiration.
We provide a **frictionless developer experience** to take care of the hard things: deploy instantly, scale automatically, and serve personalized content around the globe.
We make it easy for frontend teams to **develop, preview, and ship** delightful user experiences, where performance is the default.
We enable teams to iterate quickly and develop, preview, and ship delightful user experiences. Vercel has zero-configuration support for 35+ frontend frameworks and integrates with your headless content, commerce, or database of choice.
## Deploy

View File

@@ -1,6 +1,6 @@
{
"name": "@vercel/build-utils",
"version": "5.5.0",
"version": "5.5.4",
"license": "MIT",
"main": "./dist/index.js",
"types": "./dist/index.d.js",

View File

@@ -61,14 +61,14 @@ export function getPrettyError(obj: {
}
return new NowBuildError({
code: 'DEV_VALIDATE_CONFIG',
code: 'INVALID_VERCEL_CONFIG',
message: message,
link: prop ? `${docsUrl}#project/${prop.toLowerCase()}` : docsUrl,
action: 'View Documentation',
});
} catch (e) {
return new NowBuildError({
code: 'DEV_VALIDATE_CONFIG',
code: 'INVALID_VERCEL_CONFIG',
message: `Failed to validate configuration.`,
link: docsUrl,
action: 'View Documentation',

View File

@@ -5,7 +5,13 @@ import path from 'path';
import Sema from 'async-sema';
import { FileBase } from './types';
const semaToPreventEMFILE = new Sema(20);
const DEFAULT_SEMA = 20;
const semaToPreventEMFILE = new Sema(
parseInt(
process.env.VERCEL_INTERNAL_FILE_FS_REF_SEMA || String(DEFAULT_SEMA),
10
) || DEFAULT_SEMA
);
interface FileFsRefOptions {
mode?: number;

View File

@@ -12,7 +12,13 @@ interface FileRefOptions {
mutable?: boolean;
}
const semaToDownloadFromS3 = new Sema(5);
const DEFAULT_SEMA = 5;
const semaToDownloadFromS3 = new Sema(
parseInt(
process.env.VERCEL_INTERNAL_FILE_REF_SEMA || String(DEFAULT_SEMA),
10
) || DEFAULT_SEMA
);
class BailableError extends Error {
public bail: boolean;

View File

@@ -10,7 +10,7 @@ const allOptions = [
major: 12,
range: '12.x',
runtime: 'nodejs12.x',
discontinueDate: new Date('2022-10-01'),
discontinueDate: new Date('2022-10-03'),
},
{
major: 10,

View File

@@ -461,7 +461,8 @@ export async function runNpmInstall(
let commandArgs: string[];
const isPotentiallyBrokenNpm =
cliType === 'npm' &&
nodeVersion?.major === 16 &&
(nodeVersion?.major === 16 ||
opts.env.PATH?.includes('/node16/bin-npm7')) &&
!args.includes('--legacy-peer-deps') &&
spawnOpts?.env?.ENABLE_EXPERIMENTAL_COREPACK !== '1';

View File

@@ -0,0 +1,2 @@
node_modules
.vercel

View File

@@ -0,0 +1,64 @@
{
"requires": true,
"lockfileVersion": 1,
"dependencies": {
"js-tokens": {
"version": "4.0.0",
"resolved": "https://registry.npmjs.org/js-tokens/-/js-tokens-4.0.0.tgz",
"integrity": "sha512-RdJUflcE3cUzKiMqQgsCu06FPu9UdIJO0beYbPhHN4k6apgJtifcoCtT9bcxOpYBtpD2kCM6Sbzg4CausW/PKQ=="
},
"loose-envify": {
"version": "1.4.0",
"resolved": "https://registry.npmjs.org/loose-envify/-/loose-envify-1.4.0.tgz",
"integrity": "sha512-lyuxPGr/Wfhrlem2CL/UcnUc1zcqKAImBDzukY7Y5F/yQiNdko6+fRLevlw1HgMySw7f611UIY408EtxRSoK3Q==",
"requires": {
"js-tokens": "^3.0.0 || ^4.0.0"
}
},
"object-assign": {
"version": "4.1.1",
"resolved": "https://registry.npmjs.org/object-assign/-/object-assign-4.1.1.tgz",
"integrity": "sha512-rJgTQnkUnH1sFw8yT6VSU3zD3sWmu6sZhIseY8VX+GRu3P6F7Fu+JNDoXfklElbLJSnc3FUQHVe4cU5hj+BcUg=="
},
"prop-types": {
"version": "15.8.1",
"resolved": "https://registry.npmjs.org/prop-types/-/prop-types-15.8.1.tgz",
"integrity": "sha512-oj87CgZICdulUohogVAR7AjlC0327U4el4L6eAvOqCeudMDVU0NThNaV+b9Df4dXgSP1gXMTnPdhfe/2qDH5cg==",
"requires": {
"loose-envify": "^1.4.0",
"object-assign": "^4.1.1",
"react-is": "^16.13.1"
}
},
"react": {
"version": "16.8.0",
"resolved": "https://registry.npmjs.org/react/-/react-16.8.0.tgz",
"integrity": "sha512-g+nikW2D48kqgWSPwNo0NH9tIGG3DsQFlrtrQ1kj6W77z5ahyIHG0w8kPpz4Sdj6gyLnz0lEd/xsjOoGge2MYQ==",
"requires": {
"loose-envify": "^1.1.0",
"object-assign": "^4.1.1",
"prop-types": "^15.6.2",
"scheduler": "^0.13.0"
}
},
"react-is": {
"version": "16.13.1",
"resolved": "https://registry.npmjs.org/react-is/-/react-is-16.13.1.tgz",
"integrity": "sha512-24e6ynE2H+OKt4kqsOvNd8kBpV65zoxbA4BVsEOB3ARVWQki/DHzaUoC5KuON/BiccDaCCTZBuOcfZs70kR8bQ=="
},
"scheduler": {
"version": "0.13.6",
"resolved": "https://registry.npmjs.org/scheduler/-/scheduler-0.13.6.tgz",
"integrity": "sha512-IWnObHt413ucAYKsD9J1QShUKkbKLQQHdxRyw73sw4FN26iWr3DY/H34xGPe4nmL1DwXyWmSWmMrA9TfQbE/XQ==",
"requires": {
"loose-envify": "^1.1.0",
"object-assign": "^4.1.1"
}
},
"swr": {
"version": "1.3.0",
"resolved": "https://registry.npmjs.org/swr/-/swr-1.3.0.tgz",
"integrity": "sha512-dkghQrOl2ORX9HYrMDtPa7LTVHJjCTeZoB1dqTbnnEDlSvN8JEKpYIYurDfvbQFUUS8Cg8PceFVZNkW0KNNYPw=="
}
}
}

View File

@@ -0,0 +1,11 @@
{
"private": true,
"scripts": {
"build": "mkdir -p public && echo 'legacy peer deps' > public/index.txt"
},
"packageManager": "npm@6.14.17",
"dependencies": {
"swr": "1.3.0",
"react": "16.8.0"
}
}

View File

@@ -0,0 +1,3 @@
{
"probes": [{ "path": "/", "mustContain": "legacy peer deps" }]
}

View File

@@ -34,7 +34,7 @@ it('should not include peer dependencies when missing VERCEL_NPM_LEGACY_PEER_DEP
const fixture = path.join(__dirname, 'fixtures', '20-npm-7');
const meta: Meta = {};
const spawnOpts = getTestSpawnOpts({});
const nodeVersion = { major: 16 } as any;
const nodeVersion = getNodeVersion(16);
await runNpmInstall(fixture, [], spawnOpts, meta, nodeVersion);
expect(spawnMock.mock.calls.length).toBe(1);
const args = spawnMock.mock.calls[0];
@@ -71,10 +71,35 @@ it('should include peer dependencies when VERCEL_NPM_LEGACY_PEER_DEPS=1 on node1
});
});
it('should not include peer dependencies when VERCEL_NPM_LEGACY_PEER_DEPS=1 on node14', async () => {
it('should include peer dependencies when VERCEL_NPM_LEGACY_PEER_DEPS=1 on node14 and npm7+', async () => {
const fixture = path.join(__dirname, 'fixtures', '20-npm-7');
const meta: Meta = {};
const spawnOpts = getTestSpawnOpts({ VERCEL_NPM_LEGACY_PEER_DEPS: '1' });
const nodeVersion = getNodeVersion(14);
await runNpmInstall(fixture, [], spawnOpts, meta, nodeVersion);
expect(spawnMock.mock.calls.length).toBe(1);
const args = spawnMock.mock.calls[0];
expect(args[0]).toEqual('npm');
expect(args[1]).toEqual([
'install',
'--no-audit',
'--unsafe-perm',
'--legacy-peer-deps',
]);
expect(args[2]).toEqual({
cwd: fixture,
prettyCommand: 'npm install',
stdio: 'inherit',
env: expect.any(Object),
});
});
it('should not include peer dependencies when VERCEL_NPM_LEGACY_PEER_DEPS=1 on node14 and npm6', async () => {
const fixture = path.join(__dirname, 'fixtures', '14-npm-6-legacy-peer-deps');
const meta: Meta = {};
const spawnOpts = getTestSpawnOpts({ VERCEL_NPM_LEGACY_PEER_DEPS: '1' });
const nodeVersion = getNodeVersion(14);
await runNpmInstall(fixture, [], spawnOpts, meta, nodeVersion);
expect(spawnMock.mock.calls.length).toBe(1);

View File

@@ -216,10 +216,6 @@ it('should download symlinks even with incorrect file', async () => {
});
it('should only match supported node versions, otherwise throw an error', async () => {
expect(await getSupportedNodeVersion('12.x', false)).toHaveProperty(
'major',
12
);
expect(await getSupportedNodeVersion('14.x', false)).toHaveProperty(
'major',
14
@@ -240,10 +236,6 @@ it('should only match supported node versions, otherwise throw an error', async
await expectBuilderError(getSupportedNodeVersion('foo', true), autoMessage);
await expectBuilderError(getSupportedNodeVersion('=> 10', true), autoMessage);
expect(await getSupportedNodeVersion('12.x', true)).toHaveProperty(
'major',
12
);
expect(await getSupportedNodeVersion('14.x', true)).toHaveProperty(
'major',
14
@@ -273,21 +265,21 @@ it('should only match supported node versions, otherwise throw an error', async
it('should match all semver ranges', async () => {
// See https://docs.npmjs.com/files/package.json#engines
expect(await getSupportedNodeVersion('12.0.0')).toHaveProperty('major', 12);
expect(await getSupportedNodeVersion('12.x')).toHaveProperty('major', 12);
expect(await getSupportedNodeVersion('14.0.0')).toHaveProperty('major', 14);
expect(await getSupportedNodeVersion('14.x')).toHaveProperty('major', 14);
expect(await getSupportedNodeVersion('>=10')).toHaveProperty('major', 16);
expect(await getSupportedNodeVersion('>=10.3.0')).toHaveProperty('major', 16);
expect(await getSupportedNodeVersion('11.5.0 - 12.5.0')).toHaveProperty(
expect(await getSupportedNodeVersion('16.5.0 - 16.9.0')).toHaveProperty(
'major',
12
16
);
expect(await getSupportedNodeVersion('>=9.5.0 <=12.5.0')).toHaveProperty(
expect(await getSupportedNodeVersion('>=9.5.0 <=14.5.0')).toHaveProperty(
'major',
12
14
);
expect(await getSupportedNodeVersion('~12.5.0')).toHaveProperty('major', 12);
expect(await getSupportedNodeVersion('^12.5.0')).toHaveProperty('major', 12);
expect(await getSupportedNodeVersion('12.5.0 - 14.5.0')).toHaveProperty(
expect(await getSupportedNodeVersion('~14.5.0')).toHaveProperty('major', 14);
expect(await getSupportedNodeVersion('^14.5.0')).toHaveProperty('major', 14);
expect(await getSupportedNodeVersion('14.5.0 - 14.20.0')).toHaveProperty(
'major',
14
);
@@ -434,8 +426,8 @@ it('should warn for deprecated versions, soon to be discontinued', async () => {
expect(warningMessages).toStrictEqual([
'Error: Node.js version 10.x has reached End-of-Life. Deployments created on or after 2021-04-20 will fail to build. Please set "engines": { "node": "16.x" } in your `package.json` file to use Node.js 16.',
'Error: Node.js version 10.x has reached End-of-Life. Deployments created on or after 2021-04-20 will fail to build. Please set Node.js Version to 16.x in your Project Settings to use Node.js 16.',
'Error: Node.js version 12.x has reached End-of-Life. Deployments created on or after 2022-10-01 will fail to build. Please set "engines": { "node": "16.x" } in your `package.json` file to use Node.js 16.',
'Error: Node.js version 12.x has reached End-of-Life. Deployments created on or after 2022-10-01 will fail to build. Please set Node.js Version to 16.x in your Project Settings to use Node.js 16.',
'Error: Node.js version 12.x has reached End-of-Life. Deployments created on or after 2022-10-03 will fail to build. Please set "engines": { "node": "16.x" } in your `package.json` file to use Node.js 16.',
'Error: Node.js version 12.x has reached End-of-Life. Deployments created on or after 2022-10-03 will fail to build. Please set Node.js Version to 16.x in your Project Settings to use Node.js 16.',
]);
global.Date.now = realDateNow;

View File

@@ -10,11 +10,9 @@
## Usage
Vercel is a platform for **static sites and frontend frameworks**, built to integrate with your headless content, commerce, or database.
Vercel is the platform for frontend developers, providing the speed and reliability innovators need to create at the moment of inspiration.
We provide a **frictionless developer experience** to take care of the hard things: deploy instantly, scale automatically, and serve personalized content around the globe.
We make it easy for frontend teams to **develop, preview, and ship** delightful user experiences, where performance is the default.
We enable teams to iterate quickly and develop, preview, and ship delightful user experiences. Vercel has zero-configuration support for 35+ frontend frameworks and integrates with your headless content, commerce, or database of choice.
To install the latest version of Vercel CLI, run this command:

View File

@@ -1,6 +1,6 @@
{
"name": "vercel",
"version": "28.4.2",
"version": "28.4.7",
"preferGlobal": true,
"license": "Apache-2.0",
"description": "The command-line interface for Vercel",
@@ -41,16 +41,16 @@
"node": ">= 14"
},
"dependencies": {
"@vercel/build-utils": "5.5.0",
"@vercel/go": "2.2.8",
"@vercel/hydrogen": "0.0.21",
"@vercel/next": "3.1.29",
"@vercel/node": "2.5.18",
"@vercel/python": "3.1.17",
"@vercel/redwood": "1.0.26",
"@vercel/remix": "1.0.27",
"@vercel/ruby": "1.3.34",
"@vercel/static-build": "1.0.26",
"@vercel/build-utils": "5.5.4",
"@vercel/go": "2.2.12",
"@vercel/hydrogen": "0.0.25",
"@vercel/next": "3.2.3",
"@vercel/node": "2.5.22",
"@vercel/python": "3.1.21",
"@vercel/redwood": "1.0.30",
"@vercel/remix": "1.0.31",
"@vercel/ruby": "1.3.38",
"@vercel/static-build": "1.0.30",
"update-notifier": "5.1.0"
},
"devDependencies": {
@@ -95,9 +95,9 @@
"@types/which": "1.3.2",
"@types/write-json-file": "2.2.1",
"@types/yauzl-promise": "2.1.0",
"@vercel/client": "12.2.7",
"@vercel/client": "12.2.12",
"@vercel/frameworks": "1.1.6",
"@vercel/fs-detectors": "3.4.0",
"@vercel/fs-detectors": "3.4.1",
"@vercel/fun": "1.0.4",
"@vercel/ncc": "0.24.0",
"@zeit/source-map-support": "0.6.2",

View File

@@ -2,7 +2,6 @@ import open from 'open';
import boxen from 'boxen';
import execa from 'execa';
import plural from 'pluralize';
import inquirer from 'inquirer';
import { resolve } from 'path';
import chalk, { Chalk } from 'chalk';
import { URLSearchParams, parse } from 'url';
@@ -150,7 +149,9 @@ export default async function main(client: Client): Promise<number> {
if (badDeployment) {
if (badDeployment instanceof Error) {
badDeployment.message += ` "${bad}"`;
badDeployment.message += ` when requesting bad deployment "${normalizeURL(
bad
)}"`;
output.prettyError(badDeployment);
return 1;
}
@@ -165,7 +166,9 @@ export default async function main(client: Client): Promise<number> {
if (goodDeployment) {
if (goodDeployment instanceof Error) {
goodDeployment.message += ` "${good}"`;
goodDeployment.message += ` when requesting good deployment "${normalizeURL(
good
)}"`;
output.prettyError(goodDeployment);
return 1;
}
@@ -226,7 +229,8 @@ export default async function main(client: Client): Promise<number> {
// If we have the "good" deployment in this chunk, then we're done
for (let i = 0; i < newDeployments.length; i++) {
if (newDeployments[i].url === good) {
newDeployments = newDeployments.slice(0, i + 1);
// grab all deployments up until the good one
newDeployments = newDeployments.slice(0, i);
next = undefined;
break;
}
@@ -316,7 +320,7 @@ export default async function main(client: Client): Promise<number> {
if (openEnabled) {
await open(testUrl);
}
const answer = await inquirer.prompt({
const answer = await client.prompt({
type: 'expand',
name: 'action',
message: 'Select an action:',

View File

@@ -3,6 +3,7 @@ import chalk from 'chalk';
import dotenv from 'dotenv';
import { join, normalize, relative, resolve } from 'path';
import {
getDiscontinuedNodeVersions,
normalizePath,
Files,
FileFsRef,
@@ -25,6 +26,7 @@ import {
MergeRoutesProps,
Route,
} from '@vercel/routing-utils';
import { fileNameSymbol } from '@vercel/client';
import type { VercelConfig } from '@vercel/client';
import pull from './pull';
@@ -54,6 +56,7 @@ import { importBuilders } from '../util/build/import-builders';
import { initCorepack, cleanupCorepack } from '../util/build/corepack';
import { sortBuilders } from '../util/build/sort-builders';
import { toEnumerableError } from '../util/error';
import { validateConfig } from '../util/validate-config';
type BuildResult = BuildResultV2 | BuildResultV3;
@@ -232,7 +235,8 @@ export default async function main(client: Client): Promise<number> {
process.env.VERCEL = '1';
process.env.NOW_BUILDER = '1';
return await doBuild(client, project, buildsJson, cwd, outputDir);
await doBuild(client, project, buildsJson, cwd, outputDir);
return 0;
} catch (err: any) {
output.prettyError(err);
@@ -265,23 +269,36 @@ async function doBuild(
buildsJson: BuildsManifest,
cwd: string,
outputDir: string
): Promise<number> {
): Promise<void> {
const { output } = client;
const workPath = join(cwd, project.settings.rootDirectory || '.');
// Load `package.json` and `vercel.json` files
const [pkg, vercelConfig] = await Promise.all([
const [pkg, vercelConfig, nowConfig] = await Promise.all([
readJSONFile<PackageJson>(join(workPath, 'package.json')),
readJSONFile<VercelConfig>(join(workPath, 'vercel.json')).then(
config => config || readJSONFile<VercelConfig>(join(workPath, 'now.json'))
),
readJSONFile<VercelConfig>(join(workPath, 'vercel.json')),
readJSONFile<VercelConfig>(join(workPath, 'now.json')),
]);
if (pkg instanceof CantParseJSONFile) throw pkg;
if (vercelConfig instanceof CantParseJSONFile) throw vercelConfig;
if (nowConfig instanceof CantParseJSONFile) throw nowConfig;
if (vercelConfig) {
vercelConfig[fileNameSymbol] = 'vercel.json';
} else if (nowConfig) {
nowConfig[fileNameSymbol] = 'now.json';
}
const localConfig = vercelConfig || nowConfig || {};
const validateError = validateConfig(localConfig);
if (validateError) {
throw validateError;
}
const projectSettings = {
...project.settings,
...pickOverrides(vercelConfig || {}),
...pickOverrides(localConfig),
};
// Get a list of source files
@@ -289,12 +306,12 @@ async function doBuild(
normalizePath(relative(workPath, f))
);
const routesResult = getTransformedRoutes(vercelConfig || {});
const routesResult = getTransformedRoutes(localConfig);
if (routesResult.error) {
throw routesResult.error;
}
if (vercelConfig?.builds && vercelConfig.functions) {
if (localConfig.builds && localConfig.functions) {
throw new NowBuildError({
code: 'bad_request',
message:
@@ -303,7 +320,7 @@ async function doBuild(
});
}
let builds = vercelConfig?.builds || [];
let builds = localConfig.builds || [];
let zeroConfigRoutes: Route[] = [];
let isZeroConfig = false;
@@ -318,7 +335,7 @@ async function doBuild(
// Detect the Vercel Builders that will need to be invoked
const detectedBuilders = await detectBuilders(files, pkg, {
...vercelConfig,
...localConfig,
projectSettings,
ignoreBuildScript: true,
featHandleMiss: true,
@@ -395,13 +412,10 @@ async function doBuild(
})
);
buildsJson.builds = Array.from(buildsJsonBuilds.values());
const buildsJsonPath = join(outputDir, 'builds.json');
const writeBuildsJsonPromise = fs.writeJSON(buildsJsonPath, buildsJson, {
await fs.writeJSON(join(outputDir, 'builds.json'), buildsJson, {
spaces: 2,
});
ops.push(writeBuildsJsonPromise);
// The `meta` config property is re-used for each Builder
// invocation so that Builders can share state between
// subsequent entrypoint builds.
@@ -454,6 +468,25 @@ async function doBuild(
);
const buildResult = await builder.build(buildOptions);
if (
buildResult &&
'output' in buildResult &&
'runtime' in buildResult.output &&
'type' in buildResult.output &&
buildResult.output.type === 'Lambda'
) {
const lambdaRuntime = buildResult.output.runtime;
if (
getDiscontinuedNodeVersions().some(o => o.runtime === lambdaRuntime)
) {
throw new NowBuildError({
code: 'NODEJS_DISCONTINUED_VERSION',
message: `The Runtime "${build.use}" is using "${lambdaRuntime}", which is discontinued. Please upgrade your Runtime to a more recent version or consult the author for more details.`,
link: 'https://github.com/vercel/vercel/blob/main/DEVELOPING_A_RUNTIME.md#lambdaruntime',
});
}
}
// Store the build result to generate the final `config.json` after
// all builds have completed
buildResults.set(build, buildResult);
@@ -466,7 +499,7 @@ async function doBuild(
build,
builder,
builderPkg,
vercelConfig
localConfig
).then(
override => {
if (override) overrides.push(override);
@@ -475,26 +508,11 @@ async function doBuild(
)
);
} catch (err: any) {
output.prettyError(err);
const writeConfigJsonPromise = fs.writeJSON(
join(outputDir, 'config.json'),
{ version: 3 },
{ spaces: 2 }
);
await Promise.all([writeBuildsJsonPromise, writeConfigJsonPromise]);
const buildJsonBuild = buildsJsonBuilds.get(build);
if (buildJsonBuild) {
buildJsonBuild.error = toEnumerableError(err);
await fs.writeJSON(buildsJsonPath, buildsJson, {
spaces: 2,
});
}
return 1;
throw err;
}
}
@@ -555,150 +573,7 @@ async function doBuild(
builds: builderRoutes,
});
const images = vercelConfig?.images
if (images) {
if (typeof images !== 'object') {
throw new Error(
`vercel.json "images" should be an object received ${typeof images}.`
);
}
if (!Array.isArray(images.domains)) {
throw new Error(
`vercel.json "images.domains" should be an Array received ${typeof images.domains}.`
);
}
if (images.domains.length > 50) {
throw new Error(
`vercel.json "images.domains" exceeds length of 50 received length (${images.domains.length}).`
);
}
const invalidImageDomains = images.domains.filter(
(d: unknown) => typeof d !== 'string'
);
if (invalidImageDomains.length > 0) {
throw new Error(
`vercel.json "images.domains" should be an Array of strings received invalid values (${invalidImageDomains.join(
', '
)}).`
);
}
if (images.remotePatterns) {
if (!Array.isArray(images.remotePatterns)) {
throw new Error(
`vercel.json "images.remotePatterns" should be an Array received ${typeof images.remotePatterns}.`
);
}
if (images.remotePatterns.length > 50) {
throw new Error(
`vercel.json "images.remotePatterns" exceeds length of 50, received length (${images.remotePatterns.length}).`
);
}
const validProps = new Set(['protocol', 'hostname', 'pathname', 'port']);
const requiredProps = ['hostname'];
const invalidPatterns = images.remotePatterns.filter(
(d: unknown) =>
!d ||
typeof d !== 'object' ||
Object.entries(d).some(
([k, v]) => !validProps.has(k) || typeof v !== 'string'
) ||
requiredProps.some(k => !(k in d))
);
if (invalidPatterns.length > 0) {
throw new Error(
`vercel.json "images.remotePatterns" received invalid values:\n${invalidPatterns
.map(item => JSON.stringify(item))
.join(
'\n'
)}\n\nremotePatterns value must follow format { protocol: 'https', hostname: 'example.com', port: '', pathname: '/imgs/**' }.`
);
}
}
if (!Array.isArray(images.sizes)) {
throw new Error(
`vercel.json "images.sizes" should be an Array received ${typeof images.sizes}.`
);
}
if (images.sizes.length < 1 || images.sizes.length > 50) {
throw new Error(
`vercel.json "images.sizes" should be an Array of length between 1 to 50 received length (${images.sizes.length}).`
);
}
const invalidImageSizes = images.sizes.filter((d: unknown) => {
return typeof d !== 'number' || d < 1 || d > 10000;
});
if (invalidImageSizes.length > 0) {
throw new Error(
`vercel.json "images.sizes" should be an Array of numbers that are between 1 and 10000, received invalid values (${invalidImageSizes.join(
', '
)}).`
);
}
if (images.minimumCacheTTL) {
if (
!Number.isInteger(images.minimumCacheTTL) ||
images.minimumCacheTTL < 0
) {
throw new Error(
`vercel.json "images.minimumCacheTTL" should be an integer 0 or more received (${images.minimumCacheTTL}).`
);
}
}
if (images.formats) {
if (!Array.isArray(images.formats)) {
throw new Error(
`vercel.json "images.formats" should be an Array received ${typeof images.formats}.`
);
}
if (images.formats.length < 1 || images.formats.length > 2) {
throw new Error(
`vercel.json "images.formats" must be length 1 or 2, received length (${images.formats.length}).`
);
}
const invalid = images.formats.filter(f => {
return f !== 'image/avif' && f !== 'image/webp';
});
if (invalid.length > 0) {
throw new Error(
`vercel.json "images.formats" should be an Array of mime type strings, received invalid values (${invalid.join(
', '
)}).`
);
}
}
if (
typeof images.dangerouslyAllowSVG !== 'undefined' &&
typeof images.dangerouslyAllowSVG !== 'boolean'
) {
throw new Error(
`vercel.json "images.dangerouslyAllowSVG" should be a boolean received (${images.dangerouslyAllowSVG}).`
);
}
if (
typeof images.contentSecurityPolicy !== 'undefined' &&
typeof images.contentSecurityPolicy !== 'string'
) {
throw new Error(
`vercel.json "images.contentSecurityPolicy" should be a string received ${images.contentSecurityPolicy}`
);
}
}
const mergedImages = mergeImages(images, buildResults.values());
const mergedImages = mergeImages(localConfig.images, buildResults.values());
const mergedWildcard = mergeWildcard(buildResults.values());
const mergedOverrides: Record<string, PathOverride> =
overrides.length > 0 ? Object.assign({}, ...overrides) : undefined;
@@ -724,8 +599,6 @@ async function doBuild(
emoji('success')
)}\n`
);
return 0;
}
function expandBuild(files: string[], build: Builder): Builder[] {

View File

@@ -23,6 +23,7 @@ import type {
} from '../types';
import { sharedPromise } from './promise';
import { APIError } from './errors-ts';
import { normalizeError } from './is-error';
const isSAMLError = (v: any): v is SAMLError => {
return v && v.saml;
@@ -146,10 +147,15 @@ export default class Client extends EventEmitter implements Stdio {
const error = await responseError(res);
if (isSAMLError(error)) {
// A SAML error means the token is expired, or is not
// designated for the requested team, so the user needs
// to re-authenticate
await this.reauthenticate(error);
try {
// A SAML error means the token is expired, or is not
// designated for the requested team, so the user needs
// to re-authenticate
await this.reauthenticate(error);
} catch (reauthError) {
// there's no sense in retrying
return bail(normalizeError(reauthError));
}
} else if (res.status >= 400 && res.status < 500) {
// Any other 4xx should bail without retrying
return bail(error);
@@ -186,7 +192,7 @@ export default class Client extends EventEmitter implements Stdio {
`Failed to re-authenticate for ${bold(error.scope)} scope`
);
}
process.exit(1);
throw error;
}
this.authConfig.token = result.token;

View File

@@ -115,29 +115,39 @@ export default async function processDeployment({
.reduce((a: number, b: number) => a + b, 0);
const totalSizeHuman = bytes.format(missingSize, { decimalPlaces: 1 });
uploads.forEach((e: any) =>
e.on('progress', () => {
const uploadedBytes = uploads.reduce((acc: number, e: any) => {
return acc + e.bytesUploaded;
}, 0);
// When stderr is not a TTY then we only want to
// print upload progress in 25% increments
let nextStep = 0;
const stepSize = now._client.stderr.isTTY ? 0 : 0.25;
const bar = progress(uploadedBytes, missingSize);
if (!bar || uploadedBytes === missingSize) {
output.spinner(deployingSpinnerVal, 0);
} else {
const uploadedHuman = bytes.format(uploadedBytes, {
decimalPlaces: 1,
fixedDecimals: true,
});
const updateProgress = () => {
const uploadedBytes = uploads.reduce((acc: number, e: any) => {
return acc + e.bytesUploaded;
}, 0);
const bar = progress(uploadedBytes, missingSize);
if (!bar) {
output.spinner(deployingSpinnerVal, 0);
} else {
const uploadedHuman = bytes.format(uploadedBytes, {
decimalPlaces: 1,
fixedDecimals: true,
});
const percent = uploadedBytes / missingSize;
if (percent >= nextStep) {
output.spinner(
`Uploading ${chalk.reset(
`[${bar}] (${uploadedHuman}/${totalSizeHuman})`
)}`,
0
);
nextStep += stepSize;
}
})
);
}
};
uploads.forEach((e: any) => e.on('progress', updateProgress));
updateProgress();
}
if (event.type === 'file-uploaded') {

View File

@@ -57,7 +57,7 @@ import { MissingDotenvVarsError } from '../errors-ts';
import cliPkg from '../pkg';
import { getVercelDirectory } from '../projects/link';
import { staticFiles as getFiles } from '../get-files';
import { validateConfig } from './validate';
import { validateConfig } from '../validate-config';
import { devRouter, getRoutesTypes } from './router';
import getMimeType from './mime-type';
import { executeBuild, getBuildMatches, shutdownBuilder } from './builder';

View File

@@ -4,6 +4,8 @@ import wait, { StopSpinner } from './wait';
import type { WritableTTY } from '../../types';
import { errorToString } from '../is-error';
const IS_TEST = process.env.NODE_ENV === 'test';
export interface OutputOptions {
debug?: boolean;
}
@@ -108,12 +110,15 @@ export class Output {
};
spinner = (message: string, delay: number = 300): void => {
this.spinnerMessage = message;
if (this.debugEnabled) {
this.debug(`Spinner invoked (${message}) with a ${delay}ms delay`);
return;
}
if (this.stream.isTTY) {
if (IS_TEST || !this.stream.isTTY) {
this.print(`${message}\n`);
} else {
this.spinnerMessage = message;
if (this._spinner) {
this._spinner.text = message;
} else {
@@ -125,8 +130,6 @@ export class Output {
delay
);
}
} else {
this.print(`${message}\n`);
}
};

View File

@@ -7,7 +7,7 @@ import {
rewritesSchema,
trailingSlashSchema,
} from '@vercel/routing-utils';
import { VercelConfig } from './types';
import { VercelConfig } from './dev/types';
import {
functionsSchema,
buildsSchema,
@@ -16,6 +16,83 @@ import {
} from '@vercel/build-utils';
import { fileNameSymbol } from '@vercel/client';
const imagesSchema = {
type: 'object',
additionalProperties: false,
required: ['sizes'],
properties: {
contentSecurityPolicy: {
type: 'string',
minLength: 1,
maxLength: 256,
},
dangerouslyAllowSVG: {
type: 'boolean',
},
domains: {
type: 'array',
minItems: 0,
maxItems: 50,
items: {
type: 'string',
minLength: 1,
maxLength: 256,
},
},
formats: {
type: 'array',
minItems: 1,
maxItems: 4,
items: {
enum: ['image/avif', 'image/webp', 'image/jpeg', 'image/png'],
},
},
minimumCacheTTL: {
type: 'integer',
minimum: 1,
maximum: 315360000,
},
remotePatterns: {
type: 'array',
minItems: 0,
maxItems: 50,
items: {
type: 'object',
additionalProperties: false,
required: ['hostname'],
properties: {
protocol: {
enum: ['http', 'https'],
},
hostname: {
type: 'string',
minLength: 1,
maxLength: 256,
},
port: {
type: 'string',
minLength: 1,
maxLength: 5,
},
pathname: {
type: 'string',
minLength: 1,
maxLength: 256,
},
},
},
},
sizes: {
type: 'array',
minItems: 1,
maxItems: 50,
items: {
type: 'number',
},
},
},
};
const vercelConfigSchema = {
type: 'object',
// These are not all possibilities because `vc dev`
@@ -30,6 +107,7 @@ const vercelConfigSchema = {
rewrites: rewritesSchema,
trailingSlash: trailingSlashSchema,
functions: functionsSchema,
images: imagesSchema,
},
};

View File

@@ -0,0 +1,2 @@
.vercel/builders
.vercel/output

View File

@@ -0,0 +1,7 @@
{
"orgId": ".",
"projectId": ".",
"settings": {
"framework": null
}
}

View File

@@ -0,0 +1 @@
<?php echo 'This version of vercel-php uses the nodejs12.x Lambda Runtime'; ?>

View File

@@ -0,0 +1,8 @@
{
"$schema": "https://openapi.vercel.sh/vercel.json",
"functions": {
"api/index.php": {
"runtime": "vercel-php@0.1.0"
}
}
}

View File

@@ -3,6 +3,6 @@
"sizes": [256, 384, 600, 1000],
"domains": [],
"minimumCacheTTL": 60,
"formats": ["image/webp", "image/avif"]
"formats": ["image/avif", "image/webp"]
}
}

View File

@@ -0,0 +1,7 @@
{
"orgId": ".",
"projectId": ".",
"settings": {
"framework": null
}
}

View File

@@ -0,0 +1 @@
<h1>Vercel</h1>

View File

@@ -0,0 +1,16 @@
{
"rewrites": [
{
"source": "/one",
"destination": "/api/one"
},
{
"source": "/two",
"destination": "/api/two"
},
{
"src": "/three",
"dest": "/api/three"
}
]
}

View File

@@ -326,7 +326,7 @@ module.exports = async function prepare(session, binaryPath, tmpFixturesDir) {
'vercel.json': JSON.stringify({
functions: {
'api/**/*.php': {
runtime: 'vercel-php@0.1.0',
runtime: 'vercel-php@0.5.2',
},
},
}),

View File

@@ -3,20 +3,26 @@ import chance from 'chance';
import { Deployment } from '@vercel/client';
import { client } from './client';
import { Build, User } from '../../src/types';
import type { Request, Response } from 'express';
let deployments = new Map<string, Deployment>();
let deploymentBuilds = new Map<Deployment, Build[]>();
let alreadySetupDeplomentEndpoints = false;
type State = Deployment['readyState'];
export function useDeployment({
creator,
state = 'READY',
createdAt,
}: {
creator: Pick<User, 'id' | 'email' | 'name' | 'username'>;
state?: State;
createdAt?: number;
}) {
const createdAt = Date.now();
setupDeploymentEndpoints();
createdAt = createdAt || Date.now();
const url = new URL(chance().url());
const name = chance().name();
const id = `dpl_${chance().guid()}`;
@@ -99,6 +105,15 @@ export function useDeploymentMissingProjectSettings() {
beforeEach(() => {
deployments = new Map();
deploymentBuilds = new Map();
alreadySetupDeplomentEndpoints = false;
});
function setupDeploymentEndpoints() {
if (alreadySetupDeplomentEndpoints) {
return;
}
alreadySetupDeplomentEndpoints = true;
client.scenario.get('/:version/deployments/:id', (req, res) => {
const { id } = req.params;
@@ -136,8 +151,21 @@ beforeEach(() => {
res.json({ builds });
});
client.scenario.get('/:version/now/deployments', (req, res) => {
const deploymentsList = Array.from(deployments.values());
res.json({ deployments: deploymentsList });
});
});
function handleGetDeployments(req: Request, res: Response) {
const currentDeployments = Array.from(deployments.values()).sort(
(a: Deployment, b: Deployment) => {
// sort in reverse chronological order
return b.createdAt - a.createdAt;
}
);
res.json({
pagination: {
count: currentDeployments.length,
},
deployments: currentDeployments,
});
}
client.scenario.get('/:version/now/deployments', handleGetDeployments);
client.scenario.get('/:version/deployments', handleGetDeployments);
}

View File

@@ -0,0 +1,55 @@
import { client } from '../../mocks/client';
import { useUser } from '../../mocks/user';
import bisect from '../../../src/commands/bisect';
import { useDeployment } from '../../mocks/deployment';
describe('bisect', () => {
it('should find the bad deployment', async () => {
const user = useUser();
const now = Date.now();
const deployment1 = useDeployment({ creator: user, createdAt: now });
const deployment2 = useDeployment({
creator: user,
createdAt: now + 10000,
});
const deployment3 = useDeployment({
creator: user,
createdAt: now + 20000,
});
// also create an extra deployment before the known good deployment
// to make sure the bisect pool doesn't include it
useDeployment({
creator: user,
createdAt: now - 30000,
});
const bisectPromise = bisect(client);
await expect(client.stderr).toOutput('Specify a URL where the bug occurs:');
client.stdin.write(`https://${deployment3.url}\n`);
await expect(client.stderr).toOutput(
'Specify a URL where the bug does not occur:'
);
client.stdin.write(`https://${deployment1.url}\n`);
await expect(client.stderr).toOutput(
'Specify the URL subpath where the bug occurs:'
);
client.stdin.write('/docs\n');
await expect(client.stderr).toOutput('Bisecting');
await expect(client.stderr).toOutput(
`Deployment URL: https://${deployment2.url}`
);
client.stdin.write('b\n');
await expect(client.stderr).toOutput(
`The first bad deployment is: https://${deployment2.url}`
);
await expect(bisectPromise).resolves.toEqual(0);
});
});

View File

@@ -750,11 +750,71 @@ describe('build', () => {
const errorBuilds = builds.builds.filter((b: any) => 'error' in b);
expect(errorBuilds).toHaveLength(1);
expect(errorBuilds[0].error.name).toEqual('Error');
expect(errorBuilds[0].error.message).toMatch(`TS1005`);
expect(errorBuilds[0].error.message).toMatch(`',' expected.`);
expect(errorBuilds[0].error.hideStackTrace).toEqual(true);
expect(errorBuilds[0].error.code).toEqual('NODE_TYPESCRIPT_ERROR');
expect(errorBuilds[0].error).toEqual({
name: 'Error',
message: expect.stringContaining('TS1005'),
stack: expect.stringContaining('api/typescript.ts'),
hideStackTrace: true,
code: 'NODE_TYPESCRIPT_ERROR',
});
// top level "error" also contains the same error
expect(builds.error).toEqual({
name: 'Error',
message: expect.stringContaining('TS1005'),
stack: expect.stringContaining('api/typescript.ts'),
hideStackTrace: true,
code: 'NODE_TYPESCRIPT_ERROR',
});
// `config.json` contains `version`
const configJson = await fs.readJSON(join(output, 'config.json'));
expect(configJson.version).toBe(3);
} finally {
process.chdir(originalCwd);
delete process.env.__VERCEL_BUILD_RUNNING;
}
});
it('should error when "functions" has runtime that emits discontinued "nodejs12.x"', async () => {
if (process.platform === 'win32') {
console.log('Skipping test on Windows');
return;
}
const cwd = fixture('discontinued-nodejs12.x');
const output = join(cwd, '.vercel/output');
try {
process.chdir(cwd);
const exitCode = await build(client);
expect(exitCode).toEqual(1);
// Error gets printed to the terminal
await expect(client.stderr).toOutput(
'The Runtime "vercel-php@0.1.0" is using "nodejs12.x", which is discontinued. Please upgrade your Runtime to a more recent version or consult the author for more details.'
);
// `builds.json` contains "error" build
const builds = await fs.readJSON(join(output, 'builds.json'));
const errorBuilds = builds.builds.filter((b: any) => 'error' in b);
expect(errorBuilds).toHaveLength(1);
expect(errorBuilds[0].error).toEqual({
name: 'Error',
message: expect.stringContaining('Please upgrade your Runtime'),
stack: expect.stringContaining('Please upgrade your Runtime'),
hideStackTrace: true,
code: 'NODEJS_DISCONTINUED_VERSION',
link: 'https://github.com/vercel/vercel/blob/main/DEVELOPING_A_RUNTIME.md#lambdaruntime',
});
// top level "error" also contains the same error
expect(builds.error).toEqual({
name: 'Error',
message: expect.stringContaining('Please upgrade your Runtime'),
stack: expect.stringContaining('Please upgrade your Runtime'),
hideStackTrace: true,
code: 'NODEJS_DISCONTINUED_VERSION',
link: 'https://github.com/vercel/vercel/blob/main/DEVELOPING_A_RUNTIME.md#lambdaruntime',
});
// `config.json` contains `version`
const configJson = await fs.readJSON(join(output, 'config.json'));
@@ -920,7 +980,7 @@ describe('build', () => {
delete process.env.__VERCEL_BUILD_RUNNING;
}
});
it('should apply "images" configuration from `vercel.json`', async () => {
const cwd = fixture('images');
const output = join(cwd, '.vercel/output');
@@ -936,7 +996,7 @@ describe('build', () => {
sizes: [256, 384, 600, 1000],
domains: [],
minimumCacheTTL: 60,
formats: ['image/webp', 'image/avif'],
formats: ['image/avif', 'image/webp'],
},
});
} finally {
@@ -945,6 +1005,38 @@ describe('build', () => {
}
});
it('should fail with invalid "rewrites" configuration from `vercel.json`', async () => {
const cwd = fixture('invalid-rewrites');
const output = join(cwd, '.vercel/output');
try {
process.chdir(cwd);
const exitCode = await build(client);
expect(exitCode).toEqual(1);
await expect(client.stderr).toOutput(
'Error: Invalid vercel.json - `rewrites[2]` should NOT have additional property `src`. Did you mean `source`?' +
'\n' +
'View Documentation: https://vercel.com/docs/configuration#project/rewrites'
);
const builds = await fs.readJSON(join(output, 'builds.json'));
expect(builds.builds).toBeUndefined();
expect(builds.error).toEqual({
name: 'Error',
message:
'Invalid vercel.json - `rewrites[2]` should NOT have additional property `src`. Did you mean `source`?',
stack: expect.stringContaining('at validateConfig'),
hideStackTrace: true,
code: 'INVALID_VERCEL_CONFIG',
link: 'https://vercel.com/docs/configuration#project/rewrites',
action: 'View Documentation',
});
const configJson = await fs.readJSON(join(output, 'config.json'));
expect(configJson.version).toBe(3);
} finally {
process.chdir(originalCwd);
delete process.env.__VERCEL_BUILD_RUNNING;
}
});
describe('should find packages with different main/module/browser keys', function () {
let output: string;

View File

@@ -1,4 +1,7 @@
import bytes from 'bytes';
import fs from 'fs-extra';
import { join } from 'path';
import { randomBytes } from 'crypto';
import { fileNameSymbol } from '@vercel/client';
import { client } from '../../mocks/client';
import deploy from '../../../src/commands/deploy';
@@ -199,4 +202,119 @@ describe('deploy', () => {
process.chdir(originalCwd);
}
});
it('should upload missing files', async () => {
const cwd = setupFixture('commands/deploy/archive');
const originalCwd = process.cwd();
// Add random 1mb file
await fs.writeFile(join(cwd, 'data'), randomBytes(bytes('1mb')));
try {
process.chdir(cwd);
const user = useUser();
useTeams('team_dummy');
useProject({
...defaultProject,
name: 'archive',
id: 'archive',
});
let body: any;
let fileUploaded = false;
client.scenario.post(`/v13/deployments`, (req, res) => {
if (fileUploaded) {
body = req.body;
res.json({
creator: {
uid: user.id,
username: user.username,
},
id: 'dpl_archive_test',
});
} else {
const sha = req.body.files[0].sha;
res.status(400).json({
error: {
code: 'missing_files',
message: 'Missing files',
missing: [sha],
},
});
}
});
client.scenario.post('/v2/files', (req, res) => {
// Wait for file to be finished uploading
req.on('data', () => {
// Noop
});
req.on('end', () => {
fileUploaded = true;
res.end();
});
});
client.scenario.get(`/v13/deployments/dpl_archive_test`, (req, res) => {
res.json({
creator: {
uid: user.id,
username: user.username,
},
id: 'dpl_archive_test',
readyState: 'READY',
aliasAssigned: true,
alias: [],
});
});
client.scenario.get(
`/v10/now/deployments/dpl_archive_test`,
(req, res) => {
res.json({
creator: {
uid: user.id,
username: user.username,
},
id: 'dpl_archive_test',
readyState: 'READY',
aliasAssigned: true,
alias: [],
});
}
);
// When stderr is not a TTY we expect 5 progress lines to be printed
client.stderr.isTTY = false;
client.setArgv('deploy', '--archive=tgz');
const uploadingLines: string[] = [];
client.stderr.on('data', data => {
if (data.startsWith('Uploading [')) {
uploadingLines.push(data);
}
});
client.stderr.resume();
const exitCode = await deploy(client);
expect(exitCode).toEqual(0);
expect(body?.files?.length).toEqual(1);
expect(body?.files?.[0].file).toEqual('.vercel/source.tgz');
expect(uploadingLines.length).toEqual(5);
expect(
uploadingLines[0].startsWith('Uploading [--------------------]')
).toEqual(true);
expect(
uploadingLines[1].startsWith('Uploading [=====---------------]')
).toEqual(true);
expect(
uploadingLines[2].startsWith('Uploading [==========----------]')
).toEqual(true);
expect(
uploadingLines[3].startsWith('Uploading [===============-----]')
).toEqual(true);
expect(
uploadingLines[4].startsWith('Uploading [====================]')
).toEqual(true);
} finally {
process.chdir(originalCwd);
}
});
});

@@ -38,11 +38,11 @@ describe('list', () => {
await list(client);
const output = await readOutputStream(client, 4);
const output = await readOutputStream(client, 6);
const { org } = pluckIdentifiersFromDeploymentList(output.split('\n')[0]);
const header: string[] = parseSpacedTableRow(output.split('\n')[3]);
const data: string[] = parseSpacedTableRow(output.split('\n')[4]);
const { org } = pluckIdentifiersFromDeploymentList(output.split('\n')[2]);
const header: string[] = parseSpacedTableRow(output.split('\n')[5]);
const data: string[] = parseSpacedTableRow(output.split('\n')[6]);
data.shift();
expect(org).toEqual(team[0].slug);
@@ -81,11 +81,11 @@ describe('list', () => {
client.setArgv('-S', user.username);
await list(client);
const output = await readOutputStream(client, 4);
const output = await readOutputStream(client, 6);
const { org } = pluckIdentifiersFromDeploymentList(output.split('\n')[0]);
const header: string[] = parseSpacedTableRow(output.split('\n')[3]);
const data: string[] = parseSpacedTableRow(output.split('\n')[4]);
const { org } = pluckIdentifiersFromDeploymentList(output.split('\n')[2]);
const header: string[] = parseSpacedTableRow(output.split('\n')[5]);
const data: string[] = parseSpacedTableRow(output.split('\n')[6]);
data.shift();
expect(org).toEqual(user.username);
@@ -116,11 +116,11 @@ describe('list', () => {
client.setArgv(deployment.name);
await list(client);
const output = await readOutputStream(client, 4);
const output = await readOutputStream(client, 6);
const { org } = pluckIdentifiersFromDeploymentList(output.split('\n')[0]);
const header: string[] = parseSpacedTableRow(output.split('\n')[3]);
const data: string[] = parseSpacedTableRow(output.split('\n')[4]);
const { org } = pluckIdentifiersFromDeploymentList(output.split('\n')[2]);
const header: string[] = parseSpacedTableRow(output.split('\n')[5]);
const data: string[] = parseSpacedTableRow(output.split('\n')[6]);
data.shift();
expect(org).toEqual(teamSlug || team[0].slug);

@@ -22,10 +22,10 @@ describe('project', () => {
client.setArgv('project', 'ls');
await projects(client);
const output = await readOutputStream(client, 2);
const { org } = pluckIdentifiersFromDeploymentList(output.split('\n')[0]);
const header: string[] = parseSpacedTableRow(output.split('\n')[2]);
const data: string[] = parseSpacedTableRow(output.split('\n')[3]);
const output = await readOutputStream(client, 3);
const { org } = pluckIdentifiersFromDeploymentList(output.split('\n')[1]);
const header: string[] = parseSpacedTableRow(output.split('\n')[3]);
const data: string[] = parseSpacedTableRow(output.split('\n')[4]);
data.pop();
expect(org).toEqual(user.username);
@@ -47,10 +47,10 @@ describe('project', () => {
client.setArgv('project', 'ls');
await projects(client);
const output = await readOutputStream(client, 2);
const { org } = pluckIdentifiersFromDeploymentList(output.split('\n')[0]);
const header: string[] = parseSpacedTableRow(output.split('\n')[2]);
const data: string[] = parseSpacedTableRow(output.split('\n')[3]);
const output = await readOutputStream(client, 3);
const { org } = pluckIdentifiersFromDeploymentList(output.split('\n')[1]);
const header: string[] = parseSpacedTableRow(output.split('\n')[3]);
const data: string[] = parseSpacedTableRow(output.split('\n')[4]);
data.pop();
expect(org).toEqual(user.username);

@@ -1,4 +1,4 @@
import { validateConfig } from '../../../../src/util/dev/validate';
import { validateConfig } from '../../../../src/util/validate-config';
describe('validateConfig', () => {
it('should not error with empty config', async () => {

@@ -1,6 +1,6 @@
{
"name": "@vercel/client",
"version": "12.2.7",
"version": "12.2.12",
"main": "dist/index.js",
"typings": "dist/index.d.ts",
"homepage": "https://vercel.com",
@@ -43,7 +43,7 @@
]
},
"dependencies": {
"@vercel/build-utils": "5.5.0",
"@vercel/build-utils": "5.5.4",
"@vercel/routing-utils": "2.0.2",
"@zeit/fetch": "5.2.0",
"async-retry": "1.2.3",

@@ -1,4 +1,5 @@
import { Agent } from 'https';
import http from 'http';
import https from 'https';
import { Readable } from 'stream';
import { EventEmitter } from 'events';
import retry from 'async-retry';
@@ -78,7 +79,9 @@ export async function* upload(
debug('Building an upload list...');
const semaphore = new Sema(50, { capacity: 50 });
const agent = new Agent({ keepAlive: true });
const agent = apiUrl?.startsWith('https://')
? new https.Agent({ keepAlive: true })
: new http.Agent({ keepAlive: true });
shas.forEach((sha, index) => {
const uploadProgress = uploads[index];

@@ -1,6 +1,6 @@
{
"name": "@vercel/fs-detectors",
"version": "3.4.0",
"version": "3.4.1",
"description": "Vercel filesystem detectors",
"main": "./dist/index.js",
"types": "./dist/index.d.ts",
@@ -23,6 +23,7 @@
"@vercel/routing-utils": "2.0.2",
"glob": "8.0.3",
"js-yaml": "4.1.0",
"json5": "2.2.1",
"minimatch": "3.0.4",
"semver": "6.1.1"
},

@@ -1,6 +1,7 @@
import _path from 'path';
import yaml from 'js-yaml';
import glob from 'glob';
import json5 from 'json5';
import { DetectorFilesystem } from '../detectors/filesystem';
import { Workspace } from './get-workspaces';
import { getGlobFs } from './get-glob-fs';
@@ -144,7 +145,7 @@ async function getRushWorkspacePackagePaths({
}: GetPackagePathOptions): Promise<string[]> {
const rushWorkspaceAsBuffer = await fs.readFile('rush.json');
const { projects = [] } = JSON.parse(
const { projects = [] } = json5.parse(
rushWorkspaceAsBuffer.toString()
) as RushWorkspaces;

@@ -2,7 +2,6 @@
"$schema": "https://developer.microsoft.com/json-schemas/rush/v5/rush.schema.json",
"rushVersion": "5.76.1",
"pnpmVersion": "6.7.1",
"pnpmOptions": {
@@ -22,6 +21,7 @@
"postRushBuild": []
},
// comment
"variants": [],
"projects": [
{

@@ -12,7 +12,9 @@
"nodeSupportedVersionRange": ">=12.13.0 <13.0.0 || >=14.15.0 <15.0.0 || >=16.13.0 <17.0.0",
"gitPolicy": {},
/*
this is a comment
*/
"repository": {},
"eventHooks": {
"preRushInstall": [],

@@ -46,7 +46,6 @@ interface Analyzed {
found?: boolean;
packageName: string;
functionName: string;
watch: string[];
}
interface PortInfo {
@@ -498,18 +497,8 @@ export async function build({
environment: {},
});
const watch = parsedAnalyzed.watch;
let watchSub: string[] = [];
// if `entrypoint` located in subdirectory
// we will need to concat it with return watch array
if (entrypointArr.length > 1) {
entrypointArr.pop();
watchSub = parsedAnalyzed.watch.map(file => join(...entrypointArr, file));
}
return {
output: lambda,
watch: watch.concat(watchSub),
};
} catch (error) {
debug('Go Builder Error: ' + error);

@@ -1,6 +1,6 @@
{
"name": "@vercel/go",
"version": "2.2.8",
"version": "2.2.12",
"license": "MIT",
"main": "./dist/index",
"homepage": "https://vercel.com/docs/runtimes#official-runtimes/go",
@@ -35,7 +35,7 @@
"@types/jest": "28.1.6",
"@types/node-fetch": "^2.3.0",
"@types/tar": "^4.0.0",
"@vercel/build-utils": "5.5.0",
"@vercel/build-utils": "5.5.4",
"@vercel/ncc": "0.24.0",
"async-retry": "1.3.1",
"execa": "^1.0.0",

@@ -0,0 +1,10 @@
package handler
import (
"fmt"
"net/http"
)
func Handler2(w http.ResponseWriter, r *http.Request) {
fmt.Fprintf(w, "from one.go")
}

@@ -0,0 +1,10 @@
package handler
import (
"fmt"
"net/http"
)
func Handler(w http.ResponseWriter, r *http.Request) {
fmt.Fprintf(w, "from two.go")
}

@@ -0,0 +1,3 @@
module handler
go 1.16

@@ -0,0 +1,12 @@
{
"probes": [
{
"path": "/api/one",
"mustContain": "from one.go"
},
{
"path": "/api/two",
"mustContain": "from two.go"
}
]
}

@@ -2,7 +2,6 @@ package main
import (
"encoding/json"
"flag"
"fmt"
"go/ast"
"go/parser"
@@ -104,84 +103,6 @@ func main() {
if err != nil {
log.Fatal(err)
}
se := string(rf)
var files []string
var relatedFiles []string
// Add entrypoint to watchlist
relFileName, err := filepath.Rel(filepath.Dir(fileName), fileName)
if err != nil {
log.Fatal(err)
}
relatedFiles = append(relatedFiles, relFileName)
// looking for all go files that have export func
// using in entrypoint
err = filepath.Walk(filepath.Dir(fileName), visit(&files))
if err != nil {
log.Fatal(err)
}
// looking related packages
var modPath string
flag.StringVar(&modPath, "modpath", "", "module path")
flag.Parse()
if len(modPath) > 1 {
err = filepath.Walk(modPath, visit(&files))
if err != nil {
log.Fatal(err)
}
}
for _, file := range files {
absFileName, _ := filepath.Abs(fileName)
absFile, _ := filepath.Abs(file)
// if it isn't entrypoint
if absFileName != absFile {
// find all export structs and functions
pf := parse(file)
var exportedDecl []string
ast.Inspect(pf, func(n ast.Node) bool {
switch t := n.(type) {
case *ast.FuncDecl:
if t.Name.IsExported() {
exportedDecl = append(exportedDecl, t.Name.Name)
}
// find variable declarations
case *ast.TypeSpec:
// which are public
if t.Name.IsExported() {
switch t.Type.(type) {
// and are interfaces
case *ast.StructType:
exportedDecl = append(exportedDecl, t.Name.Name)
}
}
}
return true
})
for _, ed := range exportedDecl {
if strings.Contains(se, ed) {
// find relative path of related file
var basePath string
if modPath == "" {
basePath = filepath.Dir(fileName)
} else {
basePath = modPath
}
rel, err := filepath.Rel(basePath, file)
if err != nil {
log.Fatal(err)
}
relatedFiles = append(relatedFiles, rel)
}
}
}
}
parsed := parse(fileName)
offset := parsed.Pos()
@@ -207,7 +128,6 @@ func main() {
analyzed := analyze{
PackageName: parsed.Name.Name,
FuncName: fn.Name.Name,
Watch: unique(relatedFiles),
}
analyzedJSON, _ := json.Marshal(analyzed)
fmt.Print(string(analyzedJSON))
@@ -229,7 +149,6 @@ func main() {
analyzed := analyze{
PackageName: parsed.Name.Name,
FuncName: fn.Name.Name,
Watch: unique(relatedFiles),
}
analyzedJSON, _ := json.Marshal(analyzed)
fmt.Print(string(analyzedJSON))

@@ -1,6 +1,6 @@
{
"name": "@vercel/hydrogen",
"version": "0.0.21",
"version": "0.0.25",
"license": "MIT",
"main": "./dist/index.js",
"homepage": "https://vercel.com/docs",
@@ -21,7 +21,7 @@
"devDependencies": {
"@types/jest": "27.5.1",
"@types/node": "*",
"@vercel/build-utils": "5.5.0",
"@vercel/build-utils": "5.5.4",
"typescript": "4.6.4"
}
}

@@ -1,6 +1,6 @@
{
"name": "@vercel/next",
"version": "3.1.29",
"version": "3.2.3",
"license": "MIT",
"main": "./dist/index",
"homepage": "https://vercel.com/docs/runtimes#official-runtimes/next-js",
@@ -44,7 +44,7 @@
"@types/semver": "6.0.0",
"@types/text-table": "0.2.1",
"@types/webpack-sources": "3.2.0",
"@vercel/build-utils": "5.5.0",
"@vercel/build-utils": "5.5.4",
"@vercel/nft": "0.22.1",
"@vercel/routing-utils": "2.0.2",
"async-sema": "3.0.1",

@@ -633,7 +633,7 @@ export async function serverBuild({
const curPagesDir = isAppPath && appDir ? appDir : pagesDir;
const pageDir = path.dirname(path.join(curPagesDir, originalPagePath));
const normalizedBaseDir = `${baseDir}${
baseDir.endsWith('/') ? '' : '/'
baseDir.endsWith(path.sep) ? '' : path.sep
}`;
files.forEach((file: string) => {
const absolutePath = path.join(pageDir, file);
@@ -997,7 +997,7 @@ export async function serverBuild({
const isLastRoute = i === prerenderManifest.notFoundRoutes.length - 1;
if (prerenderManifest.staticRoutes[route]?.initialRevalidate === false) {
if (currentRouteSrc.length + route.length + 1 >= 4096) {
if (currentRouteSrc.length + route.length + 1 >= 4000) {
pushRoute(currentRouteSrc);
currentRouteSrc = starterRouteSrc;
}
@@ -1007,7 +1007,7 @@ export async function serverBuild({
currentRouteSrc.length - 1
)}${
currentRouteSrc[currentRouteSrc.length - 2] === '(' ? '' : '|'
}${route})`;
}${route}/?)`;
if (isLastRoute) {
pushRoute(currentRouteSrc);
@@ -1134,7 +1134,7 @@ export async function serverBuild({
if (appPathRoutesManifest) {
// create .rsc variant for app lambdas and edge functions
// to match prerenders so we can route the same when the
// __flight__ header is present
// __rsc__ header is present
const edgeFunctions = middleware.edgeFunctions;
for (let route of Object.values(appPathRoutesManifest)) {
@@ -1343,6 +1343,12 @@ export async function serverBuild({
.join('|')})?[/]?404/?`,
status: 404,
continue: true,
missing: [
{
type: 'header',
key: 'x-prerender-revalidate',
},
],
},
]
: [
@@ -1350,6 +1356,12 @@ export async function serverBuild({
src: path.posix.join('/', entryDirectory, '404/?'),
status: 404,
continue: true,
missing: [
{
type: 'header',
key: 'x-prerender-revalidate',
},
],
},
]),
@@ -1393,7 +1405,7 @@ export async function serverBuild({
has: [
{
type: 'header',
key: '__flight__',
key: '__rsc__',
},
],
dest: path.posix.join('/', entryDirectory, '/$1.rsc'),

@@ -1698,7 +1698,6 @@ export const onPrerenderRoute =
const {
appDir,
pagesDir,
hasPages404,
static404Page,
entryDirectory,
prerenderManifest,
@@ -1896,11 +1895,12 @@ export const onPrerenderRoute =
});
}
// If revalidate isn't enabled we force the /404 route to be static
// to match next start behavior otherwise getStaticProps would be
// recalled for each 404 URL path since Prerender is cached based
// on the URL path
if (!canUsePreviewMode || (hasPages404 && routeKey === '/404')) {
// if preview mode/On-Demand ISR can't be leveraged
// we can output pure static outputs instead of prerenders
if (
!canUsePreviewMode ||
(routeKey === '/404' && !lambdas[outputPathPage])
) {
htmlFsRef.contentType = htmlContentType;
prerenders[outputPathPage] = htmlFsRef;
prerenders[outputPathData] = jsonFsRef;
@@ -2202,6 +2202,7 @@ interface BaseEdgeFunctionInfo {
page: string;
wasm?: { filePath: string; name: string }[];
assets?: { filePath: string; name: string }[];
regions?: 'auto' | string[] | 'all' | 'default';
}
interface EdgeFunctionInfoV1 extends BaseEdgeFunctionInfo {
@@ -2341,6 +2342,7 @@ export async function getMiddlewareBundle({
...wasmFiles,
...assetFiles,
},
regions: edgeFunction.regions,
entrypoint: 'index.js',
envVarsInUse: edgeFunction.env,
assets: (edgeFunction.assets ?? []).map(({ name }) => {

@@ -15,7 +15,7 @@
"path": "/dashboard",
"status": 200,
"headers": {
"__flight__": "1"
"__rsc__": "1"
},
"mustContain": "M1:{",
"mustNotContain": "<html"

@@ -1,8 +1,12 @@
/* eslint-env jest */
const path = require('path');
const { deployAndTest } = require('../../utils');
const ctx = {};
describe(`${__dirname.split(path.sep).pop()}`, () => {
it('should deploy and pass probe checks', async () => {
await deployAndTest(__dirname);
const info = await deployAndTest(__dirname);
Object.assign(ctx, info);
});
});

@@ -0,0 +1,7 @@
{
"dependencies": {
"next": "9.5.5",
"react": "17.0.2",
"react-dom": "17.0.2"
}
}

@@ -0,0 +1,11 @@
export default function Page() {
return <p>custom 404</p>;
}
export function getStaticProps() {
return {
props: {
is404: true,
},
};
}

@@ -0,0 +1,3 @@
export default function handler(req, res) {
return res.json({ hello: 'world' });
}

@@ -0,0 +1,3 @@
export default function Page() {
return <p>index page</p>;
}

@@ -0,0 +1,14 @@
{
"probes": [
{
"path": "/",
"status": 200,
"mustContain": "index page"
},
{
"path": "/non-existent",
"status": 404,
"mustContain": "custom 404"
}
]
}

@@ -1,7 +1,7 @@
/* eslint-env jest */
const path = require('path');
const cheerio = require('cheerio');
const { deployAndTest, check, waitFor } = require('../../utils');
const { deployAndTest, check } = require('../../utils');
const fetch = require('../../../../../test/lib/deployment/fetch-retry');
async function checkForChange(url, initialValue, getNewValue) {
@@ -141,4 +141,30 @@ describe(`${__dirname.split(path.sep).pop()}`, () => {
expect(preRevalidateRandom).toBeDefined();
expect(preRevalidateRandomData).toBeDefined();
});
it('should revalidate 404 page itself correctly', async () => {
const initial404 = await fetch(`${ctx.deploymentUrl}/404`);
const initial404Html = await initial404.text();
const initial404Props = JSON.parse(
cheerio.load(initial404Html)('#props').text()
);
expect(initial404.status).toBe(404);
expect(initial404Props.is404).toBe(true);
const revalidateRes = await fetch(
`${ctx.deploymentUrl}/api/revalidate?urlPath=/404`
);
expect(revalidateRes.status).toBe(200);
expect(await revalidateRes.json()).toEqual({ revalidated: true });
await check(async () => {
const res = await fetch(`${ctx.deploymentUrl}/404`);
const resHtml = await res.text();
const resProps = JSON.parse(cheerio.load(resHtml)('#props').text());
expect(res.status).toBe(404);
expect(resProps.is404).toBe(true);
expect(resProps.time).not.toEqual(initial404Props.time);
return 'success';
}, 'success');
});
});

@@ -1,3 +1,18 @@
export default function Page() {
return <p>custom 404</p>;
export default function Page(props) {
return (
<>
<p>custom 404</p>
<p id="props">{JSON.stringify(props)}</p>
</>
);
}
export function getStaticProps() {
console.log('pages/404 getStaticProps');
return {
props: {
is404: true,
time: Date.now(),
},
};
}

@@ -0,0 +1,21 @@
export default function Page(props) {
return (
<>
<p>/ssg/[slug]</p>
<p>{JSON.stringify(props)}</p>
</>
);
}
export function getStaticProps() {
return {
notFound: true,
};
}
export function getStaticPaths() {
return {
paths: ['/ssg/first', '/ssg/second'],
fallback: 'blocking',
};
}

@@ -1,7 +1,20 @@
{
"version": 2,
"builds": [{ "src": "package.json", "use": "@vercel/next" }],
"probes": [
{
"path": "/ssg/first/",
"status": 404,
"mustContain": "This page could not be found"
},
{
"path": "/ssg/second/",
"status": 404,
"mustContain": "This page could not be found"
},
{
"path": "/ssg/third/",
"status": 404,
"mustContain": "This page could not be found"
},
{ "path": "/foo/", "status": 200, "mustContain": "foo page" },
{
"fetchOptions": { "redirect": "manual" },

@@ -1,5 +0,0 @@
module.exports = {
generateBuildId() {
return 'testing-build-id';
},
};

@@ -1,11 +0,0 @@
{
"engines": {
"node": "12.x"
},
"dependencies": {
"next": "canary",
"react": "^16.8.6",
"react-dom": "^16.8.6",
"firebase": "6.3.4"
}
}

@@ -1,19 +0,0 @@
import firebase from 'firebase/app';
import 'firebase/firestore';
if (!firebase.apps.length) {
firebase.initializeApp({ projectId: 'noop' });
}
const store = firebase.firestore();
const Comp = ({ results }) => {
return <div>Hello Firebase: {results}</div>;
};
Comp.getInitialProps = async () => {
const query = await store.collection('users').get();
return { results: query.size };
};
export default Comp;

@@ -1,19 +0,0 @@
import firebase from 'firebase/app';
import 'firebase/firestore';
if (!firebase.apps.length) {
firebase.initializeApp({ projectId: 'noop' });
}
const store = firebase.firestore();
const Comp = ({ results }) => {
return <div>Hello Firebase: {results}</div>;
};
Comp.getInitialProps = async () => {
const query = await store.collection('users').get();
return { results: query.size };
};
export default Comp;

@@ -1,8 +0,0 @@
{
"version": 2,
"builds": [{ "src": "package.json", "use": "@vercel/next" }],
"probes": [
{ "path": "/nested/fb", "mustContain": "Hello Firebase: <!-- -->0" },
{ "path": "/nested/moar/fb", "mustContain": "Hello Firebase: <!-- -->0" }
]
}

@@ -36,7 +36,10 @@ it('should build with app-dir correctly', async () => {
);
});
it('should build with app-dir in edg runtime correctly', async () => {
// TODO: re-enable after edge build failure is fixed in Next.js
// Disabled Oct, 1st 2022
// eslint-disable-next-line jest/no-disabled-tests
it.skip('should build with app-dir in edge runtime correctly', async () => {
const { buildResult } = await runBuildLambda(
path.join(__dirname, '../fixtures/00-app-dir-edge')
);
@@ -315,9 +318,9 @@ it('Should build the gip-gsp-404 example', async () => {
expect(routes[handleErrorIdx + 1].dest).toBe('/404');
expect(routes[handleErrorIdx + 1].headers).toBe(undefined);
expect(output['404']).toBeDefined();
expect(output['404'].type).toBe('FileFsRef');
expect(output['404'].type).toBe('Prerender');
expect(output['_next/data/testing-build-id/404.json']).toBeDefined();
expect(output['_next/data/testing-build-id/404.json'].type).toBe('FileFsRef');
expect(output['_next/data/testing-build-id/404.json'].type).toBe('Prerender');
const filePaths = Object.keys(output);
const serverlessError = filePaths.some(filePath => filePath.match(/_error/));
const hasUnderScoreAppStaticFile = filePaths.some(filePath =>

@@ -0,0 +1,3 @@
{
"foo": "bar"
}

@@ -1,7 +1,7 @@
const fs = require('fs-extra');
const ms = require('ms');
const path = require('path');
const { build } = require('../../../../src');
const { build } = require('../../../../dist');
const { FileFsRef } = require('@vercel/build-utils');
jest.setTimeout(ms('6m'));
@@ -12,6 +12,7 @@ describe(`${__dirname.split(path.sep).pop()}`, () => {
'index.test.js',
'next.config.js',
'package.json',
'data/strings.json',
'pages/foo/bar/index.js',
'pages/foo/index.js',
'pages/index.js',
@@ -38,9 +39,15 @@ describe(`${__dirname.split(path.sep).pop()}`, () => {
for (const page of pages) {
expect(output).toHaveProperty(page);
expect(path.resolve(output[page].fsPath)).toEqual(
path.join(pagesDir, `${page}.html`)
);
if (page === 'index') {
const { files, type } = output[page];
expect(type).toEqual('Lambda');
expect(files).toHaveProperty([path.join('data', 'strings.json')]);
} else {
expect(path.resolve(output[page].fsPath)).toEqual(
path.join(pagesDir, `${page}.html`)
);
}
}
for (const route of routes) {

@@ -1,7 +1,18 @@
export default function Page() {
import fs from 'fs';
import path from 'path';
export default function Page({ foo }) {
return (
<>
<p>hello from pages</p>
<p>hello from pages {foo}</p>
</>
);
}
export async function getServerSideProps() {
const dataFile = path.join(process.cwd(), 'data', 'strings.json');
const strings = JSON.parse(fs.readFileSync(dataFile));
return {
props: strings,
};
}

@@ -3,6 +3,13 @@ const fs = require('fs-extra');
const execa = require('execa');
const { join } = require('path');
async function copyToDist(sourcePath, outDir) {
return fs.copyFile(
join(__dirname, sourcePath),
join(outDir, 'edge-functions/edge-handler-template.js')
);
}
async function main() {
const srcDir = join(__dirname, 'src');
const outDir = join(__dirname, 'dist');
@@ -50,6 +57,8 @@ async function main() {
join(outDir, 'index.d.ts'),
join(__dirname, 'test/fixtures/15-helpers/ts/types.d.ts')
);
await copyToDist('src/edge-functions/edge-handler-template.js', outDir);
}
main().catch(err => {

@@ -1,6 +1,6 @@
{
"name": "@vercel/node",
"version": "2.5.18",
"version": "2.5.22",
"license": "MIT",
"main": "./dist/index",
"homepage": "https://vercel.com/docs/runtimes#official-runtimes/node-js",
@@ -29,12 +29,12 @@
}
},
"dependencies": {
"@edge-runtime/vm": "1.1.0-beta.32",
"@edge-runtime/vm": "1.1.0-beta.36",
"@types/node": "*",
"@vercel/build-utils": "5.5.0",
"@vercel/build-utils": "5.5.4",
"@vercel/node-bridge": "3.0.0",
"@vercel/static-config": "2.0.3",
"edge-runtime": "1.1.0-beta.32",
"edge-runtime": "1.1.0-beta.37",
"esbuild": "0.14.47",
"exit-hook": "2.2.1",
"node-fetch": "2.6.7",

@@ -70,38 +70,13 @@ if (!process.env.VERCEL_DEV_IS_ESM) {
}
import { createServer, Server, IncomingMessage, ServerResponse } from 'http';
import { Readable } from 'stream';
import type { Bridge } from '@vercel/node-bridge/bridge';
import { getVercelLauncher } from '@vercel/node-bridge/launcher.js';
import { VercelProxyResponse } from '@vercel/node-bridge/types';
import { Config, streamToBuffer, debug } from '@vercel/build-utils';
import exitHook from 'exit-hook';
import { EdgeRuntime, runServer } from 'edge-runtime';
import type { EdgeContext } from '@edge-runtime/vm';
import { Config } from '@vercel/build-utils';
import { getConfig } from '@vercel/static-config';
import { Project } from 'ts-morph';
import esbuild from 'esbuild';
import fetch from 'node-fetch';
import { createEdgeWasmPlugin, WasmAssets } from './edge-wasm-plugin';
const NODE_VERSION_MAJOR = process.version.match(/^v(\d+)\.\d+/)?.[1];
const NODE_VERSION_IDENTIFIER = `node${NODE_VERSION_MAJOR}`;
if (!NODE_VERSION_MAJOR) {
throw new Error(
`Unable to determine current node version: process.version=${process.version}`
);
}
function logError(error: Error) {
console.error(error.message);
if (error.stack) {
// only show the stack trace if debug is enabled
// because it points to internals, not user code
const errorPrefixLength = 'Error: '.length;
const errorMessageLength = errorPrefixLength + error.message.length;
debug(error.stack.substring(errorMessageLength + 1));
}
}
import { logError } from './utils';
import { createEdgeEventHandler } from './edge-functions/edge-handler';
import { createServerlessEventHandler } from './serverless-functions/serverless-handler';
function listen(server: Server, port: number, host: string): Promise<void> {
return new Promise(resolve => {
@@ -111,245 +86,6 @@ function listen(server: Server, port: number, host: string): Promise<void> {
});
}
async function createServerlessEventHandler(
entrypoint: string,
options: { shouldAddHelpers: boolean }
): Promise<(request: IncomingMessage) => Promise<VercelProxyResponse>> {
const launcher = getVercelLauncher({
entrypointPath: entrypoint,
helpersPath: './helpers.js',
shouldAddHelpers: options.shouldAddHelpers,
useRequire,
// not used
bridgePath: '',
sourcemapSupportPath: '',
});
const bridge: Bridge = launcher();
return async function (request: IncomingMessage) {
const body = await rawBody(request);
const event = {
Action: 'Invoke',
body: JSON.stringify({
method: request.method,
path: request.url,
headers: request.headers,
encoding: 'base64',
body: body.toString('base64'),
}),
};
return bridge.launcher(event, {
callbackWaitsForEmptyEventLoop: false,
});
};
}
async function serializeRequest(message: IncomingMessage) {
const bodyBuffer = await streamToBuffer(message);
const body = bodyBuffer.toString('base64');
return JSON.stringify({
url: message.url,
method: message.method,
headers: message.headers,
body,
});
}
async function compileUserCode(
entrypointPath: string,
entrypointLabel: string,
isMiddleware: boolean
): Promise<undefined | { userCode: string; wasmAssets: WasmAssets }> {
const { wasmAssets, plugin: edgeWasmPlugin } = createEdgeWasmPlugin();
try {
const result = await esbuild.build({
// bundling behavior: use globals (like "browser") instead
// of "require" statements for core libraries (like "node")
platform: 'browser',
// target syntax: only use syntax available on the current
// version of node
target: NODE_VERSION_IDENTIFIER,
sourcemap: 'inline',
bundle: true,
plugins: [edgeWasmPlugin],
entryPoints: [entrypointPath],
write: false, // operate in memory
format: 'cjs',
});
const compiledFile = result.outputFiles?.[0];
if (!compiledFile) {
throw new Error(
`Compilation of ${entrypointLabel} produced no output files.`
);
}
const userCode = `
${compiledFile.text};
const isMiddleware = ${isMiddleware};
addEventListener('fetch', async (event) => {
try {
let serializedRequest = await event.request.text();
let requestDetails = JSON.parse(serializedRequest);
let body;
if (requestDetails.method !== 'GET' && requestDetails.method !== 'HEAD') {
body = Uint8Array.from(atob(requestDetails.body), c => c.charCodeAt(0));
}
let requestUrl = requestDetails.headers['x-forwarded-proto'] + '://' + requestDetails.headers['x-forwarded-host'] + requestDetails.url;
let request = new Request(requestUrl, {
headers: requestDetails.headers,
method: requestDetails.method,
body: body
});
event.request = request;
let edgeHandler = module.exports.default;
if (!edgeHandler) {
throw new Error('No default export was found. Add a default export to handle requests. Learn more: https://vercel.link/creating-edge-middleware');
}
let response = await edgeHandler(event.request, event);
if (!response) {
if (isMiddleware) {
// allow empty responses to pass through
response = new Response(null, {
headers: {
'x-middleware-next': '1',
},
});
} else {
throw new Error('Edge Function "${entrypointLabel}" did not return a response.');
}
}
return event.respondWith(response);
} catch (error) {
// we can't easily show a meaningful stack trace
// so, stick to just the error message for now
const msg = error.cause
? (error.message + ': ' + (error.cause.message || error.cause))
: error.message;
event.respondWith(new Response(msg, {
status: 500,
headers: {
'x-vercel-failed': 'edge-wrapper'
}
}));
}
})`;
return { userCode, wasmAssets };
} catch (error) {
// We can't easily show a meaningful stack trace from ncc -> edge-runtime.
// So, stick with just the message for now.
console.error(`Failed to compile user code for edge runtime.`);
logError(error);
return undefined;
}
}
async function createEdgeRuntime(params?: {
userCode: string;
wasmAssets: WasmAssets;
}) {
try {
if (!params) {
return undefined;
}
const wasmBindings = await params.wasmAssets.getContext();
const edgeRuntime = new EdgeRuntime({
initialCode: params.userCode,
extend: (context: EdgeContext) => {
Object.assign(context, {
// This is required for esbuild wrapping logic to resolve
module: {},
// This is required for environment variable access.
// In production, env var access is provided by static analysis
// so that only the used values are available.
process: {
env: process.env,
},
// These are the global bindings for WebAssembly module
...wasmBindings,
});
return context;
},
});
const server = await runServer({ runtime: edgeRuntime });
exitHook(server.close);
return server;
} catch (error) {
// We can't easily show a meaningful stack trace from ncc -> edge-runtime.
// So, stick with just the message for now.
console.error('Failed to instantiate edge runtime.');
logError(error);
return undefined;
}
}
async function createEdgeEventHandler(
entrypointPath: string,
entrypointLabel: string,
isMiddleware: boolean
): Promise<(request: IncomingMessage) => Promise<VercelProxyResponse>> {
const userCode = await compileUserCode(
entrypointPath,
entrypointLabel,
isMiddleware
);
const server = await createEdgeRuntime(userCode);
return async function (request: IncomingMessage) {
if (!server) {
// this error state is already logged, but we have to wait until here to exit the process
// this matches the serverless function bridge launcher's behavior when
// an error is thrown in the function
process.exit(1);
}
const response = await fetch(server.url, {
redirect: 'manual',
method: 'post',
body: await serializeRequest(request),
});
const body = await response.text();
const isUserError =
response.headers.get('x-vercel-failed') === 'edge-wrapper';
if (isUserError && response.status >= 500) {
// this error was "unhandled" from the user code's perspective
console.log(`Unhandled rejection: ${body}`);
// this matches the serverless function bridge launcher's behavior when
// an error is thrown in the function
process.exit(1);
}
return {
statusCode: response.status,
headers: response.headers.raw(),
body,
encoding: 'utf8',
};
};
}
const validRuntimes = ['experimental-edge'];
function parseRuntime(
entrypoint: string,
@@ -388,7 +124,10 @@ async function createEventHandler(
);
}
return createServerlessEventHandler(entrypointPath, options);
return createServerlessEventHandler(entrypointPath, {
shouldAddHelpers: options.shouldAddHelpers,
useRequire,
});
}
let handleEvent: (request: IncomingMessage) => Promise<VercelProxyResponse>;
@@ -425,21 +164,6 @@ async function main() {
}
}
export function rawBody(readable: Readable): Promise<Buffer> {
return new Promise((resolve, reject) => {
let bytes = 0;
const chunks: Buffer[] = [];
readable.on('error', reject);
readable.on('data', chunk => {
chunks.push(chunk);
bytes += chunk.length;
});
readable.on('end', () => {
resolve(Buffer.concat(chunks, bytes));
});
});
}
export async function onDevRequest(
req: IncomingMessage,
res: ServerResponse

@@ -0,0 +1,73 @@
// provided by the edge runtime:
/* global addEventListener Request Response atob */
// provided by our edge handler logic:
/* global IS_MIDDLEWARE ENTRYPOINT_LABEL */
function buildUrl(requestDetails) {
let proto = requestDetails.headers['x-forwarded-proto'];
let host = requestDetails.headers['x-forwarded-host'];
let path = requestDetails.url;
return `${proto}://${host}${path}`;
}
addEventListener('fetch', async event => {
try {
let serializedRequest = await event.request.text();
let requestDetails = JSON.parse(serializedRequest);
let body;
if (requestDetails.method !== 'GET' && requestDetails.method !== 'HEAD') {
body = Uint8Array.from(atob(requestDetails.body), c => c.charCodeAt(0));
}
let request = new Request(buildUrl(requestDetails), {
headers: requestDetails.headers,
method: requestDetails.method,
body: body,
});
event.request = request;
let edgeHandler = module.exports.default;
if (!edgeHandler) {
throw new Error(
'No default export was found. Add a default export to handle requests. Learn more: https://vercel.link/creating-edge-middleware'
);
}
let response = await edgeHandler(event.request, event);
if (!response) {
if (IS_MIDDLEWARE) {
// allow empty responses to pass through
response = new Response(null, {
headers: {
'x-middleware-next': '1',
},
});
} else {
throw new Error(
`Edge Function "${ENTRYPOINT_LABEL}" did not return a response.`
);
}
}
return event.respondWith(response);
} catch (error) {
// we can't easily show a meaningful stack trace
// so, stick to just the error message for now
const msg = error.cause
? error.message + ': ' + (error.cause.message || error.cause)
: error.message;
event.respondWith(
new Response(msg, {
status: 500,
headers: {
'x-vercel-failed': 'edge-wrapper',
},
})
);
}
});

@@ -0,0 +1,182 @@
import { IncomingMessage } from 'http';
import { VercelProxyResponse } from '@vercel/node-bridge/types';
import { streamToBuffer } from '@vercel/build-utils';
import exitHook from 'exit-hook';
import { EdgeRuntime, runServer } from 'edge-runtime';
import type { EdgeContext } from '@edge-runtime/vm';
import esbuild from 'esbuild';
import fetch from 'node-fetch';
import { createEdgeWasmPlugin, WasmAssets } from './edge-wasm-plugin';
import { logError } from '../utils';
import { readFileSync } from 'fs';
const NODE_VERSION_MAJOR = process.version.match(/^v(\d+)\.\d+/)?.[1];
const NODE_VERSION_IDENTIFIER = `node${NODE_VERSION_MAJOR}`;
if (!NODE_VERSION_MAJOR) {
throw new Error(
`Unable to determine current node version: process.version=${process.version}`
);
}
const edgeHandlerTemplate = readFileSync(
`${__dirname}/edge-handler-template.js`
);
async function serializeRequest(message: IncomingMessage) {
const bodyBuffer = await streamToBuffer(message);
const body = bodyBuffer.toString('base64');
return JSON.stringify({
url: message.url,
method: message.method,
headers: message.headers,
body,
});
}
async function compileUserCode(
entrypointPath: string,
entrypointLabel: string,
isMiddleware: boolean
): Promise<undefined | { userCode: string; wasmAssets: WasmAssets }> {
const { wasmAssets, plugin: edgeWasmPlugin } = createEdgeWasmPlugin();
try {
const result = await esbuild.build({
// bundling behavior: use globals (like "browser") instead
// of "require" statements for core libraries (like "node")
platform: 'browser',
// target syntax: only use syntax available on the current
// version of node
target: NODE_VERSION_IDENTIFIER,
sourcemap: 'inline',
legalComments: 'none',
bundle: true,
plugins: [edgeWasmPlugin],
entryPoints: [entrypointPath],
write: false, // operate in memory
format: 'cjs',
});
const compiledFile = result.outputFiles?.[0];
if (!compiledFile) {
throw new Error(
`Compilation of ${entrypointLabel} produced no output files.`
);
}
const userCode = `
// strict mode
"use strict";var regeneratorRuntime;
// user code
${compiledFile.text};
// request metadata
const IS_MIDDLEWARE = ${isMiddleware};
const ENTRYPOINT_LABEL = '${entrypointLabel}';
// edge handler
${edgeHandlerTemplate}
`;
return { userCode, wasmAssets };
} catch (error) {
// We can't easily show a meaningful stack trace from ncc -> edge-runtime.
// So, stick with just the message for now.
console.error(`Failed to compile user code for edge runtime.`);
logError(error);
return undefined;
}
}
async function createEdgeRuntime(params?: {
userCode: string;
wasmAssets: WasmAssets;
}) {
try {
if (!params) {
return undefined;
}
const wasmBindings = await params.wasmAssets.getContext();
const edgeRuntime = new EdgeRuntime({
initialCode: params.userCode,
extend: (context: EdgeContext) => {
Object.assign(context, {
// This is required for esbuild wrapping logic to resolve
module: {},
// This is required for environment variable access.
// In production, env var access is provided by static analysis
// so that only the used values are available.
process: {
env: process.env,
},
// These are the global bindings for WebAssembly module
...wasmBindings,
});
return context;
},
});
const server = await runServer({ runtime: edgeRuntime });
exitHook(server.close);
return server;
} catch (error) {
// We can't easily show a meaningful stack trace from ncc -> edge-runtime.
// So, stick with just the message for now.
console.error('Failed to instantiate edge runtime.');
logError(error);
return undefined;
}
}
export async function createEdgeEventHandler(
entrypointPath: string,
entrypointLabel: string,
isMiddleware: boolean
): Promise<(request: IncomingMessage) => Promise<VercelProxyResponse>> {
const userCode = await compileUserCode(
entrypointPath,
entrypointLabel,
isMiddleware
);
const server = await createEdgeRuntime(userCode);
return async function (request: IncomingMessage) {
if (!server) {
// this error state is already logged, but we have to wait until here to exit the process
// this matches the serverless function bridge launcher's behavior when
// an error is thrown in the function
process.exit(1);
}
const response = await fetch(server.url, {
redirect: 'manual',
method: 'post',
body: await serializeRequest(request),
});
const body = await response.text();
const isUserError =
response.headers.get('x-vercel-failed') === 'edge-wrapper';
if (isUserError && response.status >= 500) {
// this error was "unhandled" from the user code's perspective
console.log(`Unhandled rejection: ${body}`);
// this matches the serverless function bridge launcher's behavior when
// an error is thrown in the function
process.exit(1);
}
return {
statusCode: response.status,
headers: response.headers.raw(),
body,
encoding: 'utf8',
};
};
}

@@ -0,0 +1,58 @@
import { IncomingMessage } from 'http';
import { Readable } from 'stream';
import type { Bridge } from '@vercel/node-bridge/bridge';
import { getVercelLauncher } from '@vercel/node-bridge/launcher.js';
import { VercelProxyResponse } from '@vercel/node-bridge/types';
function rawBody(readable: Readable): Promise<Buffer> {
return new Promise((resolve, reject) => {
let bytes = 0;
const chunks: Buffer[] = [];
readable.on('error', reject);
readable.on('data', chunk => {
chunks.push(chunk);
bytes += chunk.length;
});
readable.on('end', () => {
resolve(Buffer.concat(chunks, bytes));
});
});
}
export async function createServerlessEventHandler(
entrypoint: string,
options: {
shouldAddHelpers: boolean;
useRequire: boolean;
}
): Promise<(request: IncomingMessage) => Promise<VercelProxyResponse>> {
const launcher = getVercelLauncher({
entrypointPath: entrypoint,
helpersPath: './helpers.js',
shouldAddHelpers: options.shouldAddHelpers,
useRequire: options.useRequire,
// not used
bridgePath: '',
sourcemapSupportPath: '',
});
const bridge: Bridge = launcher();
return async function (request: IncomingMessage) {
const body = await rawBody(request);
const event = {
Action: 'Invoke',
body: JSON.stringify({
method: request.method,
path: request.url,
headers: request.headers,
encoding: 'base64',
body: body.toString('base64'),
}),
};
return bridge.launcher(event, {
callbackWaitsForEmptyEventLoop: false,
});
};
}

@@ -1,5 +1,6 @@
import { extname } from 'path';
import { pathToRegexp } from 'path-to-regexp';
import { debug } from '@vercel/build-utils';
export function getRegExpFromMatchers(matcherOrMatchers: unknown): string {
if (!matcherOrMatchers) {
@@ -55,3 +56,14 @@ export function entrypointToOutputPath(
}
return entrypoint;
}
export function logError(error: Error) {
console.error(error.message);
if (error.stack) {
// only show the stack trace if debug is enabled
// because it points to internals, not user code
const errorPrefixLength = 'Error: '.length;
const errorMessageLength = errorPrefixLength + error.message.length;
debug(error.stack.substring(errorMessageLength + 1));
}
}

@@ -1,6 +1,6 @@
{
"name": "@vercel/python",
"version": "3.1.17",
"version": "3.1.21",
"main": "./dist/index.js",
"license": "MIT",
"homepage": "https://vercel.com/docs/runtimes#official-runtimes/python",
@@ -22,7 +22,7 @@
"devDependencies": {
"@types/execa": "^0.9.0",
"@types/jest": "27.4.1",
"@vercel/build-utils": "5.5.0",
"@vercel/build-utils": "5.5.4",
"@vercel/ncc": "0.24.0",
"execa": "^1.0.0",
"typescript": "4.3.4"

@@ -1,6 +1,6 @@
{
"name": "@vercel/redwood",
"version": "1.0.26",
"version": "1.0.30",
"main": "./dist/index.js",
"license": "MIT",
"homepage": "https://vercel.com/docs",
@@ -27,6 +27,6 @@
"@types/aws-lambda": "8.10.19",
"@types/node": "*",
"@types/semver": "6.0.0",
"@vercel/build-utils": "5.5.0"
"@vercel/build-utils": "5.5.4"
}
}

@@ -1,6 +1,6 @@
{
"name": "@vercel/remix",
"version": "1.0.27",
"version": "1.0.31",
"license": "MIT",
"main": "./dist/index.js",
"homepage": "https://vercel.com/docs",
@@ -25,7 +25,7 @@
"devDependencies": {
"@types/jest": "27.5.1",
"@types/node": "*",
"@vercel/build-utils": "5.5.0",
"@vercel/build-utils": "5.5.4",
"typescript": "4.6.4"
}
}

@@ -1,7 +1,7 @@
{
"name": "@vercel/ruby",
"author": "Nathan Cahill <nathan@nathancahill.com>",
"version": "1.3.34",
"version": "1.3.38",
"license": "MIT",
"main": "./dist/index",
"homepage": "https://vercel.com/docs/runtimes#official-runtimes/ruby",
@@ -22,7 +22,7 @@
"devDependencies": {
"@types/fs-extra": "8.0.0",
"@types/semver": "6.0.0",
"@vercel/build-utils": "5.5.0",
"@vercel/build-utils": "5.5.4",
"@vercel/ncc": "0.24.0",
"execa": "2.0.4",
"fs-extra": "^7.0.1",

@@ -1,6 +1,6 @@
{
"name": "@vercel/static-build",
"version": "1.0.26",
"version": "1.0.30",
"license": "MIT",
"main": "./dist/index",
"homepage": "https://vercel.com/docs/build-step",
@@ -36,7 +36,7 @@
"@types/ms": "0.7.31",
"@types/node-fetch": "2.5.4",
"@types/promise-timeout": "1.3.0",
"@vercel/build-utils": "5.5.0",
"@vercel/build-utils": "5.5.4",
"@vercel/frameworks": "1.1.6",
"@vercel/ncc": "0.24.0",
"@vercel/routing-utils": "2.0.2",

@@ -3,13 +3,13 @@
"builds": [
{ "src": "some-build.sh", "use": "@vercel/static-build" },
{ "src": "node14sh/build.sh", "use": "@vercel/static-build" },
{ "src": "node12sh/build.sh", "use": "@vercel/static-build" },
{ "src": "node16sh/build.sh", "use": "@vercel/static-build" },
{ "src": "subdirectory/some-build.sh", "use": "@vercel/static-build" }
],
"probes": [
{ "path": "/", "mustContain": "cow:RANDOMNESS_PLACEHOLDER" },
{ "path": "/node14sh/", "mustContain": "node:v14" },
{ "path": "/node12sh/", "mustContain": "node:v12" },
{ "path": "/node16sh/", "mustContain": "node:v16" },
{ "path": "/subdirectory/", "mustContain": "yoda:RANDOMNESS_PLACEHOLDER" }
]
}

@@ -10,7 +10,7 @@
"e2e": "ng e2e"
},
"engines": {
"node": "12.x"
"node": "14.x"
},
"private": true,
"dependencies": {

@@ -719,10 +719,10 @@
dependencies:
"@jridgewell/trace-mapping" "0.3.9"
"@edge-runtime/format@^1.1.0-beta.32":
version "1.1.0-beta.32"
resolved "https://registry.yarnpkg.com/@edge-runtime/format/-/format-1.1.0-beta.32.tgz#6f0d5a8726cc54ebb2b1dcb86fe25c0ff093731d"
integrity sha512-wpQtbgHJuSF1fvDV6Gxg2L7uNpzhQnRl91DREgOdRfa3S8y9AANjPi1/g3GPBGPiGZoX2Fv+MKV6RgY5/jLfJA==
"@edge-runtime/format@1.1.0-beta.33":
version "1.1.0-beta.33"
resolved "https://registry.yarnpkg.com/@edge-runtime/format/-/format-1.1.0-beta.33.tgz#bfba333b47167b0deb09b3df0d5e60d8c72d7c62"
integrity sha512-t34oTdZOqYSiguCGnt9GYzh9mrnhCHNRPGDvxt5PB5T3LZpSVk+vfSXRqpvTxy51sxQpxvTZry8QLC+E+Fm67w==
"@edge-runtime/jest-environment@1.1.0-beta.7":
version "1.1.0-beta.7"
@@ -736,22 +736,22 @@
jest-mock "28.1.1"
jest-util "28.1.1"
"@edge-runtime/primitives@^1.1.0-beta.32":
version "1.1.0-beta.33"
resolved "https://registry.yarnpkg.com/@edge-runtime/primitives/-/primitives-1.1.0-beta.33.tgz#9bd0d866addfdc98ec32ff3ca0f6d24f412a5c03"
integrity sha512-mAZw/YRhwkaPVYwSwOTJTMMzZxfuLze6VEepsrVO/4yjnxriOf2GREgLal6OBtTcEEC44q4lqS+OSd0QaSFZEQ==
"@edge-runtime/primitives@1.1.0-beta.36":
version "1.1.0-beta.36"
resolved "https://registry.yarnpkg.com/@edge-runtime/primitives/-/primitives-1.1.0-beta.36.tgz#f6e8d7971d515d03fcb2ec59e8cb65564fe80440"
integrity sha512-Tji7SGWmn1+JGSnzFtWUoS7+kODIFprTyIAw0EBOVWEQKWfs7r0aTEm1XkJR0+d1jP9f0GB5LBKG/Z7KFyhx7g==
"@edge-runtime/primitives@^1.1.0-beta.7":
version "1.1.0-beta.7"
resolved "https://registry.yarnpkg.com/@edge-runtime/primitives/-/primitives-1.1.0-beta.7.tgz#0450ee3e5e03a8898ee072c0d0ee01fd2c1ed8f1"
integrity sha512-ZwuSMpmrf2mAj/O7EWxKOXrC03YMkU64N+CgvVFOtJGfhydk4Db/392Zama3BjNYAMOr/oY9L7HxfPutAFesKw==
"@edge-runtime/vm@1.1.0-beta.32", "@edge-runtime/vm@^1.1.0-beta.32":
version "1.1.0-beta.32"
resolved "https://registry.yarnpkg.com/@edge-runtime/vm/-/vm-1.1.0-beta.32.tgz#1bc9c77a88343478d50009f30813b9fbf8a0f4ad"
integrity sha512-G0SH80am2XjlK6oFI3RoKlg1SBS5ZAeqakYasfNhJEXqM7g7tsoh5jURMQcNxpLvo48XBTgHgAVEMzhAGgDPZg==
"@edge-runtime/vm@1.1.0-beta.36":
version "1.1.0-beta.36"
resolved "https://registry.yarnpkg.com/@edge-runtime/vm/-/vm-1.1.0-beta.36.tgz#c5bd6d3823ec252f15fb97c5f65e94084100800c"
integrity sha512-uPZmL7X+lKBFJsTg8nC0qPDBx4JGgpRqlgJi2s77g2NOtqitQOI90BfIKHZSSoMQEwTqfvAkpu2ui8nazOwHxA==
dependencies:
"@edge-runtime/primitives" "^1.1.0-beta.32"
"@edge-runtime/primitives" "1.1.0-beta.36"
"@edge-runtime/vm@^1.1.0-beta.7":
version "1.1.0-beta.7"
@@ -5473,15 +5473,15 @@ ecc-jsbn@~0.1.1:
jsbn "~0.1.0"
safer-buffer "^2.1.0"
edge-runtime@1.1.0-beta.32:
version "1.1.0-beta.32"
resolved "https://registry.yarnpkg.com/edge-runtime/-/edge-runtime-1.1.0-beta.32.tgz#e43fd53c57fdba3c567b3fef50743cba00ff5e49"
integrity sha512-fbqqUF3OKynqtWgExhjyxXX2SwbkWuwmjUYhml3Sv8Y/vkrTxyTKrxS0MoxUy5sGPB3BBEtpopn36cQgwlOpAg==
edge-runtime@1.1.0-beta.37:
version "1.1.0-beta.37"
resolved "https://registry.yarnpkg.com/edge-runtime/-/edge-runtime-1.1.0-beta.37.tgz#e88eafba4f3630990468bb53efca6f48d76063c4"
integrity sha512-IP0xYNmp0XXoXVnrAf/e67224ZkMUUBMyUUohVxWWI5XdyetIGRNWp3GifDy3LpbuE02yv42rgtoE+tm+whcLA==
dependencies:
"@edge-runtime/format" "^1.1.0-beta.32"
"@edge-runtime/vm" "^1.1.0-beta.32"
"@edge-runtime/format" "1.1.0-beta.33"
"@edge-runtime/vm" "1.1.0-beta.36"
exit-hook "2.2.1"
http-status "1.5.2"
http-status "1.5.3"
mri "1.2.0"
picocolors "1.0.0"
pretty-bytes "5.6.0"
@@ -7176,10 +7176,10 @@ http-signature@~1.2.0:
jsprim "^1.2.2"
sshpk "^1.7.0"
http-status@1.5.2:
version "1.5.2"
resolved "https://registry.yarnpkg.com/http-status/-/http-status-1.5.2.tgz#22e6ea67b4b5e2a366f49cb1759dc5479cad2fd6"
integrity sha512-HzxX+/hV/8US1Gq4V6R6PgUmJ5Pt/DGATs4QhdEOpG8LrdS9/3UG2nnOvkqUpRks04yjVtV5p/NODjO+wvf6vg==
http-status@1.5.3:
version "1.5.3"
resolved "https://registry.yarnpkg.com/http-status/-/http-status-1.5.3.tgz#9d1f6adcd1a609f535679f6e1b82811b96c3306e"
integrity sha512-jCClqdnnwigYslmtfb28vPplOgoiZ0siP2Z8C5Ua+3UKbx410v+c+jT+jh1bbI4TvcEySuX0vd/CfFZFbDkJeQ==
https-proxy-agent@2.2.1:
version "2.2.1"
@@ -8480,6 +8480,11 @@ json5@2.1.1:
dependencies:
minimist "^1.2.0"
json5@2.2.1, json5@^2.2.1:
version "2.2.1"
resolved "https://registry.yarnpkg.com/json5/-/json5-2.2.1.tgz#655d50ed1e6f95ad1a3caababd2b0efda10b395c"
integrity sha512-1hqLFMSrGHRHxav9q9gNjJ5EXznIxGVO09xQRrwplcS8qs28pZ8s8hupZAmqDwZUmVZ2Qb2jnyPOWcDH8m8dlA==
json5@^2.1.0, json5@^2.1.2:
version "2.1.3"
resolved "https://registry.yarnpkg.com/json5/-/json5-2.1.3.tgz#c9b0f7fa9233bfe5807fe66fcf3a5617ed597d43"
@@ -8487,11 +8492,6 @@ json5@^2.1.0, json5@^2.1.2:
dependencies:
minimist "^1.2.5"
json5@^2.2.1:
version "2.2.1"
resolved "https://registry.yarnpkg.com/json5/-/json5-2.2.1.tgz#655d50ed1e6f95ad1a3caababd2b0efda10b395c"
integrity sha512-1hqLFMSrGHRHxav9q9gNjJ5EXznIxGVO09xQRrwplcS8qs28pZ8s8hupZAmqDwZUmVZ2Qb2jnyPOWcDH8m8dlA==
jsonc-parser@^3.0.0:
version "3.1.0"
resolved "https://registry.yarnpkg.com/jsonc-parser/-/jsonc-parser-3.1.0.tgz#73b8f0e5c940b83d03476bc2e51a20ef0932615d"