Compare commits


13 Commits

Author SHA1 Message Date
Vercel Release Bot
6f3ae1a0ed Version Packages (#10836)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2023-11-15 23:19:35 -06:00
Nathan Rajlich
88da7463ce [build-utils] Remove Node v20 env var check (#10834)
It's safe to remove this env var check because #10822 makes explicit which versions of Node are actually present in the build environment.
2023-11-15 22:04:36 +00:00
Ethan Arrowood
142aa55a5d Add .node-version file (#10820)
This is a small QOL change. Essentially, if you have a tool like `fnm` on your machine for using multiple versions of Node.js, it will look for files like this one and automatically switch your terminal session's Node version to the value in it.

The `vercel/front` repo uses this too
2023-11-15 20:12:05 +00:00
Vercel Release Bot
7bc8b65d13 Version Packages (#10832)
This PR was opened by the [Changesets release](https://github.com/changesets/action) GitHub action. When you're ready to do a release, you can merge this and the packages will be published to npm automatically. If you're not ready to do a release yet, that's fine, whenever you add more changesets to main, this PR will be updated.


# Releases
## @vercel/build-utils@7.2.4

### Patch Changes

-   Select Node.js version based on what's available in build-container ([#10822](https://github.com/vercel/vercel/pull/10822))

## vercel@32.5.4

### Patch Changes

-   Updated dependencies \[[`65dec5b7e`](65dec5b7e7)]:
    -   @vercel/build-utils@7.2.4
    -   @vercel/node@3.0.10
    -   @vercel/static-build@2.0.11

## @vercel/client@13.0.8

### Patch Changes

-   Updated dependencies \[[`65dec5b7e`](65dec5b7e7)]:
    -   @vercel/build-utils@7.2.4

## @vercel/gatsby-plugin-vercel-builder@2.0.10

### Patch Changes

-   Updated dependencies \[[`65dec5b7e`](65dec5b7e7)]:
    -   @vercel/build-utils@7.2.4

## @vercel/node@3.0.10

### Patch Changes

-   Updated dependencies \[[`65dec5b7e`](65dec5b7e7)]:
    -   @vercel/build-utils@7.2.4

## @vercel/static-build@2.0.11

### Patch Changes

-   Updated dependencies \[]:
    -   @vercel/gatsby-plugin-vercel-builder@2.0.10

## @vercel-internals/types@1.0.15

### Patch Changes

-   Updated dependencies \[[`65dec5b7e`](65dec5b7e7)]:
    -   @vercel/build-utils@7.2.4
2023-11-15 00:45:19 +00:00
Nathan Rajlich
65dec5b7e7 [build-utils] Select Node.js version based on what's available in build-container (#10822)
Makes `getLatestNodeVersion()` and `getSupportedNodeVersion()` (and thus, by extension, `getNodeVersion()`) aware of the available Node.js versions when running inside the build-container. This addresses the situation with the new AL2023 build-container image, which has different Node versions installed compared to the existing AL2 build image.

### Project Settings `20.x` with package.json `"engines": "18.x"`

If the Project Settings Node Version is set to `20.x`, but the package.json has `"engines": "18.x"`, then the build will fail like so, because Node 18 is not present on the AL2023 build image:

<img width="1044" alt="Screenshot 2023-11-09 at 1 25 41 PM" src="https://github.com/vercel/vercel/assets/71256/572c544b-6574-4eb1-98f7-787075a60000">

### Project Settings `18.x` with package.json `"engines": "20.x"`

If the Project Settings Node Version is set to `18.x`, but the package.json has `"engines": "20.x"`, then the build will fail like so, because Node 20 is not present on the AL2 build image:

<img width="1042" alt="Screenshot 2023-11-09 at 1 34 43 PM" src="https://github.com/vercel/vercel/assets/71256/c6a2d955-9453-4ef5-a99d-b49a6d9af766">

### Project Settings `18.x` with no package.json `"engines"`

If the Project Settings Node Version is set to `18.x`, but the package.json has no `"engines"` (and thus wants "latest"), then the latest available Node version in the build-container is used, which would be Node 18.
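
As a rough illustration of the new behavior (mirroring the tests added in this PR; the import path below is illustrative), the selection helpers now accept an optional list of the Node.js majors actually installed in the build-container:

```ts
// Sketch based on the tests in this PR. The import path is illustrative;
// these helpers live in @vercel/build-utils' node-version module.
import {
  getLatestNodeVersion,
  getSupportedNodeVersion,
} from '@vercel/build-utils';

async function demo(): Promise<void> {
  // AL2 build-container (Node 20 not installed): "latest" resolves to 18.
  console.log(getLatestNodeVersion([14, 16, 18])); // { major: 18, range: '18.x', ... }

  // Without the argument (local `vc build`, unit tests), the newest known
  // version is returned as a fallback.
  console.log(getLatestNodeVersion()); // { major: 20, range: '20.x', ... }

  // AL2023 build-container (only Node 20 installed): "engines": "18.x" is
  // rejected with BUILD_UTILS_NODE_VERSION_INVALID and a hint to use 20.x.
  try {
    await getSupportedNodeVersion('18.x', true, [20]);
  } catch (err) {
    console.error((err as Error).message);
  }
}
```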
2023-11-14 23:08:27 +00:00
Brooks Lybrand
f3b62d8ea2 chore: Update Remix example README (#10830)
I'm not sure of the best spot to put this command in the README, but I thought having a `create-remix` command might be useful for people trying to get going.

If there's a preferred way to scaffold this project, I'm open to that; I'm just using what I know.
2023-11-13 20:10:57 +00:00
Wyatt Johnson
9472c22bf7 [next] Added more PPR tests (#10823)
This adds additional regression tests for PPR support for Next.js.
2023-11-09 23:45:35 +00:00
Vercel Release Bot
493185709a Version Packages (#10809)
This PR was opened by the [Changesets release](https://github.com/changesets/action) GitHub action. When you're ready to do a release, you can merge this and the packages will be published to npm automatically. If you're not ready to do a release yet, that's fine, whenever you add more changesets to main, this PR will be updated.


# Releases
## vercel@32.5.3

### Patch Changes

- Handle `TooManyProjects` error in places where projects are created ([#10807](https://github.com/vercel/vercel/pull/10807))

- Updated dependencies \[[`89c1e0323`](89c1e03233), [`fd29b966d`](fd29b966d3)]:
    -   @vercel/node@3.0.9
    -   @vercel/next@4.0.14

## @vercel/next@4.0.14

### Patch Changes

- Fixed headers for static routes when PPR is enabled ([#10808](https://github.com/vercel/vercel/pull/10808))

## @vercel/node@3.0.9

### Patch Changes

- Replace usage of `fetch` with `undici.request` ([#10767](https://github.com/vercel/vercel/pull/10767))

Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2023-11-08 10:36:37 -07:00
Ethan Arrowood
89c1e03233 [node] swap undici.fetch for undici.request in serverless-handler.mts (#10767)
In a recent undici update, setting the `host` header for fetch requests became invalid (https://github.com/nodejs/undici/pull/2322). 

We relied on this in order to proxy serverless dev server requests via `@vercel/node`. 

This PR replaces the usage of `undici.fetch` with `undici.request`. 

It is blocked by an `undici` type change: https://github.com/nodejs/undici/pull/2373
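
For context, this is roughly the kind of call the swap enables: `undici.request` still accepts an explicit `host` header, which newer `undici.fetch` no longer honors (the URL and host values below are made up):

```ts
import { request } from 'undici';

// Forward a dev-server request while preserving the original Host header.
// After nodejs/undici#2322, `fetch` treats `host` as a forbidden header;
// the lower-level `request` API does not.
const { statusCode, headers, body } = await request(
  'http://127.0.0.1:3000/api/hello', // hypothetical local dev server
  {
    method: 'GET',
    headers: {
      host: 'my-project.localhost:3000', // hypothetical forwarded host
    },
  }
);

console.log(statusCode, headers['content-type']);
for await (const chunk of body) {
  process.stdout.write(chunk);
}
```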
2023-11-08 17:31:03 +00:00
Zach Ward
ebd7e3ac39 Project limits error handling (#10807)
This PR adds improved error handling for the 200 project limit error that will start being returned for free-tier teams/accounts. The following changes have been made:
- improve error message format by using `client.output.prettyError` so that the docs link (https://vercel.com/docs/limits/overview#general-limits) returned with the error response is included with the error message
- add explicit error handling of this error in all places where `createProject` is called (the pattern is sketched after this list), which includes the following commands:
  - `vc project add`
  - `vc link` (indirectly called via `ensureLink`)
  - `vc list` (indirectly called via `ensureLink`)
  - `vc git connect` (indirectly called via `ensureLink`)
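
The handling added at each of these call sites follows the same shape; here is a condensed sketch using the CLI's own helpers (import paths as they appear in the diff further down):

```ts
// Condensed sketch of the pattern added in this PR; `Client`, `createProject`,
// and `isAPIError` are the CLI's existing internals.
import Client from '../../util/client';
import { isAPIError } from '../../util/errors-ts';
import createProject from '../../util/projects/create-project';

async function addProject(client: Client, name: string): Promise<number> {
  try {
    await createProject(client, { name });
  } catch (err: unknown) {
    // The API returns `too_many_projects` once the 200-project limit is hit;
    // prettyError() prints the message together with the docs link that comes
    // back with the error response.
    if (isAPIError(err) && err.code === 'too_many_projects') {
      client.output.prettyError(err);
      return 1;
    }
    throw err;
  }
  return 0;
}
```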

### Testing
- sign in to a Vercel account that is associated with your work email (ends in `@vercel.com`); this is necessary for creating a team with the proper conditions to artificially trigger the error message
- create a Pro Trial team and make sure to prefix the name with `vtest314 too many projects `, for example `vtest314 too many projects test 1`
- check out this branch and cd to `vercel/vercel/packages/cli`
- run: `pnpm dev add [project-name] --cwd=/path/to/some/project`
- the project should fail to be created and you should see the expected error message (screenshot below) in the terminal output

**Screenshot of error message when attempting to add project from cli**
<img width="798" alt="image"
src="https://github.com/vercel/vercel/assets/14896430/43e6ac2c-ae1c-4367-8d57-0aeb7fbddf33">

---------

Co-authored-by: Nathan Rajlich <n@n8.io>
2023-11-08 11:48:12 -05:00
Wyatt Johnson
fd29b966d3 tests: added tests for PPR (#10808)
This adds some tests to the PPR implementation for Next.js. It also fixes a bug where static pages were incorrectly generating a header that falsely indicated that they had postponed.
2023-11-08 09:21:51 -07:00
Vercel Release Bot
2bd9216403 Version Packages (#10805)
This PR was opened by the [Changesets release](https://github.com/changesets/action) GitHub action. When you're ready to do a release, you can merge this and the packages will be published to npm automatically. If you're not ready to do a release yet, that's fine, whenever you add more changesets to main, this PR will be updated.


# Releases
## vercel@32.5.2

### Patch Changes

- Updated dependencies \[[`c94a082f6`](c94a082f6b)]:
    -   @vercel/next@4.0.13

## @vercel/next@4.0.13

### Patch Changes

- Added `getRequestHandlerWithMetadata` export ([#10753](https://github.com/vercel/vercel/pull/10753))

Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2023-11-07 12:20:58 -07:00
Wyatt Johnson
c94a082f6b Added getRequestHandlerWithMetadata export (#10753)
This adds a new `getRequestHandlerWithMetadata` export when it is enabled and available on the underlying Next.js server.

---------

Co-authored-by: Joe Haddad <timer@vercel.com>
Co-authored-by: JJ Kasper <jj@jjsweb.site>
2023-11-07 10:32:03 -08:00
184 changed files with 1962 additions and 268 deletions

.node_version (new file)

@@ -0,0 +1 @@
v16.20.2

.nvmrc (new file)

@@ -0,0 +1 @@
16.20.2

View File

@@ -2,6 +2,12 @@
This directory is a brief example of a [Remix](https://remix.run/docs) site that can be deployed to Vercel with zero configuration.
To get started, run the Remix cli with this template
```sh
npx create-remix@latest --template vercel/vercel/examples/remix
```
## Deploy Your Own
[![Deploy with Vercel](https://vercel.com/button)](https://vercel.com/new/clone?repository-url=https://github.com/vercel/vercel/tree/main/examples/remix&template=remix)

View File

@@ -1,5 +1,19 @@
# @vercel-internals/types
## 1.0.16
### Patch Changes
- Updated dependencies [[`88da7463c`](https://github.com/vercel/vercel/commit/88da7463ce12df91d49fbde85cb617030d55f558)]:
- @vercel/build-utils@7.2.5
## 1.0.15
### Patch Changes
- Updated dependencies [[`65dec5b7e`](https://github.com/vercel/vercel/commit/65dec5b7e752f4da8fe0ffdb25215170453f6f8b)]:
- @vercel/build-utils@7.2.4
## 1.0.14
### Patch Changes

View File

@@ -428,7 +428,8 @@ export type ProjectLinkedError = {
| 'TEAM_DELETED'
| 'PATH_IS_FILE'
| 'INVALID_ROOT_DIRECTORY'
| 'MISSING_PROJECT_SETTINGS';
| 'MISSING_PROJECT_SETTINGS'
| 'TOO_MANY_PROJECTS';
};
export type ProjectLinkResult =

View File

@@ -1,7 +1,7 @@
{
"private": true,
"name": "@vercel-internals/types",
"version": "1.0.14",
"version": "1.0.16",
"types": "index.d.ts",
"main": "index.d.ts",
"files": [
@@ -10,7 +10,7 @@
"dependencies": {
"@types/node": "14.14.31",
"@vercel-internals/constants": "1.0.4",
"@vercel/build-utils": "7.2.3",
"@vercel/build-utils": "7.2.5",
"@vercel/routing-utils": "3.1.0"
},
"devDependencies": {

View File

@@ -1,5 +1,17 @@
# @vercel/build-utils
## 7.2.5
### Patch Changes
- Remove Node.js v20 env var check ([#10834](https://github.com/vercel/vercel/pull/10834))
## 7.2.4
### Patch Changes
- Select Node.js version based on what's available in build-container ([#10822](https://github.com/vercel/vercel/pull/10822))
## 7.2.3
### Patch Changes

View File

@@ -1,6 +1,6 @@
{
"name": "@vercel/build-utils",
"version": "7.2.3",
"version": "7.2.5",
"license": "Apache-2.0",
"main": "./dist/index.js",
"types": "./dist/index.d.js",

View File

@@ -1,10 +1,14 @@
import { statSync } from 'fs';
import { intersects, validRange } from 'semver';
import { NodeVersion } from '../types';
import { NowBuildError } from '../errors';
import debug from '../debug';
export type NodeVersionMajor = ReturnType<typeof getOptions>[number]['major'];
function getOptions() {
const options = [
{ major: 20, range: '20.x', runtime: 'nodejs20.x' },
{ major: 18, range: '18.x', runtime: 'nodejs18.x' },
{
major: 16,
@@ -37,24 +41,47 @@ function getOptions() {
discontinueDate: new Date('2020-01-06'),
},
] as const;
if (process.env.VERCEL_ALLOW_NODEJS20 === '1') {
return [
{ major: 20, range: '20.x', runtime: 'nodejs20.x' },
...options,
] as const;
}
return options;
}
function getHint(isAuto = false) {
const { major, range } = getLatestNodeVersion();
function isNodeVersionAvailable(version: NodeVersion): boolean {
try {
return statSync(`/node${version.major}`).isDirectory();
} catch {
// ENOENT, or any other error, we don't care about
}
return false;
}
export function getAvailableNodeVersions(): NodeVersionMajor[] {
return getOptions()
.filter(isNodeVersionAvailable)
.map(n => n.major);
}
function getHint(isAuto = false, availableVersions?: NodeVersionMajor[]) {
const { major, range } = getLatestNodeVersion(availableVersions);
return isAuto
? `Please set Node.js Version to ${range} in your Project Settings to use Node.js ${major}.`
: `Please set "engines": { "node": "${range}" } in your \`package.json\` file to use Node.js ${major}.`;
}
export function getLatestNodeVersion() {
return getOptions()[0];
export function getLatestNodeVersion(availableVersions?: NodeVersionMajor[]) {
const all = getOptions();
if (availableVersions) {
// Return the first node version that is definitely
// available in the build-container.
for (const version of all) {
for (const major of availableVersions) {
if (version.major === major) {
return version;
}
}
}
}
// As a fallback for local `vc build` and the tests,
// return the first node version if none is found.
return all[0];
}
export function getDiscontinuedNodeVersions(): NodeVersion[] {
@@ -63,9 +90,10 @@ export function getDiscontinuedNodeVersions(): NodeVersion[] {
export async function getSupportedNodeVersion(
engineRange: string | undefined,
isAuto = false
isAuto = false,
availableVersions?: NodeVersionMajor[]
): Promise<NodeVersion> {
let selection: NodeVersion = getLatestNodeVersion();
let selection: NodeVersion | undefined;
if (engineRange) {
const found =
@@ -74,19 +102,29 @@ export async function getSupportedNodeVersion(
// the array is already in order so return the first
// match which will be the newest version of node
selection = o;
return intersects(o.range, engineRange);
return (
intersects(o.range, engineRange) &&
(availableVersions?.length
? availableVersions.includes(o.major)
: true)
);
});
if (!found) {
throw new NowBuildError({
code: 'BUILD_UTILS_NODE_VERSION_INVALID',
link: 'http://vercel.link/node-version',
message: `Found invalid Node.js Version: "${engineRange}". ${getHint(
isAuto
isAuto,
availableVersions
)}`,
});
}
}
if (!selection) {
selection = getLatestNodeVersion(availableVersions);
}
if (isDiscontinued(selection)) {
const intro = `Node.js Version "${selection.range}" is discontinued and must be upgraded.`;
throw new NowBuildError({

View File

@@ -9,7 +9,11 @@ import { deprecate } from 'util';
import debug from '../debug';
import { NowBuildError } from '../errors';
import { Meta, PackageJson, NodeVersion, Config } from '../types';
import { getSupportedNodeVersion, getLatestNodeVersion } from './node-version';
import {
getSupportedNodeVersion,
getLatestNodeVersion,
getAvailableNodeVersions,
} from './node-version';
import { readConfigFile } from './read-config-file';
import { cloneEnv } from '../clone-env';
@@ -238,9 +242,10 @@ export async function getNodeVersion(
destPath: string,
nodeVersionFallback = process.env.VERCEL_PROJECT_SETTINGS_NODE_VERSION,
config: Config = {},
meta: Meta = {}
meta: Meta = {},
availableVersions = getAvailableNodeVersions()
): Promise<NodeVersion> {
const latest = getLatestNodeVersion();
const latest = getLatestNodeVersion(availableVersions);
if (meta.isDev) {
// Use the system-installed version of `node` in PATH for `vercel dev`
return { ...latest, runtime: 'nodejs' };
@@ -266,7 +271,7 @@ export async function getNodeVersion(
nodeVersion = node;
isAuto = false;
}
return getSupportedNodeVersion(nodeVersion, isAuto);
return getSupportedNodeVersion(nodeVersion, isAuto, availableVersions);
}
export async function scanParentDirs(

View File

@@ -60,7 +60,7 @@ it('should only match supported node versions, otherwise throw an error', async
);
const autoMessage =
'Please set Node.js Version to 18.x in your Project Settings to use Node.js 18.';
'Please set Node.js Version to 20.x in your Project Settings to use Node.js 20.';
await expectBuilderError(
getSupportedNodeVersion('8.11.x', true),
autoMessage
@@ -80,7 +80,7 @@ it('should only match supported node versions, otherwise throw an error', async
);
const foundMessage =
'Please set "engines": { "node": "18.x" } in your `package.json` file to use Node.js 18.';
'Please set "engines": { "node": "20.x" } in your `package.json` file to use Node.js 20.';
await expectBuilderError(
getSupportedNodeVersion('8.11.x', false),
foundMessage
@@ -101,8 +101,8 @@ it('should match all semver ranges', async () => {
// See https://docs.npmjs.com/files/package.json#engines
expect(await getSupportedNodeVersion('16.0.0')).toHaveProperty('major', 16);
expect(await getSupportedNodeVersion('16.x')).toHaveProperty('major', 16);
expect(await getSupportedNodeVersion('>=10')).toHaveProperty('major', 18);
expect(await getSupportedNodeVersion('>=10.3.0')).toHaveProperty('major', 18);
expect(await getSupportedNodeVersion('>=10')).toHaveProperty('major', 20);
expect(await getSupportedNodeVersion('>=10.3.0')).toHaveProperty('major', 20);
expect(await getSupportedNodeVersion('16.5.0 - 16.9.0')).toHaveProperty(
'major',
16
@@ -120,11 +120,33 @@ it('should match all semver ranges', async () => {
});
it('should allow nodejs18.x', async () => {
expect(getLatestNodeVersion()).toHaveProperty('major', 18);
expect(await getSupportedNodeVersion('18.x')).toHaveProperty('major', 18);
expect(await getSupportedNodeVersion('18')).toHaveProperty('major', 18);
expect(await getSupportedNodeVersion('18.1.0')).toHaveProperty('major', 18);
expect(await getSupportedNodeVersion('>=16')).toHaveProperty('major', 18);
});
it('should allow nodejs20.x', async () => {
expect(getLatestNodeVersion()).toHaveProperty('major', 20);
expect(await getSupportedNodeVersion('20.x')).toHaveProperty('major', 20);
expect(await getSupportedNodeVersion('20')).toHaveProperty('major', 20);
expect(await getSupportedNodeVersion('20.1.0')).toHaveProperty('major', 20);
expect(await getSupportedNodeVersion('>=18')).toHaveProperty('major', 20);
});
it('should not allow nodejs20.x when not available', async () => {
// Simulates AL2 build-container
await expect(
getSupportedNodeVersion('20.x', true, [14, 16, 18])
).rejects.toThrow(
'Found invalid Node.js Version: "20.x". Please set Node.js Version to 18.x in your Project Settings to use Node.js 18.'
);
});
it('should not allow nodejs18.x when not available', async () => {
// Simulates AL2023 build-container
await expect(getSupportedNodeVersion('18.x', true, [20])).rejects.toThrow(
'Found invalid Node.js Version: "18.x". Please set Node.js Version to 20.x in your Project Settings to use Node.js 20.'
);
});
it('should ignore node version in vercel dev getNodeVersion()', async () => {
@@ -193,7 +215,7 @@ it('should warn when package.json engines is greater than', async () => {
{},
{}
)
).toHaveProperty('range', '18.x');
).toHaveProperty('range', '20.x');
expect(warningMessages).toStrictEqual([
'Warning: Detected "engines": { "node": ">=16" } in your `package.json` that will automatically upgrade when a new major Node.js Version is released. Learn More: http://vercel.link/node-version',
]);
@@ -232,7 +254,17 @@ it('should not warn when package.json engines matches project setting from confi
});
it('should get latest node version', async () => {
expect(getLatestNodeVersion()).toHaveProperty('major', 18);
expect(getLatestNodeVersion()).toHaveProperty('major', 20);
});
it('should get latest node version with Node 18.x in build-container', async () => {
// Simulates AL2 build-container
expect(getLatestNodeVersion([14, 16, 18])).toHaveProperty('major', 18);
});
it('should get latest node version with Node 20.x in build-container', async () => {
// Simulates AL2023 build-container
expect(getLatestNodeVersion([20])).toHaveProperty('major', 20);
});
it('should throw for discontinued versions', async () => {
@@ -300,36 +332,19 @@ it('should warn for deprecated versions, soon to be discontinued', async () => {
16
);
expect(warningMessages).toStrictEqual([
'Error: Node.js version 10.x has reached End-of-Life. Deployments created on or after 2021-04-20 will fail to build. Please set "engines": { "node": "18.x" } in your `package.json` file to use Node.js 18.',
'Error: Node.js version 10.x has reached End-of-Life. Deployments created on or after 2021-04-20 will fail to build. Please set Node.js Version to 18.x in your Project Settings to use Node.js 18.',
'Error: Node.js version 12.x has reached End-of-Life. Deployments created on or after 2022-10-03 will fail to build. Please set "engines": { "node": "18.x" } in your `package.json` file to use Node.js 18.',
'Error: Node.js version 12.x has reached End-of-Life. Deployments created on or after 2022-10-03 will fail to build. Please set Node.js Version to 18.x in your Project Settings to use Node.js 18.',
'Error: Node.js version 14.x has reached End-of-Life. Deployments created on or after 2023-08-15 will fail to build. Please set "engines": { "node": "18.x" } in your `package.json` file to use Node.js 18.',
'Error: Node.js version 14.x has reached End-of-Life. Deployments created on or after 2023-08-15 will fail to build. Please set Node.js Version to 18.x in your Project Settings to use Node.js 18.',
'Error: Node.js version 16.x has reached End-of-Life. Deployments created on or after 2024-02-06 will fail to build. Please set "engines": { "node": "18.x" } in your `package.json` file to use Node.js 18.',
'Error: Node.js version 16.x has reached End-of-Life. Deployments created on or after 2024-02-06 will fail to build. Please set Node.js Version to 18.x in your Project Settings to use Node.js 18.',
'Error: Node.js version 10.x has reached End-of-Life. Deployments created on or after 2021-04-20 will fail to build. Please set "engines": { "node": "20.x" } in your `package.json` file to use Node.js 20.',
'Error: Node.js version 10.x has reached End-of-Life. Deployments created on or after 2021-04-20 will fail to build. Please set Node.js Version to 20.x in your Project Settings to use Node.js 20.',
'Error: Node.js version 12.x has reached End-of-Life. Deployments created on or after 2022-10-03 will fail to build. Please set "engines": { "node": "20.x" } in your `package.json` file to use Node.js 20.',
'Error: Node.js version 12.x has reached End-of-Life. Deployments created on or after 2022-10-03 will fail to build. Please set Node.js Version to 20.x in your Project Settings to use Node.js 20.',
'Error: Node.js version 14.x has reached End-of-Life. Deployments created on or after 2023-08-15 will fail to build. Please set "engines": { "node": "20.x" } in your `package.json` file to use Node.js 20.',
'Error: Node.js version 14.x has reached End-of-Life. Deployments created on or after 2023-08-15 will fail to build. Please set Node.js Version to 20.x in your Project Settings to use Node.js 20.',
'Error: Node.js version 16.x has reached End-of-Life. Deployments created on or after 2024-02-06 will fail to build. Please set "engines": { "node": "20.x" } in your `package.json` file to use Node.js 20.',
'Error: Node.js version 16.x has reached End-of-Life. Deployments created on or after 2024-02-06 will fail to build. Please set Node.js Version to 20.x in your Project Settings to use Node.js 20.',
]);
global.Date.now = realDateNow;
});
it('should only allow nodejs20.x when env var is set', async () => {
try {
expect(getLatestNodeVersion()).toHaveProperty('major', 18);
await expect(getSupportedNodeVersion('20.x')).rejects.toThrow();
process.env.VERCEL_ALLOW_NODEJS20 = '1';
expect(getLatestNodeVersion()).toHaveProperty('major', 20);
expect(await getSupportedNodeVersion('20.x')).toHaveProperty('major', 20);
expect(await getSupportedNodeVersion('20')).toHaveProperty('major', 20);
expect(await getSupportedNodeVersion('20.1.0')).toHaveProperty('major', 20);
expect(await getSupportedNodeVersion('>=16')).toHaveProperty('major', 20);
} finally {
delete process.env.VERCEL_ALLOW_NODEJS20;
}
});
it('should support initialHeaders and initialStatus correctly', async () => {
new Prerender({
expiration: 1,

View File

@@ -1,5 +1,40 @@
# vercel
## 32.5.5
### Patch Changes
- Updated dependencies [[`88da7463c`](https://github.com/vercel/vercel/commit/88da7463ce12df91d49fbde85cb617030d55f558)]:
- @vercel/build-utils@7.2.5
- @vercel/node@3.0.11
- @vercel/static-build@2.0.12
## 32.5.4
### Patch Changes
- Updated dependencies [[`65dec5b7e`](https://github.com/vercel/vercel/commit/65dec5b7e752f4da8fe0ffdb25215170453f6f8b)]:
- @vercel/build-utils@7.2.4
- @vercel/node@3.0.10
- @vercel/static-build@2.0.11
## 32.5.3
### Patch Changes
- Handle `TooManyProjects` error in places where projects are created ([#10807](https://github.com/vercel/vercel/pull/10807))
- Updated dependencies [[`89c1e0323`](https://github.com/vercel/vercel/commit/89c1e032335d9ec0fcfc84fe499cf004fe73fafc), [`fd29b966d`](https://github.com/vercel/vercel/commit/fd29b966d39776318b0e11a53909edb43d1fc5f2)]:
- @vercel/node@3.0.9
- @vercel/next@4.0.14
## 32.5.2
### Patch Changes
- Updated dependencies [[`c94a082f6`](https://github.com/vercel/vercel/commit/c94a082f6bb1b84eaf420ac47ea83640dc83668e)]:
- @vercel/next@4.0.13
## 32.5.1
### Patch Changes

View File

@@ -1,6 +1,6 @@
{
"name": "vercel",
"version": "32.5.1",
"version": "32.5.5",
"preferGlobal": true,
"license": "Apache-2.0",
"description": "The command-line interface for Vercel",
@@ -31,17 +31,17 @@
"node": ">= 16"
},
"dependencies": {
"@vercel/build-utils": "7.2.3",
"@vercel/build-utils": "7.2.5",
"@vercel/fun": "1.1.0",
"@vercel/go": "3.0.3",
"@vercel/hydrogen": "1.0.1",
"@vercel/next": "4.0.12",
"@vercel/node": "3.0.8",
"@vercel/next": "4.0.14",
"@vercel/node": "3.0.11",
"@vercel/python": "4.1.0",
"@vercel/redwood": "2.0.5",
"@vercel/remix-builder": "2.0.11",
"@vercel/ruby": "2.0.2",
"@vercel/static-build": "2.0.10",
"@vercel/static-build": "2.0.12",
"chokidar": "3.3.1"
},
"devDependencies": {
@@ -88,8 +88,8 @@
"@types/yauzl-promise": "2.1.0",
"@vercel-internals/constants": "1.0.4",
"@vercel-internals/get-package-json": "1.0.0",
"@vercel-internals/types": "1.0.14",
"@vercel/client": "13.0.7",
"@vercel-internals/types": "1.0.16",
"@vercel/client": "13.0.9",
"@vercel/error-utils": "2.0.2",
"@vercel/frameworks": "2.0.3",
"@vercel/fs-detectors": "5.1.3",

View File

@@ -3,6 +3,7 @@ import ms from 'ms';
import Client from '../../util/client';
import { isAPIError } from '../../util/errors-ts';
import { getCommandName } from '../../util/pkg-name';
import createProject from '../../util/projects/create-project';
export default async function add(
client: Client,
@@ -32,12 +33,14 @@ export default async function add(
const start = Date.now();
const [name] = args;
try {
await client.fetch('/projects', {
method: 'POST',
body: { name },
});
await createProject(client, { name });
} catch (err: unknown) {
if (isAPIError(err) && err.code === 'too_many_projects') {
output.prettyError(err);
return 1;
}
if (isAPIError(err) && err.status === 409) {
// project already exists, so we can
// show a success message

View File

@@ -21,6 +21,7 @@ import createProject from '../projects/create-project';
import { detectProjects } from '../projects/detect-projects';
import { repoInfoToUrl } from '../git/repo-info-to-url';
import { connectGitProvider, parseRepoUrl } from '../git/connect-git-provider';
import { isAPIError } from '../errors-ts';
const home = homedir();
@@ -283,24 +284,31 @@ export async function ensureRepoLink(
output.spinner(`Creating new Project: ${orgAndName}`);
delete selection.newProject;
if (!selection.rootDirectory) delete selection.rootDirectory;
const project = (selected[i] = await createProject(client, {
...selection,
framework: selection.framework.slug,
}));
await connectGitProvider(
client,
org,
project.id,
parsedRepoUrl.provider,
`${parsedRepoUrl.org}/${parsedRepoUrl.repo}`
);
output.log(
`Created new Project: ${output.link(
orgAndName,
`https://vercel.com/${orgAndName}`,
{ fallback: false }
)}`
);
try {
const project = (selected[i] = await createProject(client, {
...selection,
framework: selection.framework.slug,
}));
await connectGitProvider(
client,
org,
project.id,
parsedRepoUrl.provider,
`${parsedRepoUrl.org}/${parsedRepoUrl.repo}`
);
output.log(
`Created new Project: ${output.link(
orgAndName,
`https://vercel.com/${orgAndName}`,
{ fallback: false }
)}`
);
} catch (err) {
if (isAPIError(err) && err.code === 'too_many_projects') {
output.prettyError(err);
return;
}
}
}
repoConfig = {

View File

@@ -263,6 +263,10 @@ export default async function setupAndLink(
return { status: 'linked', org, project };
} catch (err) {
if (isAPIError(err) && err.code === 'too_many_projects') {
output.prettyError(err);
return { status: 'error', exitCode: 1, reason: 'TOO_MANY_PROJECTS' };
}
handleError(err);
return { status: 'error', exitCode: 1 };

View File

@@ -357,7 +357,7 @@ export function useProject(
pagination: null,
});
});
client.scenario.post(`/projects`, (req, res) => {
client.scenario.post(`/v1/projects`, (req, res) => {
const { name } = req.body;
if (name === project.name) {
res.json(project);

View File

@@ -1,5 +1,19 @@
# @vercel/client
## 13.0.9
### Patch Changes
- Updated dependencies [[`88da7463c`](https://github.com/vercel/vercel/commit/88da7463ce12df91d49fbde85cb617030d55f558)]:
- @vercel/build-utils@7.2.5
## 13.0.8
### Patch Changes
- Updated dependencies [[`65dec5b7e`](https://github.com/vercel/vercel/commit/65dec5b7e752f4da8fe0ffdb25215170453f6f8b)]:
- @vercel/build-utils@7.2.4
## 13.0.7
### Patch Changes

View File

@@ -1,6 +1,6 @@
{
"name": "@vercel/client",
"version": "13.0.7",
"version": "13.0.9",
"main": "dist/index.js",
"typings": "dist/index.d.ts",
"homepage": "https://vercel.com",
@@ -37,7 +37,7 @@
"typescript": "4.9.5"
},
"dependencies": {
"@vercel/build-utils": "7.2.3",
"@vercel/build-utils": "7.2.5",
"@vercel/routing-utils": "3.1.0",
"@zeit/fetch": "5.2.0",
"async-retry": "1.2.3",

View File

@@ -37,7 +37,7 @@
"@types/minimatch": "3.0.5",
"@types/node": "14.18.33",
"@types/semver": "7.3.10",
"@vercel/build-utils": "7.2.3",
"@vercel/build-utils": "7.2.5",
"jest-junit": "16.0.0",
"typescript": "4.9.5"
}

View File

@@ -1,5 +1,19 @@
# @vercel/gatsby-plugin-vercel-builder
## 2.0.11
### Patch Changes
- Updated dependencies [[`88da7463c`](https://github.com/vercel/vercel/commit/88da7463ce12df91d49fbde85cb617030d55f558)]:
- @vercel/build-utils@7.2.5
## 2.0.10
### Patch Changes
- Updated dependencies [[`65dec5b7e`](https://github.com/vercel/vercel/commit/65dec5b7e752f4da8fe0ffdb25215170453f6f8b)]:
- @vercel/build-utils@7.2.4
## 2.0.9
### Patch Changes

View File

@@ -1,6 +1,6 @@
{
"name": "@vercel/gatsby-plugin-vercel-builder",
"version": "2.0.9",
"version": "2.0.11",
"main": "dist/index.js",
"files": [
"dist",
@@ -20,7 +20,7 @@
},
"dependencies": {
"@sinclair/typebox": "0.25.24",
"@vercel/build-utils": "7.2.3",
"@vercel/build-utils": "7.2.5",
"@vercel/routing-utils": "3.1.0",
"esbuild": "0.14.47",
"etag": "1.8.1",

View File

@@ -29,7 +29,7 @@
"@types/node-fetch": "^2.3.0",
"@types/tar": "6.1.5",
"@types/yauzl-promise": "2.1.0",
"@vercel/build-utils": "7.2.3",
"@vercel/build-utils": "7.2.5",
"async-retry": "1.3.3",
"execa": "^1.0.0",
"fs-extra": "^7.0.0",

View File

@@ -26,7 +26,7 @@
"devDependencies": {
"@types/jest": "27.5.1",
"@types/node": "14.18.33",
"@vercel/build-utils": "7.2.3",
"@vercel/build-utils": "7.2.5",
"execa": "3.2.0",
"fs-extra": "11.1.0",
"jest-junit": "16.0.0"

View File

@@ -1,5 +1,17 @@
# @vercel/next
## 4.0.14
### Patch Changes
- Fixed headers for static routes when PPR is enabled ([#10808](https://github.com/vercel/vercel/pull/10808))
## 4.0.13
### Patch Changes
- Added `getRequestHandlerWithMetadata` export ([#10753](https://github.com/vercel/vercel/pull/10753))
## 4.0.12
### Patch Changes

View File

@@ -1,6 +1,6 @@
{
"name": "@vercel/next",
"version": "4.0.12",
"version": "4.0.14",
"license": "Apache-2.0",
"main": "./dist/index",
"homepage": "https://vercel.com/docs/runtimes#official-runtimes/next-js",
@@ -40,7 +40,7 @@
"@types/semver": "6.0.0",
"@types/text-table": "0.2.1",
"@types/webpack-sources": "3.2.0",
"@vercel/build-utils": "7.2.3",
"@vercel/build-utils": "7.2.5",
"@vercel/routing-utils": "3.1.0",
"async-sema": "3.0.1",
"buffer-crc32": "0.2.13",

View File

@@ -1605,6 +1605,7 @@ export const build: BuildV2 = async ({
// internal pages are already referenced in traces for serverless
// like builds
internalPages: [],
experimentalPPRRoutes: undefined,
});
const initialApiLambdaGroups = await getPageLambdaGroups({
@@ -1620,6 +1621,7 @@ export const build: BuildV2 = async ({
initialPseudoLayerUncompressed: 0,
lambdaCompressedByteLimit,
internalPages: [],
experimentalPPRRoutes: undefined,
});
for (const group of initialApiLambdaGroups) {
@@ -2108,6 +2110,7 @@ export const build: BuildV2 = async ({
static404Page,
pageLambdaMap,
lambdas,
experimentalStreamingLambdaPaths: undefined,
isServerMode,
prerenders,
entryDirectory,
@@ -2141,20 +2144,41 @@ export const build: BuildV2 = async ({
[
...Object.entries(prerenderManifest.fallbackRoutes),
...Object.entries(prerenderManifest.blockingFallbackRoutes),
].forEach(([, { dataRouteRegex, dataRoute }]) => {
if (!dataRoute || !dataRouteRegex) return;
].forEach(
([
,
{
dataRouteRegex,
dataRoute,
prefetchDataRouteRegex,
prefetchDataRoute,
},
]) => {
if (!dataRoute || !dataRouteRegex) return;
dataRoutes.push({
// Next.js provided data route regex
src: dataRouteRegex.replace(
/^\^/,
`^${appMountPrefixNoTrailingSlash}`
),
// Location of lambda in builder output
dest: path.posix.join(entryDirectory, dataRoute),
check: true,
});
});
dataRoutes.push({
// Next.js provided data route regex
src: dataRouteRegex.replace(
/^\^/,
`^${appMountPrefixNoTrailingSlash}`
),
// Location of lambda in builder output
dest: path.posix.join(entryDirectory, dataRoute),
check: true,
});
if (!prefetchDataRoute || !prefetchDataRouteRegex) return;
dataRoutes.push({
src: prefetchDataRouteRegex.replace(
/^\^/,
`^${appMountPrefixNoTrailingSlash}`
),
dest: path.posix.join(entryDirectory, prefetchDataRoute),
check: true,
});
}
);
}
}

View File

@@ -14,6 +14,7 @@ import {
Files,
Flag,
BuildResultV2Typical as BuildResult,
NodejsLambda,
} from '@vercel/build-utils';
import { Route, RouteWithHandle } from '@vercel/routing-utils';
import { MAX_AGE_ONE_YEAR } from '.';
@@ -50,6 +51,7 @@ import {
RSC_CONTENT_TYPE,
RSC_PREFETCH_SUFFIX,
normalizePrefetches,
CreateLambdaFromPseudoLayersOptions,
} from './utils';
import {
nodeFileTrace,
@@ -182,6 +184,10 @@ export async function serverBuild({
}
}
const experimental = {
ppr: requiredServerFilesManifest.config.experimental?.ppr === true,
};
let appRscPrefetches: UnwrapPromise<ReturnType<typeof glob>> = {};
let appBuildTraces: UnwrapPromise<ReturnType<typeof glob>> = {};
let appDir: string | null = null;
@@ -189,7 +195,11 @@ export async function serverBuild({
if (appPathRoutesManifest) {
appDir = path.join(pagesDir, '../app');
appBuildTraces = await glob('**/*.js.nft.json', appDir);
appRscPrefetches = await glob(`**/*${RSC_PREFETCH_SUFFIX}`, appDir);
// TODO: maybe?
appRscPrefetches = experimental.ppr
? {}
: await glob(`**/*${RSC_PREFETCH_SUFFIX}`, appDir);
const rscContentTypeHeader =
routesManifest?.rsc?.contentTypeHeader || RSC_CONTENT_TYPE;
@@ -295,6 +305,18 @@ export async function serverBuild({
internalPages.push('404.js');
}
const experimentalPPRRoutes = new Set<string>();
for (const [route, { experimentalPPR }] of [
...Object.entries(prerenderManifest.staticRoutes),
...Object.entries(prerenderManifest.blockingFallbackRoutes),
...Object.entries(prerenderManifest.fallbackRoutes),
]) {
if (!experimentalPPR) continue;
experimentalPPRRoutes.add(route);
}
const prerenderRoutes = new Set<string>([
...(canUsePreviewMode ? omittedPrerenderRoutes : []),
...Object.keys(prerenderManifest.blockingFallbackRoutes),
@@ -305,6 +327,8 @@ export async function serverBuild({
}),
]);
const experimentalStreamingLambdaPaths = new Map<string, string>();
if (hasLambdas) {
const initialTracingLabel = 'Traced Next.js server files in';
@@ -619,8 +643,8 @@ export async function serverBuild({
);
let launcher = launcherData
.replace(
'conf: __NEXT_CONFIG__',
`conf: ${JSON.stringify({
'const conf = __NEXT_CONFIG__',
`const conf = ${JSON.stringify({
...requiredServerFilesManifest.config,
distDir: path.relative(
projectDir,
@@ -823,6 +847,7 @@ export async function serverBuild({
prerenderRoutes,
pageTraces,
compressedPages,
experimentalPPRRoutes: undefined,
tracedPseudoLayer: tracedPseudoLayer.pseudoLayer,
initialPseudoLayer,
lambdaCompressedByteLimit,
@@ -843,6 +868,7 @@ export async function serverBuild({
prerenderRoutes,
pageTraces,
compressedPages,
experimentalPPRRoutes,
tracedPseudoLayer: tracedPseudoLayer.pseudoLayer,
initialPseudoLayer,
lambdaCompressedByteLimit,
@@ -859,6 +885,7 @@ export async function serverBuild({
prerenderRoutes,
pageTraces,
compressedPages,
experimentalPPRRoutes: undefined,
tracedPseudoLayer: tracedPseudoLayer.pseudoLayer,
initialPseudoLayer,
lambdaCompressedByteLimit,
@@ -868,7 +895,7 @@ export async function serverBuild({
});
for (const group of appRouterLambdaGroups) {
if (!group.isPrerenders) {
if (!group.isPrerenders || group.isExperimentalPPR) {
group.isStreaming = true;
}
group.isAppRouter = true;
@@ -890,6 +917,7 @@ export async function serverBuild({
prerenderRoutes,
pageTraces,
compressedPages,
experimentalPPRRoutes: undefined,
tracedPseudoLayer: tracedPseudoLayer.pseudoLayer,
initialPseudoLayer,
initialPseudoLayerUncompressed: uncompressedInitialSize,
@@ -1025,7 +1053,7 @@ export async function serverBuild({
};
const operationType = getOperationType({ group, prerenderManifest });
const lambda = await createLambdaFromPseudoLayers({
const options: CreateLambdaFromPseudoLayersOptions = {
files: {
...launcherFiles,
...updatedManifestFiles,
@@ -1041,7 +1069,30 @@ export async function serverBuild({
maxDuration: group.maxDuration,
isStreaming: group.isStreaming,
nextVersion,
});
};
const lambda = await createLambdaFromPseudoLayers(options);
// This is a PPR lambda if it's an App Page with the PPR experimental flag
// enabled.
const isPPR =
experimental.ppr && group.isAppRouter && !group.isAppRouteHandler;
// If PPR is enabled and this is an App Page, create the non-streaming
// lambda for the page for revalidation.
let revalidate: NodejsLambda | undefined;
if (isPPR) {
if (isPPR && !options.isStreaming) {
throw new Error("Invariant: PPR lambda isn't streaming");
}
// Create the non-streaming version of the same Lambda, this will be
// used for revalidation.
revalidate = await createLambdaFromPseudoLayers({
...options,
isStreaming: false,
});
}
for (const page of group.pages) {
const pageNoExt = page.replace(/\.js$/, '');
@@ -1057,11 +1108,35 @@ export async function serverBuild({
});
}
const outputName = normalizeIndexOutput(
let outputName = normalizeIndexOutput(
path.posix.join(entryDirectory, pageNoExt),
true
);
// If this is a PPR page, then we should prefix the output name.
if (isPPR) {
if (!revalidate) {
throw new Error("Invariant: PPR lambda isn't set");
}
// Get the get the base path prefixed route, without the index
// normalization.
outputName = path.posix.join(entryDirectory, pageNoExt);
lambdas[outputName] = revalidate;
const pprOutputName = path.posix.join(
entryDirectory,
'/_next/postponed/resume',
pageNoExt
);
lambdas[pprOutputName] = lambda;
// We want to add the `experimentalStreamingLambdaPath` to this
// output.
experimentalStreamingLambdaPaths.set(outputName, pprOutputName);
continue;
}
// we add locale prefixed outputs for SSR pages,
// this is handled in onPrerenderRoute for SSG pages
if (
@@ -1096,6 +1171,7 @@ export async function serverBuild({
pagesDir,
pageLambdaMap: {},
lambdas,
experimentalStreamingLambdaPaths,
prerenders,
entryDirectory,
routesManifest,
@@ -1111,23 +1187,32 @@ export async function serverBuild({
isEmptyAllowQueryForPrendered,
});
Object.keys(prerenderManifest.staticRoutes).forEach(route =>
prerenderRoute(route, {})
await Promise.all(
Object.keys(prerenderManifest.staticRoutes).map(route =>
prerenderRoute(route, {})
)
);
Object.keys(prerenderManifest.fallbackRoutes).forEach(route =>
prerenderRoute(route, { isFallback: true })
await Promise.all(
Object.keys(prerenderManifest.fallbackRoutes).map(route =>
prerenderRoute(route, { isFallback: true })
)
);
Object.keys(prerenderManifest.blockingFallbackRoutes).forEach(route =>
prerenderRoute(route, { isBlocking: true })
await Promise.all(
Object.keys(prerenderManifest.blockingFallbackRoutes).map(route =>
prerenderRoute(route, { isBlocking: true })
)
);
if (static404Page && canUsePreviewMode) {
omittedPrerenderRoutes.forEach(route => {
prerenderRoute(route, { isOmitted: true });
});
await Promise.all(
[...omittedPrerenderRoutes].map(route => {
return prerenderRoute(route, { isOmitted: true });
})
);
}
prerenderRoutes.forEach(route => {
if (experimentalPPRRoutes.has(route)) return;
if (routesManifest?.i18n) {
route = normalizeLocalePath(route, routesManifest.i18n.locales).pathname;
}
@@ -1164,7 +1249,8 @@ export async function serverBuild({
canUsePreviewMode,
prerenderManifest.bypassToken || '',
true,
middleware.dynamicRouteMap
middleware.dynamicRouteMap,
experimental.ppr
).then(arr =>
localizeDynamicRoutes(
arr,
@@ -1329,22 +1415,24 @@ export async function serverBuild({
// __rsc__ header is present
const edgeFunctions = middleware.edgeFunctions;
for (let route of Object.values(appPathRoutesManifest)) {
for (const route of Object.values(appPathRoutesManifest)) {
const ogRoute = inversedAppPathManifest[route];
if (ogRoute.endsWith('/route')) {
continue;
}
route = normalizeIndexOutput(
const pathname = normalizeIndexOutput(
path.posix.join('./', entryDirectory, route === '/' ? '/index' : route),
true
);
if (lambdas[route]) {
lambdas[`${route}.rsc`] = lambdas[route];
if (lambdas[pathname]) {
lambdas[`${pathname}.rsc`] = lambdas[pathname];
}
if (edgeFunctions[route]) {
edgeFunctions[`${route}.rsc`] = edgeFunctions[route];
if (edgeFunctions[pathname]) {
edgeFunctions[`${pathname}.rsc`] = edgeFunctions[pathname];
}
}
}
@@ -1364,6 +1452,10 @@ export async function serverBuild({
}))
: [];
if (experimental.ppr && !rscPrefetchHeader) {
throw new Error("Invariant: cannot use PPR without 'rsc.prefetchHeader'");
}
return {
wildcard: wildcardConfig,
images: getImagesConfig(imagesManifest),
@@ -1718,7 +1810,7 @@ export async function serverBuild({
]
: []),
...(rscPrefetchHeader
...(rscPrefetchHeader && !experimental.ppr
? [
{
src: path.posix.join(
@@ -1742,7 +1834,11 @@ export async function serverBuild({
entryDirectory,
`/(.+?)${RSC_PREFETCH_SUFFIX}(?:/)?$`
)}`,
dest: path.posix.join('/', entryDirectory, '/$1.rsc'),
dest: path.posix.join(
'/',
entryDirectory,
`/$1${experimental.ppr ? RSC_PREFETCH_SUFFIX : '.rsc'}`
),
has: [
{
type: 'header',
@@ -1955,8 +2051,6 @@ export async function serverBuild({
important: true,
},
// TODO: remove below workaround when `/` is allowed to be output
// different than `/index`
{
src: path.posix.join('/', entryDirectory, '/index'),
headers: {

View File

@@ -22,25 +22,44 @@ if (process.env.NODE_ENV !== 'production' && region !== 'dev1') {
// eslint-disable-next-line
const NextServer = require('__NEXT_SERVER_PATH__').default;
// __NEXT_CONFIG__ value is injected
declare const __NEXT_CONFIG__: any;
const conf = __NEXT_CONFIG__;
const nextServer = new NextServer({
// @ts-ignore __NEXT_CONFIG__ value is injected
conf: __NEXT_CONFIG__,
conf,
dir: '.',
minimalMode: true,
customServer: false,
});
const requestHandler = nextServer.getRequestHandler();
// Returns a wrapped handler that will crash the lambda if an error isn't
// caught.
const serve =
(handler: any) => async (req: IncomingMessage, res: ServerResponse) => {
try {
// @preserve entryDirectory handler
await handler(req, res);
} catch (err) {
console.error(err);
// crash the lambda immediately to clean up any bad module state,
// this was previously handled in ___vc_bridge on an unhandled rejection
// but we can do this quicker by triggering here
process.exit(1);
}
};
module.exports = async (req: IncomingMessage, res: ServerResponse) => {
try {
// @preserve entryDirectory handler
await requestHandler(req, res);
} catch (err) {
console.error(err);
// crash the lambda immediately to clean up any bad module state,
// this was previously handled in ___vc_bridge on an unhandled rejection
// but we can do this quicker by triggering here
process.exit(1);
}
};
// The default handler method should be exported as a function on the module.
module.exports = serve(nextServer.getRequestHandler());
// If available, add `getRequestHandlerWithMetadata` to the export if it's
// required by the configuration.
if (
conf.experimental?.ppr &&
'getRequestHandlerWithMetadata' in nextServer &&
typeof nextServer.getRequestHandlerWithMetadata === 'function'
) {
module.exports.getRequestHandlerWithMetadata = (metadata: any) =>
serve(nextServer.getRequestHandlerWithMetadata(metadata));
}

View File

@@ -15,6 +15,7 @@ import {
NodejsLambda,
EdgeFunction,
Images,
File,
} from '@vercel/build-utils';
import { NodeFileTraceReasons } from '@vercel/nft';
import type {
@@ -244,6 +245,7 @@ type RoutesManifestOld = {
header: string;
varyHeader: string;
prefetchHeader?: string;
didPostponeHeader?: string;
contentTypeHeader: string;
};
skipMiddlewareUrlNormalize?: boolean;
@@ -312,7 +314,8 @@ export async function getDynamicRoutes(
canUsePreviewMode?: boolean,
bypassToken?: string,
isServerMode?: boolean,
dynamicMiddlewareRouteMap?: Map<string, RouteWithSrc>
dynamicMiddlewareRouteMap?: Map<string, RouteWithSrc>,
experimentalPPR?: boolean
): Promise<RouteWithSrc[]> {
if (routesManifest) {
switch (routesManifest.version) {
@@ -385,6 +388,24 @@ export async function getDynamicRoutes(
},
];
}
if (experimentalPPR) {
let dest = route.dest?.replace(/($|\?)/, '.prefetch.rsc$1');
if (page === '/' || page === '/index') {
dest = dest?.replace(/([^/]+\.prefetch\.rsc(\?.*|$))/, '__$1');
}
routes.push({
...route,
src: route.src.replace(
new RegExp(escapeStringRegexp('(?:/)?$')),
'(?:\\.prefetch\\.rsc)(?:/)?$'
),
dest,
});
}
routes.push({
...route,
src: route.src.replace(
@@ -395,8 +416,8 @@ export async function getDynamicRoutes(
});
routes.push(route);
continue;
}
return routes;
}
default: {
@@ -778,7 +799,8 @@ export async function createPseudoLayer(files: {
return { pseudoLayer, pseudoLayerBytes };
}
interface CreateLambdaFromPseudoLayersOptions extends LambdaOptionsWithFiles {
export interface CreateLambdaFromPseudoLayersOptions
extends LambdaOptionsWithFiles {
layers: PseudoLayer[];
isStreaming?: boolean;
nextVersion?: string;
@@ -858,10 +880,12 @@ export type NextPrerenderedRoutes = {
[route: string]: {
initialRevalidate: number | false;
dataRoute: string | null;
prefetchDataRoute?: string | null;
srcRoute: string | null;
initialStatus?: number;
initialHeaders?: Record<string, string>;
experimentalBypassFor?: HasField;
experimentalPPR?: boolean;
};
};
@@ -870,7 +894,10 @@ export type NextPrerenderedRoutes = {
routeRegex: string;
dataRoute: string | null;
dataRouteRegex: string | null;
prefetchDataRoute?: string | null;
prefetchDataRouteRegex?: string | null;
experimentalBypassFor?: HasField;
experimentalPPR?: boolean;
};
};
@@ -880,7 +907,10 @@ export type NextPrerenderedRoutes = {
routeRegex: string;
dataRoute: string | null;
dataRouteRegex: string | null;
prefetchDataRoute?: string | null;
prefetchDataRouteRegex?: string | null;
experimentalBypassFor?: HasField;
experimentalPPR?: boolean;
};
};
@@ -889,7 +919,10 @@ export type NextPrerenderedRoutes = {
routeRegex: string;
dataRoute: string | null;
dataRouteRegex: string | null;
prefetchDataRoute: string | null | undefined;
prefetchDataRouteRegex: string | null | undefined;
experimentalBypassFor?: HasField;
experimentalPPR?: boolean;
};
};
@@ -1091,9 +1124,11 @@ export async function getPrerenderManifest(
initialRevalidateSeconds: number | false;
srcRoute: string | null;
dataRoute: string | null;
prefetchDataRoute: string | null | undefined;
initialStatus?: number;
initialHeaders?: Record<string, string>;
experimentalBypassFor?: HasField;
experimentalPPR?: boolean;
};
};
dynamicRoutes: {
@@ -1102,7 +1137,10 @@ export async function getPrerenderManifest(
fallback: string | false;
dataRoute: string | null;
dataRouteRegex: string | null;
prefetchDataRoute: string | null | undefined;
prefetchDataRouteRegex: string | null | undefined;
experimentalBypassFor?: HasField;
experimentalPPR?: boolean;
};
};
preview: {
@@ -1189,11 +1227,15 @@ export async function getPrerenderManifest(
let initialStatus: undefined | number;
let initialHeaders: undefined | Record<string, string>;
let experimentalBypassFor: undefined | HasField;
let experimentalPPR: undefined | boolean;
let prefetchDataRoute: undefined | string | null;
if (manifest.version === 4) {
initialStatus = manifest.routes[route].initialStatus;
initialHeaders = manifest.routes[route].initialHeaders;
experimentalBypassFor = manifest.routes[route].experimentalBypassFor;
experimentalPPR = manifest.routes[route].experimentalPPR;
prefetchDataRoute = manifest.routes[route].prefetchDataRoute;
}
ret.staticRoutes[route] = {
@@ -1202,10 +1244,12 @@ export async function getPrerenderManifest(
? false
: Math.max(1, initialRevalidateSeconds),
dataRoute,
prefetchDataRoute,
srcRoute,
initialStatus,
initialHeaders,
experimentalBypassFor,
experimentalPPR,
};
});
@@ -1213,35 +1257,52 @@ export async function getPrerenderManifest(
const { routeRegex, fallback, dataRoute, dataRouteRegex } =
manifest.dynamicRoutes[lazyRoute];
let experimentalBypassFor: undefined | HasField;
let experimentalPPR: undefined | boolean;
let prefetchDataRoute: undefined | string | null;
let prefetchDataRouteRegex: undefined | string | null;
if (manifest.version === 4) {
experimentalBypassFor =
manifest.dynamicRoutes[lazyRoute].experimentalBypassFor;
experimentalPPR = manifest.dynamicRoutes[lazyRoute].experimentalPPR;
prefetchDataRoute =
manifest.dynamicRoutes[lazyRoute].prefetchDataRoute;
prefetchDataRouteRegex =
manifest.dynamicRoutes[lazyRoute].prefetchDataRouteRegex;
}
if (typeof fallback === 'string') {
ret.fallbackRoutes[lazyRoute] = {
experimentalBypassFor,
experimentalPPR,
routeRegex,
fallback,
dataRoute,
dataRouteRegex,
prefetchDataRoute,
prefetchDataRouteRegex,
};
} else if (fallback === null) {
ret.blockingFallbackRoutes[lazyRoute] = {
experimentalBypassFor,
experimentalPPR,
routeRegex,
dataRoute,
dataRouteRegex,
prefetchDataRoute,
prefetchDataRouteRegex,
};
} else {
// Fallback behavior is disabled, all routes would've been provided
// in the top-level `routes` key (`staticRoutes`).
ret.omittedRoutes[lazyRoute] = {
experimentalBypassFor,
experimentalPPR,
routeRegex,
dataRoute,
dataRouteRegex,
prefetchDataRoute,
prefetchDataRouteRegex,
};
}
});
@@ -1417,6 +1478,7 @@ export type LambdaGroup = {
isAppRouteHandler?: boolean;
isStreaming?: boolean;
isPrerenders?: boolean;
isExperimentalPPR?: boolean;
isPages?: boolean;
isApiLambda: boolean;
pseudoLayer: PseudoLayer;
@@ -1430,6 +1492,7 @@ export async function getPageLambdaGroups({
functionsConfigManifest,
pages,
prerenderRoutes,
experimentalPPRRoutes,
pageTraces,
compressedPages,
tracedPseudoLayer,
@@ -1444,6 +1507,7 @@ export async function getPageLambdaGroups({
functionsConfigManifest?: FunctionsConfigManifestV1;
pages: string[];
prerenderRoutes: Set<string>;
experimentalPPRRoutes: Set<string> | undefined;
pageTraces: {
[page: string]: {
[key: string]: FileFsRef;
@@ -1465,6 +1529,7 @@ export async function getPageLambdaGroups({
const newPages = [...internalPages, page];
const routeName = normalizePage(page.replace(/\.js$/, ''));
const isPrerenderRoute = prerenderRoutes.has(routeName);
const isExperimentalPPR = experimentalPPRRoutes?.has(routeName) ?? false;
let opts: { memory?: number; maxDuration?: number } = {};
@@ -1494,7 +1559,8 @@ export async function getPageLambdaGroups({
const matches =
group.maxDuration === opts.maxDuration &&
group.memory === opts.memory &&
group.isPrerenders === isPrerenderRoute;
group.isPrerenders === isPrerenderRoute &&
group.isExperimentalPPR === isExperimentalPPR;
if (matches) {
let newTracedFilesSize = group.pseudoLayerBytes;
@@ -1533,6 +1599,7 @@ export async function getPageLambdaGroups({
pages: [page],
...opts,
isPrerenders: isPrerenderRoute,
isExperimentalPPR,
isApiLambda: !!isApiPage(page),
pseudoLayerBytes: initialPseudoLayer.pseudoLayerBytes,
pseudoLayerUncompressedBytes: initialPseudoLayerUncompressed,
@@ -1831,7 +1898,8 @@ type OnPrerenderRouteArgs = {
isServerMode: boolean;
canUsePreviewMode: boolean;
lambdas: { [key: string]: Lambda };
prerenders: { [key: string]: Prerender | FileFsRef };
experimentalStreamingLambdaPaths: Map<string, string> | undefined;
prerenders: { [key: string]: Prerender | File };
pageLambdaMap: { [key: string]: string };
routesManifest?: RoutesManifest;
isCorrectNotFoundRoutes?: boolean;
@@ -1841,7 +1909,7 @@ let prerenderGroup = 1;
export const onPrerenderRoute =
(prerenderRouteArgs: OnPrerenderRouteArgs) =>
(
async (
routeKey: string,
{
isBlocking,
@@ -1866,6 +1934,7 @@ export const onPrerenderRoute =
isServerMode,
canUsePreviewMode,
lambdas,
experimentalStreamingLambdaPaths,
prerenders,
pageLambdaMap,
routesManifest,
@@ -1926,9 +1995,11 @@ export const onPrerenderRoute =
let initialRevalidate: false | number;
let srcRoute: string | null;
let dataRoute: string | null;
let prefetchDataRoute: string | null | undefined;
let initialStatus: number | undefined;
let initialHeaders: Record<string, string> | undefined;
let experimentalBypassFor: HasField | undefined;
let experimentalPPR: boolean | undefined;
if (isFallback || isBlocking) {
const pr = isFallback
@@ -1946,12 +2017,18 @@ export const onPrerenderRoute =
srcRoute = null;
dataRoute = pr.dataRoute;
experimentalBypassFor = pr.experimentalBypassFor;
experimentalPPR = pr.experimentalPPR;
prefetchDataRoute = pr.prefetchDataRoute;
} else if (isOmitted) {
initialRevalidate = false;
srcRoute = routeKey;
dataRoute = prerenderManifest.omittedRoutes[routeKey].dataRoute;
experimentalBypassFor =
prerenderManifest.omittedRoutes[routeKey].experimentalBypassFor;
experimentalPPR =
prerenderManifest.omittedRoutes[routeKey].experimentalPPR;
prefetchDataRoute =
prerenderManifest.omittedRoutes[routeKey].prefetchDataRoute;
} else {
const pr = prerenderManifest.staticRoutes[routeKey];
({
@@ -1961,19 +2038,71 @@ export const onPrerenderRoute =
initialHeaders,
initialStatus,
experimentalBypassFor,
experimentalPPR,
prefetchDataRoute,
} = pr);
}
let isAppPathRoute = false;
// TODO: leverage manifest to determine app paths more accurately
if (appDir && srcRoute && (!dataRoute || dataRoute?.endsWith('.rsc'))) {
isAppPathRoute = true;
}
const isOmittedOrNotFound = isOmitted || isNotFound;
let htmlFsRef: FileFsRef | null;
let htmlFsRef: File | null;
if (appDir && !dataRoute && isAppPathRoute && !(isBlocking || isFallback)) {
// If enabled, try to get the postponed route information from the file
// system and use it to assemble the prerender.
let prerender: string | undefined;
if (experimentalPPR && appDir) {
const htmlPath = path.join(appDir, `${routeFileNoExt}.html`);
const metaPath = path.join(appDir, `${routeFileNoExt}.meta`);
if (fs.existsSync(htmlPath) && fs.existsSync(metaPath)) {
const meta = JSON.parse(await fs.readFile(metaPath, 'utf8'));
if ('postponed' in meta && typeof meta.postponed === 'string') {
prerender = meta.postponed;
// Assign the headers Content-Type header to the prerendered type.
initialHeaders ??= {};
initialHeaders[
'content-type'
] = `application/x-nextjs-pre-render; state-length=${meta.postponed.length}`;
// Read the HTML file and append it to the prerendered content.
const html = await fs.readFileSync(htmlPath, 'utf8');
prerender += html;
}
}
if (!dataRoute?.endsWith('.rsc')) {
throw new Error(
`Invariant: unexpected output path for ${dataRoute} and PPR`
);
}
if (!prefetchDataRoute?.endsWith('.prefetch.rsc')) {
throw new Error(
`Invariant: unexpected output path for ${prefetchDataRoute} and PPR`
);
}
}
if (prerender) {
const contentType = initialHeaders?.['content-type'];
if (!contentType) {
throw new Error("Invariant: contentType can't be undefined");
}
// Assemble the prerendered file.
htmlFsRef = new FileBlob({ contentType, data: prerender });
} else if (
appDir &&
!dataRoute &&
isAppPathRoute &&
!(isBlocking || isFallback)
) {
const contentType = initialHeaders?.['content-type'];
htmlFsRef = new FileFsRef({
fsPath: path.join(appDir, `${routeFileNoExt}.body`),
@@ -2023,7 +2152,7 @@ export const onPrerenderRoute =
? addLocaleOrDefault('/404.html', routesManifest, locale)
: '/404.html'
: isAppPathRoute
? dataRoute
? prefetchDataRoute || dataRoute
: routeFileNoExt + '.json'
}`
),
@@ -2054,13 +2183,12 @@ export const onPrerenderRoute =
);
let lambda: undefined | Lambda;
let outputPathData: null | string = null;
if (dataRoute) {
outputPathData = path.posix.join(entryDirectory, dataRoute);
function normalizeDataRoute(route: string) {
let normalized = path.posix.join(entryDirectory, route);
if (nonDynamicSsg || isFallback || isOmitted) {
outputPathData = outputPathData.replace(
normalized = normalized.replace(
new RegExp(`${escapeStringRegexp(origRouteFileNoExt)}.json$`),
// ensure we escape "$" correctly while replacing as "$" is a special
// character, we need to do double escaping as first is for the initial
@@ -2068,8 +2196,32 @@ export const onPrerenderRoute =
`${routeFileNoExt.replace(/\$/g, '$$$$')}.json`
);
}
return normalized;
}
let outputPathData: null | string = null;
if (dataRoute) {
outputPathData = normalizeDataRoute(dataRoute);
}
let outputPathPrefetchData: null | string = null;
if (prefetchDataRoute) {
if (!experimentalPPR) {
throw new Error(
"Invariant: prefetchDataRoute can't be set without PPR"
);
}
outputPathPrefetchData = normalizeDataRoute(prefetchDataRoute);
} else if (experimentalPPR) {
throw new Error('Invariant: expected to find prefetch data route PPR');
}
// When the prefetch data path is available, use it for the prerender,
// otherwise use the data path.
const outputPrerenderPathData = outputPathPrefetchData || outputPathData;
if (isSharedLambdas) {
const outputSrcPathPage = normalizeIndexOutput(
path.join(
@@ -2117,8 +2269,8 @@ export const onPrerenderRoute =
htmlFsRef.contentType = htmlContentType;
prerenders[outputPathPage] = htmlFsRef;
if (outputPathData) {
prerenders[outputPathData] = jsonFsRef;
if (outputPrerenderPathData) {
prerenders[outputPrerenderPathData] = jsonFsRef;
}
}
}
@@ -2188,12 +2340,32 @@ export const onPrerenderRoute =
'RSC, Next-Router-State-Tree, Next-Router-Prefetch';
const rscContentTypeHeader =
routesManifest?.rsc?.contentTypeHeader || RSC_CONTENT_TYPE;
const rscDidPostponeHeader = routesManifest?.rsc?.didPostponeHeader;
let sourcePath: string | undefined;
if (`/${outputPathPage}` !== srcRoute && srcRoute) {
sourcePath = srcRoute;
}
// The `experimentalStreamingLambdaPaths` stores the page without the
// leading `/` and with the `/` rewritten to be `index`. We should
// normalize the key so that it matches that key in the map.
let key = srcRoute || routeKey;
if (key === '/') {
key = 'index';
} else {
if (!key.startsWith('/')) {
throw new Error("Invariant: key doesn't start with /");
}
key = key.substring(1);
}
key = path.posix.join(entryDirectory, key);
const experimentalStreamingLambdaPath =
experimentalStreamingLambdaPaths?.get(key);
prerenders[outputPathPage] = new Prerender({
expiration: initialRevalidate,
lambda,
@@ -2205,6 +2377,7 @@ export const onPrerenderRoute =
initialStatus,
initialHeaders,
sourcePath,
experimentalStreamingLambdaPath,
...(isNotFound
? {
@@ -2222,8 +2395,21 @@ export const onPrerenderRoute =
: {}),
});
if (outputPathData) {
prerenders[outputPathData] = new Prerender({
if (outputPrerenderPathData) {
let normalizedPathData = outputPrerenderPathData;
if (
(srcRoute === '/' || srcRoute == '/index') &&
outputPrerenderPathData.endsWith(RSC_PREFETCH_SUFFIX)
) {
delete lambdas[normalizedPathData];
normalizedPathData = normalizedPathData.replace(
/([^/]+\.prefetch\.rsc)$/,
'__$1'
);
}
prerenders[normalizedPathData] = new Prerender({
expiration: initialRevalidate,
lambda,
allowQuery,
@@ -2243,6 +2429,10 @@ export const onPrerenderRoute =
initialHeaders: {
'content-type': rscContentTypeHeader,
vary: rscVaryHeader,
// If it contains a pre-render, then it was postponed.
...(prerender && rscDidPostponeHeader
? { [rscDidPostponeHeader]: '1' }
: {}),
},
}
: {}),

View File

@@ -11,7 +11,7 @@ describe(`${__dirname.split(path.sep).pop()}`, () => {
afterAll(() => fs.remove(fixtureDir));
it('should deploy and pass probe checks', async () => {
await fs.copy(path.join(__dirname, '../00-app-dir'), fixtureDir);
await fs.copy(path.join(__dirname, '../00-app-dir-no-ppr'), fixtureDir);
const nextConfigPath = path.join(fixtureDir, 'next.config.js');
await fs.writeFile(

View File

@@ -0,0 +1,8 @@
{
"dependencies": {
"next": "13.5.6",
"react": "experimental",
"react-dom": "experimental"
},
"ignoreNextjsUpdates": true
}

View File

@@ -0,0 +1,13 @@
import React, { Suspense } from 'react'
import { Dynamic } from '../../../components/dynamic'
export const dynamic = 'force-dynamic'
export const revalidate = 60
export default ({ params: { slug } }) => {
return (
<Suspense fallback={<Dynamic pathname={`/dynamic/force-dynamic/${slug}`} fallback />}>
<Dynamic pathname={`/dynamic/force-dynamic/${slug}`} />
</Suspense>
)
}

View File

@@ -0,0 +1,13 @@
import React, { Suspense } from 'react'
import { Dynamic } from '../../../components/dynamic'
export const dynamic = 'force-static'
export const revalidate = 60
export default ({ params: { slug } }) => {
return (
<Suspense fallback={<Dynamic pathname={`/dynamic/force-static/${slug}`} fallback />}>
<Dynamic pathname={`/dynamic/force-static/${slug}`} />
</Suspense>
)
}

View File

@@ -0,0 +1,19 @@
import { Links } from '../components/links';
export default ({ children }) => {
return (
<html>
<body>
<h1>Partial Prerendering</h1>
<p>
Below are links that are associated with different pages that all will
partially prerender
</p>
<aside>
<Links />
</aside>
<main>{children}</main>
</body>
</html>
);
};

View File

@@ -0,0 +1,16 @@
import React, { Suspense } from 'react'
import { Dynamic } from '../../../components/dynamic'
export const revalidate = 60
export default ({ params: { slug } }) => {
return (
<Suspense fallback={<Dynamic pathname={`/loading/${slug}`} fallback />}>
<Dynamic pathname={`/loading/${slug}`} />
</Suspense>
)
}
export const generateStaticParams = async () => {
return [{ slug: 'a' }]
}

View File

@@ -0,0 +1,3 @@
export default () => {
return <code>loading.jsx</code>
}

View File

@@ -0,0 +1,16 @@
import React, { Suspense } from 'react'
import { Dynamic } from '../../../components/dynamic'
export const revalidate = 60
export default ({ params: { slug } }) => {
return (
<Suspense fallback={<Dynamic pathname={`/nested/${slug}`} fallback />}>
<Dynamic pathname={`/nested/${slug}`} />
</Suspense>
)
}
export const generateStaticParams = async () => {
return [{ slug: 'a' }]
}

View File

@@ -0,0 +1,10 @@
import React from 'react'
import { Dynamic } from '../../../../components/dynamic'
export default ({ params: { slug } }) => {
return <Dynamic pathname={`/no-suspense/nested/${slug}`} />
}
export const generateStaticParams = async () => {
return [{ slug: 'a' }]
}

View File

@@ -0,0 +1,6 @@
import React from 'react'
import { Dynamic } from '../../components/dynamic'
export default () => {
return <Dynamic pathname="/no-suspense" />
}

View File

@@ -0,0 +1,10 @@
import React, { Suspense } from 'react'
import { Dynamic } from '../../../components/dynamic'
export default ({ params: { slug } }) => {
return (
<Suspense fallback={<Dynamic pathname={`/on-demand/${slug}`} fallback />}>
<Dynamic pathname={`/on-demand/${slug}`} />
</Suspense>
)
}

Some files were not shown because too many files have changed in this diff.