Compare commits

...

36 Commits

Author SHA1 Message Date
Leo Lamprecht
34f4222ca2 Publish Canary
- @vercel/build-utils@2.12.3-canary.34
 - vercel@23.1.3-canary.56
 - @vercel/client@10.2.3-canary.35
 - vercel-plugin-middleware@0.0.0-canary.10
 - vercel-plugin-go@1.0.0-canary.22
 - vercel-plugin-node@1.12.2-canary.26
 - vercel-plugin-python@1.0.0-canary.23
 - vercel-plugin-ruby@1.0.0-canary.21
2021-12-03 20:15:25 +01:00
Leo Lamprecht
5de045edd7 Use correct Lambda handler for CLI Plugins (#7128)
* Use correct Lambda handler for CLI Plugins

* Tweaked comment

* Fixed tests
2021-12-03 20:15:05 +01:00
Leo Lamprecht
5efd3b98de Publish Canary
- @vercel/build-utils@2.12.3-canary.33
 - vercel@23.1.3-canary.55
 - @vercel/client@10.2.3-canary.34
 - vercel-plugin-middleware@0.0.0-canary.9
 - vercel-plugin-go@1.0.0-canary.21
 - vercel-plugin-node@1.12.2-canary.25
 - vercel-plugin-python@1.0.0-canary.22
 - vercel-plugin-ruby@1.0.0-canary.20
2021-12-03 19:10:29 +01:00
Leo Lamprecht
82c83312c7 Fixed Legacy Runtime output generation (#7127)
* Corrected CLI Plugin output creation

* Wait for the linking to be done before removing anything

* Make it work

* Renamed variables

* Made everything work

* Removed debugging

* Fixed comment typos
2021-12-03 19:09:59 +01:00
Andy Bitz
5ccb983007 Publish Canary
- vercel@23.1.3-canary.54
 - vercel-plugin-middleware@0.0.0-canary.8
2021-12-03 12:12:03 +01:00
Steven
7a921399be [middleware] Fix dependencies (#7125) 2021-12-02 20:38:07 -05:00
Andy
3900f2f982 [cli] Support for next export with vercel build (#7122)
* [cli] Support `next export` in `vercel build`

* Add debug line

* Remove unused import

* Ensure file is copied

* Return in `getNextExportStatus`

* Include dotNextDir

Co-authored-by: Steven <steven@ceriously.com>
2021-12-02 20:31:17 -05:00
Steven
09939f1e07 Update github codeowners (#7123)
* Add gary and javi

* Remove timothy and coetry

* Remove rdev

* Add jaredpalmer

Co-authored-by: Nathan Rajlich <n@n8.io>
2021-12-02 19:20:18 -05:00
Leo Lamprecht
fc3a3ca81f Use Functions Manifest for Middleware (#7119)
### Related Issues

This fixes https://github.com/vercel/runtimes/issues/299.

#### Tests

- [x] The code changed/added as part of this PR has been covered with tests
- [x] All tests pass locally with `yarn test-unit`

#### Code Review

- [x] This PR has a concise title and thorough description useful to a reviewer
- [x] Issue from task tracker has a link to this PR
2021-12-03 00:12:01 +00:00
Leo Lamprecht
ba7bf2e4a6 Publish Canary
- @vercel/build-utils@2.12.3-canary.32
 - vercel@23.1.3-canary.53
 - @vercel/client@10.2.3-canary.33
 - vercel-plugin-go@1.0.0-canary.20
 - vercel-plugin-node@1.12.2-canary.24
 - vercel-plugin-python@1.0.0-canary.21
 - vercel-plugin-ruby@1.0.0-canary.19
2021-12-02 23:51:51 +01:00
Leo Lamprecht
00641037fc Corrected input paths for CLI Plugins (#7121)
* Use correct paths for outputs

* Fixed tests

This reverts commit 7c4baeaafaf41609f47c97a09f5e9647fd8b89ee.

* Revert "Fixed tests"

This reverts commit 59c10d18c63f8404c3b0c361c3769b62524316f1.

* Revert "Use correct paths for outputs"

This reverts commit 23a0b34fad1e4932755a39975ae1dfa07acb2dd9.

* Corrected input paths for CLI Plugins

* Fixed tests

This reverts commit 7c4baeaafaf41609f47c97a09f5e9647fd8b89ee.

* Revert "Fixed tests"

This reverts commit 9612d2a9eb19240a5a4489406ada17a6a5bb3806.

* Fixed tests

* Delete vc__handler__python.py
2021-12-02 23:51:39 +01:00
Leo Lamprecht
6f4a1b527b Publish Canary
- @vercel/build-utils@2.12.3-canary.31
 - vercel@23.1.3-canary.52
 - @vercel/client@10.2.3-canary.32
 - vercel-plugin-go@1.0.0-canary.19
 - vercel-plugin-node@1.12.2-canary.23
 - vercel-plugin-python@1.0.0-canary.20
 - vercel-plugin-ruby@1.0.0-canary.18
2021-12-02 22:41:17 +01:00
Leo Lamprecht
1b95576dd2 Fixed Go, Ruby, and Python CLI Plugin output generation (#7117)
* Fixed error with API directory

* Made output work

* Use handler as API Route

* Correctly find the handler

* Fixed a missing instance

* Made handler logic work

* Made it work as expected

* Exclude unnecessary files

* Use a method that always works

* Additional comment

* Made everything work

* Cleaner tests

* Clean up all the useless files

* Fixed missing instance

* Speed up the code

* Removed useless lines

* Update packages/build-utils/src/convert-runtime-to-plugin.ts

Co-authored-by: Andy <AndyBitz@users.noreply.github.com>

* Clarified comment

* Use relative logic again

* Fixed tests

* Deleted useless file

Co-authored-by: Andy <AndyBitz@users.noreply.github.com>
2021-12-02 22:18:43 +01:00
Leo Lamprecht
9227471aca Publish Canary
- @vercel/build-utils@2.12.3-canary.30
 - vercel@23.1.3-canary.51
 - @vercel/client@10.2.3-canary.31
 - vercel-plugin-go@1.0.0-canary.18
 - vercel-plugin-node@1.12.2-canary.22
 - vercel-plugin-python@1.0.0-canary.19
 - vercel-plugin-ruby@1.0.0-canary.17
2021-12-02 15:24:34 +01:00
Leo Lamprecht
bf060296eb Fixed typo (#7115) 2021-12-02 15:19:32 +01:00
Leo Lamprecht
9b3aa41f2e Publish Canary
- @vercel/build-utils@2.12.3-canary.29
 - vercel@23.1.3-canary.50
 - @vercel/client@10.2.3-canary.30
 - vercel-plugin-go@1.0.0-canary.17
 - vercel-plugin-node@1.12.2-canary.21
 - vercel-plugin-python@1.0.0-canary.18
 - vercel-plugin-ruby@1.0.0-canary.16
2021-12-02 15:02:00 +01:00
Leo Lamprecht
ae36585cdb Filter CLI Plugin output (#7114)
* Filter CLI Plugin output

* Only apply ignorefile check in the beginning
2021-12-02 15:01:17 +01:00
Leo Lamprecht
e4c636ddd2 Publish Canary
- vercel-plugin-go@1.0.0-canary.16
 - vercel-plugin-python@1.0.0-canary.17
 - vercel-plugin-ruby@1.0.0-canary.15
2021-12-02 14:14:14 +01:00
Leo Lamprecht
ae3b25be4b Made CLI Plugin publishing work (#7113) 2021-12-02 14:10:23 +01:00
Andy Bitz
a64ed13a40 Publish Canary
- vercel@23.1.3-canary.49
2021-12-02 12:55:04 +01:00
Andy
6c1c0e6676 [cli] Ignore required-server-files.json if it does not exist (#7111) 2021-12-02 12:54:30 +01:00
Leo Lamprecht
82fdd5d121 Publish Canary
- vercel@23.1.3-canary.48
 - vercel-plugin-go@1.0.0-canary.15
 - vercel-plugin-node@1.12.2-canary.20
 - vercel-plugin-python@1.0.0-canary.16
 - vercel-plugin-ruby@1.0.0-canary.14
 - @vercel/static-config@0.0.1-canary.1
2021-12-02 11:46:28 +01:00
Nathan Rajlich
8b40f4435e [api] Use the new File System API (#7108)
Co-authored-by: kodiakhq[bot] <49736102+kodiakhq[bot]@users.noreply.github.com>
2021-12-02 11:45:57 +01:00
Leo Lamprecht
38c87602bb Renamed runtime to use in static JS config (#7106)
### Related Issues

This applies what was mentioned in https://github.com/vercel/runtimes/issues/288#issuecomment-984101750.

#### Tests

- [x] The code changed/added as part of this PR has been covered with tests
- [x] All tests pass locally with `yarn test-unit`

#### Code Review

- [x] This PR has a concise title and thorough description useful to a reviewer
- [x] Issue from task tracker has a link to this PR
2021-12-02 00:37:09 +00:00
Nathan Rajlich
7aef3013e7 [cli] Use "127.0.0.1" instead of "localhost" in vc dev (#7094)
Node.js doesn't like when a hostname resolves to an IPv6 address (https://stackoverflow.com/a/15244890/376773) so use the IPv4 localhost IP address instead. Specifically this fixes vc dev on Node.js 17 which now prefers IPv6 by default.

Slack thread: https://vercel.slack.com/archives/C01A2M9R8RZ/p1638330248263400
2021-12-01 23:29:26 +00:00
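For context, a minimal standalone sketch (not part of this commit) of the failure mode described above: on Node.js 17, `localhost` can resolve to the IPv6 address `::1` first, so a proxy target built from `localhost` may miss a dev server that only listens on IPv4, while hard-coding `127.0.0.1` sidesteps the lookup entirely. Ports and handlers here are illustrative.

import http from 'http';
import Proxy from 'http-proxy';

// A framework dev server that binds to IPv4 only, as many do by default.
const devServer = http.createServer((_req, res) => {
  res.end('hello from the dev server');
});

devServer.listen(3001, '127.0.0.1', () => {
  const proxy = Proxy.createProxyServer();
  http
    .createServer((req, res) => {
      // Using `http://localhost:3001` here can fail on Node.js 17+, where
      // `localhost` may resolve to `::1` (IPv6) before `127.0.0.1`.
      proxy.web(req, res, { target: 'http://127.0.0.1:3001' });
    })
    .listen(3000);
});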
Nathan Rajlich
c18676ab4d Publish Canary
- vercel-plugin-go@1.0.0-canary.14
 - vercel-plugin-python@1.0.0-canary.15
 - vercel-plugin-ruby@1.0.0-canary.13
2021-12-01 14:28:40 -08:00
Leo Lamprecht
df450c815d Properly publish CLI Plugins (#7103)
After https://github.com/vercel/vercel/pull/7088, `dist` now contains `package.json`, which made `tsc` move `index.js` a level deeper, effectively breaking the `main` property of all of the affected CLI Plugins.

This change makes them work again. 

#### Tests

- [x] The code changed/added as part of this PR has been covered with tests
- [x] All tests pass locally with `yarn test-unit`

#### Code Review

- [x] This PR has a concise title and thorough description useful to a reviewer
- [x] Issue from task tracker has a link to this PR
2021-12-01 20:59:46 +00:00
Andy Bitz
792ab38760 Publish Canary
- @vercel/build-utils@2.12.3-canary.28
 - vercel@23.1.3-canary.47
 - @vercel/client@10.2.3-canary.29
 - vercel-plugin-go@1.0.0-canary.13
 - vercel-plugin-node@1.12.2-canary.19
 - vercel-plugin-python@1.0.0-canary.14
 - vercel-plugin-ruby@1.0.0-canary.12
 - @vercel/ruby@1.2.8-canary.6
2021-12-01 15:32:05 +01:00
Leo Lamprecht
0bba3e76c1 Corrected dependency installation systems (#7088)
* Avoid unnecessary Gemfile installations

* Do not bundle `.output` or `.vercel` into Lambdas

* Use the same input hash format for all CLI Plugins and `vercel build`

* Fixed unit tests

* Fixed the unit tests again

* Fixed the unit tests

* Fixed all the tests

* Exclude useless files

* Consider `.vercelignore` and `.nowignore`

* Fixed error

* Reverted changes

* Deleted useless file

* Fixed tests

* Share input hash format with `vercel-plugin-node`

* Make output inspectable

* Fixed build error

* Extended comment

* Bump Ruby version

* Update Gemfiles

* Update bundles

* Fixed tests

Co-authored-by: Andy Bitz <artzbitz@gmail.com>
2021-12-01 15:31:27 +01:00
Andy Bitz
3d961ffbb9 Publish Canary
- vercel@23.1.3-canary.46
2021-11-30 19:25:13 +01:00
Andy
a3039f57bb [cli] Fix vc build with nested app dir (#7083)
* [cli] Fix `vc build` with nested app dir

* Set NEXT_PRIVATE_OUTPUT_TRACE_ROOT
2021-11-30 18:30:51 +01:00
Andy Bitz
5499fa9a04 Publish Canary
- @vercel/build-utils@2.12.3-canary.27
 - vercel@23.1.3-canary.45
 - @vercel/client@10.2.3-canary.28
 - vercel-plugin-go@1.0.0-canary.12
 - vercel-plugin-node@1.12.2-canary.18
 - vercel-plugin-python@1.0.0-canary.13
 - vercel-plugin-ruby@1.0.0-canary.11
2021-11-30 16:17:47 +01:00
Leo Lamprecht
b9fd64faff Stop passing functions and regions to Runtimes (#7085)
* Stop passing `functions` and `regions` to Runtimes

* Added required types back

* Removed `vercelConfig` from `vercel-plugin-node`

* Fixed unit test

* Fixed test
2021-11-30 16:16:53 +01:00
Andy Bitz
1202ff7b2b Publish Canary
- @vercel/build-utils@2.12.3-canary.26
 - vercel@23.1.3-canary.44
 - @vercel/client@10.2.3-canary.27
 - @vercel/frameworks@0.5.1-canary.16
 - vercel-plugin-go@1.0.0-canary.11
 - vercel-plugin-node@1.12.2-canary.17
 - vercel-plugin-python@1.0.0-canary.12
 - vercel-plugin-ruby@1.0.0-canary.10
 - @vercel/python@2.1.2-canary.1
 - @vercel/ruby@1.2.8-canary.5
2021-11-30 14:37:53 +01:00
Leo Lamprecht
abd9f019f1 Don't install Ruby or Python dependencies unnecessarily (#7084)
* Don't install Ruby or Python dependencies unnecessarily

* Less repeated code

* Cleaner code
2021-11-30 14:36:27 +01:00
Leo Lamprecht
edb5eead81 Speed up Remix Template (#7077)
* Replaced Remix Template

* Added npm lockfile for Remix Template

* Added npm lockfile for Next.js Template

* Added Remix logo
2021-11-29 16:01:23 +01:00
123 changed files with 24917 additions and 3503 deletions

40
.github/CODEOWNERS vendored
View File

@@ -4,24 +4,26 @@
* @TooTallNate
/.github/workflows @AndyBitz @styfle
/packages/frameworks @AndyBitz
/packages/cli/src/commands/dev @TooTallNate @styfle @AndyBitz
/packages/cli/src/util/dev @TooTallNate @styfle @AndyBitz
/packages/cli/src/commands/domains @javivelasco @mglagola @anatrajkovska
/packages/cli/src/commands/certs @javivelasco @mglagola @anatrajkovska
/packages/cli/src/commands/env @styfle @lucleray
/packages/client @rdev @styfle @TooTallNate
/packages/build-utils @styfle @AndyBitz @TooTallNate
/packages/node @styfle @TooTallNate @lucleray
/packages/node-bridge @styfle @TooTallNate @lucleray
/packages/next @Timer @ijjk
/packages/go @styfle @TooTallNate
/packages/python @styfle @TooTallNate
/packages/ruby @styfle @coetry @TooTallNate
/packages/static-build @styfle @AndyBitz
/packages/routing-utils @styfle @dav-is @ijjk
/examples @mcsdevv @timothyis
/packages/cli/src/commands/build @TooTallNate @styfle @AndyBitz @gdborton @jaredpalmer
/packages/cli/src/commands/dev @TooTallNate @styfle @AndyBitz
/packages/cli/src/util/dev @TooTallNate @styfle @AndyBitz
/packages/cli/src/commands/domains @javivelasco @mglagola @anatrajkovska
/packages/cli/src/commands/certs @javivelasco @mglagola @anatrajkovska
/packages/cli/src/commands/env @styfle @lucleray
/packages/client @styfle @TooTallNate
/packages/build-utils @styfle @AndyBitz @TooTallNate
/packages/middleware @gdborton @javivelasco
/packages/node @styfle @TooTallNate @lucleray
/packages/node-bridge @styfle @TooTallNate @lucleray
/packages/next @Timer @ijjk
/packages/go @styfle @TooTallNate
/packages/python @styfle @TooTallNate
/packages/ruby @styfle @TooTallNate
/packages/static-build @styfle @AndyBitz
/packages/routing-utils @styfle @dav-is @ijjk
/examples @mcsdevv
/examples/create-react-app @Timer
/examples/nextjs @timneutkens @Timer
/examples/hugo @mcsdevv @timothyis @styfle
/examples/jekyll @mcsdevv @timothyis @styfle
/examples/zola @mcsdevv @timothyis @styfle
/examples/hugo @mcsdevv @styfle
/examples/jekyll @mcsdevv @styfle
/examples/zola @mcsdevv @styfle

View File

@@ -5,7 +5,7 @@
"description": "API for the vercel/vercel repo",
"main": "index.js",
"scripts": {
"vercel-build": "yarn --cwd .. && node ../utils/run.js build all"
"vercel-build": "node ../utils/run.js build all"
},
"dependencies": {
"@sentry/node": "5.11.1",

15787
examples/nextjs/package-lock.json generated Normal file

File diff suppressed because it is too large

View File

@@ -5,4 +5,4 @@ node_modules
.output
public/build
api/build
api/_build

View File

@@ -1,5 +1,5 @@
const { createRequestHandler } = require("@remix-run/vercel");
module.exports = createRequestHandler({
build: require("./build")
build: require("./_build")
});

View File

@@ -1,120 +0,0 @@
/*
* You probably want to just delete this file; it's just for the demo pages.
*/
.remix-app {
display: flex;
flex-direction: column;
min-height: 100vh;
min-height: calc(100vh - env(safe-area-inset-bottom));
}
.remix-app > * {
width: 100%;
}
.remix-app__header {
padding-top: 1rem;
padding-bottom: 1rem;
border-bottom: 1px solid var(--color-border);
}
.remix-app__header-content {
display: flex;
justify-content: space-between;
align-items: center;
}
.remix-app__header-home-link {
width: 106px;
height: 30px;
color: var(--color-foreground);
}
.remix-app__header-nav ul {
list-style: none;
margin: 0;
display: flex;
align-items: center;
gap: 1.5em;
}
.remix-app__header-nav li {
font-weight: bold;
}
.remix-app__main {
flex: 1 1 100%;
}
.remix-app__footer {
padding-top: 1rem;
padding-bottom: 1rem;
border-top: 1px solid var(--color-border);
}
.remix-app__footer-content {
display: flex;
justify-content: center;
align-items: center;
}
.remix__page {
--gap: 1rem;
--space: 2rem;
display: grid;
grid-auto-rows: min-content;
gap: var(--gap);
padding-top: var(--space);
padding-bottom: var(--space);
}
@media print, screen and (min-width: 640px) {
.remix__page {
--gap: 2rem;
grid-auto-rows: unset;
grid-template-columns: repeat(2, 1fr);
}
}
@media screen and (min-width: 1024px) {
.remix__page {
--gap: 4rem;
}
}
.remix__page > main > :first-child {
margin-top: 0;
}
.remix__page > main > :last-child {
margin-bottom: 0;
}
.remix__page > aside {
margin: 0;
padding: 1.5ch 2ch;
border: solid 1px var(--color-border);
border-radius: 0.5rem;
}
.remix__page > aside > :first-child {
margin-top: 0;
}
.remix__page > aside > :last-child {
margin-bottom: 0;
}
.remix__form {
display: flex;
flex-direction: column;
gap: 1rem;
padding: 1rem;
border: 1px solid var(--color-border);
border-radius: 0.5rem;
}
.remix__form > * {
margin-top: 0;
margin-bottom: 0;
}

8345
examples/remix/package-lock.json generated Normal file

File diff suppressed because it is too large

View File

@@ -9,15 +9,15 @@
"postinstall": "remix setup node"
},
"dependencies": {
"@remix-run/react": "^1.0.5",
"@remix-run/react": "^1.0.6",
"react": "^17.0.2",
"react-dom": "^17.0.2",
"remix": "^1.0.5",
"@remix-run/serve": "^1.0.5",
"@remix-run/vercel": "^1.0.5"
"remix": "^1.0.6",
"@remix-run/serve": "^1.0.6",
"@remix-run/vercel": "^1.0.6"
},
"devDependencies": {
"@remix-run/dev": "^1.0.5",
"@remix-run/dev": "^1.0.6",
"@types/react": "^17.0.24",
"@types/react-dom": "^17.0.9",
"typescript": "^4.1.2"

View File

@@ -5,5 +5,5 @@ module.exports = {
appDirectory: "app",
browserBuildDirectory: "public/build",
publicPath: "/build/",
serverBuildDirectory: "api/build"
serverBuildDirectory: "api/_build"
};

View File

@@ -1,6 +1,6 @@
{
"name": "@vercel/build-utils",
"version": "2.12.3-canary.25",
"version": "2.12.3-canary.34",
"license": "MIT",
"main": "./dist/index.js",
"types": "./dist/index.d.js",
@@ -30,7 +30,7 @@
"@types/node-fetch": "^2.1.6",
"@types/semver": "6.0.0",
"@types/yazl": "^2.4.1",
"@vercel/frameworks": "0.5.1-canary.15",
"@vercel/frameworks": "0.5.1-canary.16",
"@vercel/ncc": "0.24.0",
"aggregate-error": "3.0.1",
"async-retry": "1.2.3",

View File

@@ -1,93 +1,281 @@
import fs from 'fs-extra';
import { join, dirname, relative } from 'path';
import { join, dirname, parse, relative } from 'path';
import glob from './fs/glob';
import { normalizePath } from './fs/normalize-path';
import { FILES_SYMBOL, getLambdaOptionsFromFunction, Lambda } from './lambda';
import { FILES_SYMBOL, Lambda } from './lambda';
import type FileBlob from './file-blob';
import type { BuilderFunctions, BuildOptions, Files } from './types';
import minimatch from 'minimatch';
import type { BuildOptions, Files } from './types';
import { debug, getIgnoreFilter } from '.';
// `.output` was already created by the Build Command, so we have
// to ensure its contents don't get bundled into the Lambda. Similarly,
// we don't want to bundle anything from `.vercel` either. Lastly,
// Builders/Runtimes didn't have `vercel.json` or `now.json`.
const ignoredPaths = ['.output', '.vercel', 'vercel.json', 'now.json'];
const shouldIgnorePath = (
file: string,
ignoreFilter: any,
ignoreFile: boolean
) => {
const isNative = ignoredPaths.some(item => {
return file.startsWith(item);
});
if (!ignoreFile) {
return isNative;
}
return isNative || ignoreFilter(file);
};
const getSourceFiles = async (workPath: string, ignoreFilter: any) => {
const list = await glob('**', {
cwd: workPath,
});
// We're not passing this as an `ignore` filter to the `glob` function above,
// so that we can re-use exactly the same `getIgnoreFilter` method that the
// Build Step uses (literally the same code). Note that this exclusion only applies
// when deploying. Locally, another exclusion is needed, which is handled
// further below in the `convertRuntimeToPlugin` function.
for (const file in list) {
if (shouldIgnorePath(file, ignoreFilter, true)) {
delete list[file];
}
}
return list;
};
/**
* Convert legacy Runtime to a Plugin.
* @param buildRuntime - a legacy build() function from a Runtime
* @param packageName - the name of the package, for example `vercel-plugin-python`
* @param ext - the file extension, for example `.py`
*/
export function convertRuntimeToPlugin(
buildRuntime: (options: BuildOptions) => Promise<{ output: Lambda }>,
packageName: string,
ext: string
) {
// This `build()` signature should match `plugin.build()` signature in `vercel build`.
return async function build({
vercelConfig,
workPath,
}: {
vercelConfig: {
functions?: BuilderFunctions;
regions?: string[];
};
workPath: string;
}) {
const opts = { cwd: workPath };
const files = await glob('**', opts);
delete files['vercel.json']; // Builders/Runtimes didn't have vercel.json
const entrypoints = await glob(`api/**/*${ext}`, opts);
return async function build({ workPath }: { workPath: string }) {
// We also don't want to provide any files to Runtimes that were ignored
// through `.vercelignore` or `.nowignore`, because the Build Step does the same.
const ignoreFilter = await getIgnoreFilter(workPath);
// Retrieve the files that are currently available on the File System,
// before the Legacy Runtime has even started to build.
const sourceFilesPreBuild = await getSourceFiles(workPath, ignoreFilter);
// Instead of doing another `glob` to get all the matching source files,
// we'll filter the list of existing files down to only the ones
// that are matching the entrypoint pattern, so we're first creating
// a clean new list to begin.
const entrypoints = Object.assign({}, sourceFilesPreBuild);
const entrypointMatch = new RegExp(`^api/.*${ext}$`);
// Up next, we'll strip out the files from the list of entrypoints
// that aren't actually considered entrypoints.
for (const file in entrypoints) {
if (!entrypointMatch.test(file)) {
delete entrypoints[file];
}
}
const pages: { [key: string]: any } = {};
const { functions = {} } = vercelConfig;
const traceDir = join(workPath, '.output', 'runtime-traced-files');
const pluginName = packageName.replace('vercel-plugin-', '');
const traceDir = join(
workPath,
`.output`,
`inputs`,
// Legacy Runtimes can only provide API Routes, so that's
// why we can use this prefix for all of them. Here, we have to
// make sure to not use a cryptic hash name, because people
// need to be able to easily inspect the output.
`api-routes-${pluginName}`
);
await fs.ensureDir(traceDir);
for (const entrypoint of Object.keys(entrypoints)) {
const key =
Object.keys(functions).find(
src => src === entrypoint || minimatch(entrypoint, src)
) || '';
const config = functions[key] || {};
let newPathsRuntime: Set<string> = new Set();
let linkersRuntime: Array<Promise<void>> = [];
for (const entrypoint of Object.keys(entrypoints)) {
const { output } = await buildRuntime({
files,
files: sourceFilesPreBuild,
entrypoint,
workPath,
config: {
zeroConfig: true,
includeFiles: config.includeFiles,
excludeFiles: config.excludeFiles,
},
meta: {
avoidTopLevelInstall: true,
},
});
pages[entrypoint] = {
handler: output.handler,
runtime: output.runtime,
memory: output.memory,
maxDuration: output.maxDuration,
environment: output.environment,
allowQuery: output.allowQuery,
//regions: output.regions,
};
// Legacy Runtimes tend to pollute the `workPath` with compiled results,
// because the `workPath` used to be a place where they could
// just put anything, but nowadays it's the working directory of the `vercel build`
// command, which is the place where the developer keeps their source files,
// so we don't want to pollute this space unnecessarily. That means we have to clean
// up files that were created by the build, which is done further below.
const sourceFilesAfterBuild = await getSourceFiles(
workPath,
ignoreFilter
);
// Further down, we will need the filename of the Lambda handler
// for placing it inside `server/pages/api`, but because Legacy Runtimes
// don't expose the filename directly, we have to construct it
// from the handler name, and then find the matching file further below,
// because we don't yet know its extension here.
const handler = output.handler;
const handlerMethod = handler.split('.').reverse()[0];
const handlerFileName = handler.replace(`.${handlerMethod}`, '');
// @ts-ignore This symbol is a private API
const lambdaFiles: Files = output[FILES_SYMBOL];
// When deploying, the `files` that are passed to the Legacy Runtimes already
// have certain files that are ignored stripped, but locally, that list of
// files isn't used by the Legacy Runtimes, so we need to apply the filters
// to the outputs that they are returning instead.
for (const file in lambdaFiles) {
if (shouldIgnorePath(file, ignoreFilter, false)) {
delete lambdaFiles[file];
}
}
const handlerFilePath = Object.keys(lambdaFiles).find(item => {
return parse(item).name === handlerFileName;
});
const handlerFileOrigin = lambdaFiles[handlerFilePath || ''].fsPath;
if (!handlerFileOrigin) {
throw new Error(
`Could not find a handler file. Please ensure that the list of \`files\` defined for the returned \`Lambda\` contains a file with the name ${handlerFileName} (+ any extension).`
);
}
const entry = join(workPath, '.output', 'server', 'pages', entrypoint);
// We never want to link here, only copy, because the launcher
// file often has the same name for every entrypoint, which means that
// every build for every entrypoint overwrites the launcher of the previous
// one, so linking would end with a broken reference.
await fs.ensureDir(dirname(entry));
await linkOrCopy(files[entrypoint].fsPath, entry);
await fs.copy(handlerFileOrigin, entry);
const newFilesEntrypoint: Array<string> = [];
const newDirectoriesEntrypoint: Array<string> = [];
const preBuildFiles = Object.values(sourceFilesPreBuild).map(file => {
return file.fsPath;
});
// Generate a list of directories and files that weren't present
// before the entrypoint was processed by the Legacy Runtime, so
// that we can perform a cleanup later. We need to divide into files
// and directories because only cleaning up files might leave empty
// directories, and listing directories separately also speeds up the
// build because we can just delete them, which wipes all of their nested
// paths, instead of iterating through all files that should be deleted.
for (const file in sourceFilesAfterBuild) {
if (!sourceFilesPreBuild[file]) {
const path = sourceFilesAfterBuild[file].fsPath;
const dirPath = dirname(path);
// If none of the files that were present before the entrypoint
// was processed are contained within the directory we're looking
// at right now, then we know it's a newly added directory
// and it can therefore be removed later on.
const isNewDir = !preBuildFiles.some(filePath => {
return dirname(filePath).startsWith(dirPath);
});
// Check out the list of tracked directories that were
// newly added and see if one of them contains the path
// we're looking at.
const hasParentDir = newDirectoriesEntrypoint.some(dir => {
return path.startsWith(dir);
});
// If we have already tracked a directory that was newly
// added that sits above the file or directory that we're
// looking at, we don't need to add more entries to the list
// because when the parent will get removed in the future,
// all of its children (and therefore the path we're looking at)
// will automatically get removed anyways.
if (hasParentDir) {
continue;
}
if (isNewDir) {
newDirectoriesEntrypoint.push(dirPath);
} else {
newFilesEntrypoint.push(path);
}
}
}
const tracedFiles: {
absolutePath: string;
relativePath: string;
}[] = [];
Object.entries(lambdaFiles).forEach(async ([relPath, file]) => {
const newPath = join(traceDir, relPath);
tracedFiles.push({ absolutePath: newPath, relativePath: relPath });
if (file.fsPath) {
await linkOrCopy(file.fsPath, newPath);
} else if (file.type === 'FileBlob') {
const { data, mode } = file as FileBlob;
await fs.writeFile(newPath, data, { mode });
} else {
throw new Error(`Unknown file type: ${file.type}`);
const linkers = Object.entries(lambdaFiles).map(
async ([relPath, file]) => {
const newPath = join(traceDir, relPath);
// The handler was already moved into position above.
if (relPath === handlerFilePath) {
return;
}
tracedFiles.push({ absolutePath: newPath, relativePath: relPath });
const { fsPath, type } = file;
if (fsPath) {
await fs.ensureDir(dirname(newPath));
const isNewFile = newFilesEntrypoint.includes(fsPath);
const isInsideNewDirectory = newDirectoriesEntrypoint.some(
dirPath => {
return fsPath.startsWith(dirPath);
}
);
// With this, we're making sure that files in the `workPath` that existed
// before the Legacy Runtime was invoked (source files) are linked from
// `.output` instead of copying there (the latter only happens if linking fails),
// which is the fastest solution. However, files that are created fresh
// by the Legacy Runtimes are always copied, because their link destinations
// are likely to be overwritten every time an entrypoint is processed by
// the Legacy Runtime. This is likely to overwrite the destination on subsequent
// runs, but that's also how `workPath` used to work originally, without
// the File System API (meaning that there was one `workPath` for all entrypoints).
if (isNewFile || isInsideNewDirectory) {
debug(`Copying from ${fsPath} to ${newPath}`);
await fs.copy(fsPath, newPath);
} else {
await linkOrCopy(fsPath, newPath);
}
} else if (type === 'FileBlob') {
const { data, mode } = file as FileBlob;
await fs.writeFile(newPath, data, { mode });
} else {
throw new Error(`Unknown file type: ${type}`);
}
}
});
);
linkersRuntime = linkersRuntime.concat(linkers);
const nft = join(
workPath,
@@ -96,19 +284,64 @@ export function convertRuntimeToPlugin(
'pages',
`${entrypoint}.nft.json`
);
const json = JSON.stringify({
version: 1,
files: tracedFiles.map(f => ({
input: normalizePath(relative(nft, f.absolutePath)),
output: normalizePath(f.relativePath),
files: tracedFiles.map(file => ({
input: normalizePath(relative(dirname(nft), file.absolutePath)),
output: normalizePath(file.relativePath),
})),
});
await fs.ensureDir(dirname(nft));
await fs.writeFile(nft, json);
// Extend the list of directories and files that were created by the
// Legacy Runtime with the list of directories and files that were
// created for the entrypoint that was just processed above.
newPathsRuntime = new Set([
...newPathsRuntime,
...newFilesEntrypoint,
...newDirectoriesEntrypoint,
]);
const apiRouteHandler = `${parse(entry).name}.${handlerMethod}`;
// Add an entry that will later on be added to the `functions-manifest.json`
// file that is placed inside of the `.output` directory.
pages[entrypoint] = {
handler: apiRouteHandler,
runtime: output.runtime,
memory: output.memory,
maxDuration: output.maxDuration,
environment: output.environment,
allowQuery: output.allowQuery,
};
}
await updateFunctionsManifest({ vercelConfig, workPath, pages });
// Instead of waiting for all of the linking to be done for every
// entrypoint before processing the next one, we immediately handle all
// of them one after the other, while then waiting for the linking
// to finish right here, before we clean up newly created files below.
await Promise.all(linkersRuntime);
// A list of all the files that were created by the Legacy Runtime,
// which we'd like to remove from the File System.
const toRemove = Array.from(newPathsRuntime).map(path => {
debug(`Removing ${path} as part of cleanup`);
return fs.remove(path);
});
// Once all the entrypoints have been processed, we'd like to
// remove all the files from `workPath` that originally weren't present
// before the Legacy Runtime began running, because the `workPath`
// is nowadays the directory in which the user keeps their source code, since
// we're no longer running separate parallel builds for every Legacy Runtime.
await Promise.all(toRemove);
// Add any Serverless Functions that were exposed by the Legacy Runtime
// to the `functions-manifest.json` file provided in `.output`.
await updateFunctionsManifest({ workPath, pages });
};
}
@@ -136,15 +369,12 @@ async function readJson(filePath: string): Promise<{ [key: string]: any }> {
/**
* If `.output/functions-manifest.json` exists, append to the pages
* property. Otherwise write a new file. This will also read `vercel.json`
* and apply relevant `functions` property config.
* property. Otherwise write a new file.
*/
export async function updateFunctionsManifest({
vercelConfig,
workPath,
pages,
}: {
vercelConfig: { functions?: BuilderFunctions; regions?: string[] };
workPath: string;
pages: { [key: string]: any };
}) {
@@ -159,16 +389,7 @@ export async function updateFunctionsManifest({
if (!functionsManifest.pages) functionsManifest.pages = {};
for (const [pageKey, pageConfig] of Object.entries(pages)) {
const fnConfig = await getLambdaOptionsFromFunction({
sourceFile: pageKey,
config: vercelConfig,
});
functionsManifest.pages[pageKey] = {
...pageConfig,
memory: fnConfig.memory || pageConfig.memory,
maxDuration: fnConfig.maxDuration || pageConfig.maxDuration,
regions: vercelConfig.regions || pageConfig.regions,
};
functionsManifest.pages[pageKey] = { ...pageConfig };
}
await fs.writeFile(functionsManifestPath, JSON.stringify(functionsManifest));
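The cleanup logic above boils down to a snapshot-diff pattern: record what exists in `workPath` before the Legacy Runtime runs, record it again afterwards, and remove everything that appeared in between. A condensed sketch of that pattern (helper names are illustrative, and it skips the file-versus-directory split the real code performs):

import { join } from 'path';
import { promises as fsp } from 'fs';
import fs from 'fs-extra';

// Recursively collect every file path under `dir`.
async function listFiles(dir: string): Promise<Set<string>> {
  const result = new Set<string>();
  for (const entry of await fsp.readdir(dir, { withFileTypes: true })) {
    const full = join(dir, entry.name);
    if (entry.isDirectory()) {
      for (const nested of await listFiles(full)) result.add(nested);
    } else {
      result.add(full);
    }
  }
  return result;
}

// Snapshot `workPath`, run a build step that may write into it, then
// delete anything the build step left behind.
async function runAndCleanUp(workPath: string, run: () => Promise<void>) {
  const before = await listFiles(workPath);
  await run();
  const after = await listFiles(workPath);
  const created = [...after].filter(file => !before.has(file));
  await Promise.all(created.map(file => fs.remove(file)));
}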

View File

@@ -0,0 +1,84 @@
import path from 'path';
import fs from 'fs-extra';
import ignore from 'ignore';
interface CodedError extends Error {
code: string;
}
function isCodedError(error: unknown): error is CodedError {
return (
error !== null &&
error !== undefined &&
(error as CodedError).code !== undefined
);
}
function clearRelative(s: string) {
return s.replace(/(\n|^)\.\//g, '$1');
}
export default async function (
downloadPath: string,
rootDirectory?: string | undefined
) {
const readFile = async (p: string) => {
try {
return await fs.readFile(p, 'utf8');
} catch (error: any) {
if (
error.code === 'ENOENT' ||
(error instanceof Error && error.message.includes('ENOENT'))
) {
return undefined;
}
throw error;
}
};
const vercelIgnorePath = path.join(
downloadPath,
rootDirectory || '',
'.vercelignore'
);
const nowIgnorePath = path.join(
downloadPath,
rootDirectory || '',
'.nowignore'
);
const ignoreContents = [];
try {
ignoreContents.push(
...(
await Promise.all([readFile(vercelIgnorePath), readFile(nowIgnorePath)])
).filter(Boolean)
);
} catch (error) {
if (isCodedError(error) && error.code === 'ENOTDIR') {
console.log(`Warning: Cannot read ignore file from ${vercelIgnorePath}`);
} else {
throw error;
}
}
if (ignoreContents.length === 2) {
throw new Error(
'Cannot use both a `.vercelignore` and `.nowignore` file. Please delete the `.nowignore` file.'
);
}
if (ignoreContents.length === 0) {
return () => false;
}
const ignoreFilter: any = ignore().add(clearRelative(ignoreContents[0]!));
return function (p: string) {
// we should not ignore now.json and vercel.json even if asked to.
// we depend on these files for building the app with sourceless
if (p === 'now.json' || p === 'vercel.json') return false;
return ignoreFilter.test(p).ignored;
};
}
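A short usage sketch of this helper as re-exported from `@vercel/build-utils` (the function name `listDeployableFiles` is illustrative; the returned predicate is `true` for paths that `.vercelignore` or `.nowignore` excludes):

import { getIgnoreFilter, glob } from '@vercel/build-utils';

async function listDeployableFiles(workPath: string) {
  const ignoreFilter = await getIgnoreFilter(workPath);
  const files = await glob('**', { cwd: workPath });
  for (const file of Object.keys(files)) {
    // Drop anything the ignore files exclude, mirroring what the Build Step does.
    if (ignoreFilter(file)) {
      delete files[file];
    }
  }
  return files;
}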

View File

@@ -1,3 +1,4 @@
import { createHash } from 'crypto';
import FileBlob from './file-blob';
import FileFsRef from './file-fs-ref';
import FileRef from './file-ref';
@@ -33,6 +34,7 @@ import { NowBuildError } from './errors';
import streamToBuffer from './fs/stream-to-buffer';
import shouldServe from './should-serve';
import debug from './debug';
import getIgnoreFilter from './get-ignore-filter';
export {
FileBlob,
@@ -70,6 +72,7 @@ export {
isSymbolicLink,
getLambdaOptionsFromFunction,
scanParentDirs,
getIgnoreFilter,
};
export {
@@ -132,3 +135,11 @@ export const getPlatformEnv = (name: string): string | undefined => {
}
return n;
};
/**
* Helper function for generating file or directories names in `.output/inputs`
* for dependencies of files provided to the File System API.
*/
export const getInputHash = (source: Buffer | string): string => {
return createHash('sha1').update(source).digest('hex');
};
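For illustration, this is how the helper ends up naming per-entrypoint input directories (the entrypoint value is hypothetical; `vercel-plugin-node` uses the same pattern further below):

import { join } from 'path';
import { getInputHash } from '@vercel/build-utils';

const entrypoint = 'api/users/get.ts';
// SHA-1 hex digest of the entrypoint path: stable across builds, so the
// directory name in `.output/inputs` is both unique and reproducible.
const entrypointDir = 'api-routes-node-' + getInputHash(entrypoint);
const outputDir = join('.output', 'inputs', entrypointDir);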

View File

@@ -58,6 +58,7 @@ export interface Meta {
filesRemoved?: string[];
env?: Env;
buildEnv?: Env;
avoidTopLevelInstall?: boolean;
}
export interface AnalyzeOptions {

View File

@@ -1,6 +1,6 @@
import { join } from 'path';
import fs from 'fs-extra';
import { BuildOptions, createLambda } from '../src';
import { BuildOptions, createLambda, FileFsRef } from '../src';
import { convertRuntimeToPlugin } from '../src/convert-runtime-to-plugin';
async function fsToJson(dir: string, output: Record<string, any> = {}) {
@@ -32,9 +32,13 @@ describe('convert-runtime-to-plugin', () => {
});
it('should create correct filesystem for python', async () => {
const ext = '.py';
const workPath = pythonApiWorkpath;
const handlerName = 'vc__handler__python';
const handlerFileName = handlerName + ext;
const lambdaOptions = {
handler: 'index.handler',
handler: `${handlerName}.vc_handler`,
runtime: 'python3.9',
memory: 512,
maxDuration: 5,
@@ -42,6 +46,15 @@ describe('convert-runtime-to-plugin', () => {
};
const buildRuntime = async (opts: BuildOptions) => {
const handlerPath = join(workPath, handlerFileName);
// This is the usual time at which a Legacy Runtime writes its Lambda launcher.
await fs.writeFile(handlerPath, '# handler');
opts.files[handlerFileName] = new FileFsRef({
fsPath: handlerPath,
});
const lambda = await createLambda({
files: opts.files,
...lambdaOptions,
@@ -50,25 +63,30 @@ describe('convert-runtime-to-plugin', () => {
};
const lambdaFiles = await fsToJson(workPath);
const vercelConfig = JSON.parse(lambdaFiles['vercel.json']);
delete lambdaFiles['vercel.json'];
const build = await convertRuntimeToPlugin(buildRuntime, '.py');
const packageName = 'vercel-plugin-python';
const build = await convertRuntimeToPlugin(buildRuntime, packageName, ext);
await build({ vercelConfig, workPath });
await build({ workPath });
const output = await fsToJson(join(workPath, '.output'));
delete lambdaFiles['vercel.json'];
delete lambdaFiles['vc__handler__python.py'];
expect(output).toMatchObject({
'functions-manifest.json': expect.stringContaining('{'),
'runtime-traced-files': lambdaFiles,
inputs: {
'api-routes-python': lambdaFiles,
},
server: {
pages: {
api: {
'index.py': expect.stringContaining('index'),
'index.py': expect.stringContaining('handler'),
'index.py.nft.json': expect.stringContaining('{'),
users: {
'get.py': expect.stringContaining('get'),
'get.py': expect.stringContaining('handler'),
'get.py.nft.json': expect.stringContaining('{'),
'post.py': expect.stringContaining('post'),
'post.py': expect.stringContaining('handler'),
'post.py.nft.json': expect.stringContaining('{'),
},
},
@@ -80,9 +98,13 @@ describe('convert-runtime-to-plugin', () => {
expect(funcManifest).toMatchObject({
version: 1,
pages: {
'api/index.py': lambdaOptions,
'api/users/get.py': lambdaOptions,
'api/users/post.py': { ...lambdaOptions, memory: 3008 },
'api/index.py': { ...lambdaOptions, handler: 'index.vc_handler' },
'api/users/get.py': { ...lambdaOptions, handler: 'get.vc_handler' },
'api/users/post.py': {
...lambdaOptions,
handler: 'post.vc_handler',
memory: 512,
},
},
});
@@ -91,36 +113,35 @@ describe('convert-runtime-to-plugin', () => {
version: 1,
files: [
{
input: '../../../../runtime-traced-files/api/db/[id].py',
input: `../../../inputs/api-routes-python/api/db/[id].py`,
output: 'api/db/[id].py',
},
{
input: '../../../../runtime-traced-files/api/index.py',
input: `../../../inputs/api-routes-python/api/index.py`,
output: 'api/index.py',
},
{
input:
'../../../../runtime-traced-files/api/project/[aid]/[bid]/index.py',
input: `../../../inputs/api-routes-python/api/project/[aid]/[bid]/index.py`,
output: 'api/project/[aid]/[bid]/index.py',
},
{
input: '../../../../runtime-traced-files/api/users/get.py',
input: `../../../inputs/api-routes-python/api/users/get.py`,
output: 'api/users/get.py',
},
{
input: '../../../../runtime-traced-files/api/users/post.py',
input: `../../../inputs/api-routes-python/api/users/post.py`,
output: 'api/users/post.py',
},
{
input: '../../../../runtime-traced-files/file.txt',
input: `../../../inputs/api-routes-python/file.txt`,
output: 'file.txt',
},
{
input: '../../../../runtime-traced-files/util/date.py',
input: `../../../inputs/api-routes-python/util/date.py`,
output: 'util/date.py',
},
{
input: '../../../../runtime-traced-files/util/math.py',
input: `../../../inputs/api-routes-python/util/math.py`,
output: 'util/math.py',
},
],
@@ -133,36 +154,35 @@ describe('convert-runtime-to-plugin', () => {
version: 1,
files: [
{
input: '../../../../../runtime-traced-files/api/db/[id].py',
input: `../../../../inputs/api-routes-python/api/db/[id].py`,
output: 'api/db/[id].py',
},
{
input: '../../../../../runtime-traced-files/api/index.py',
input: `../../../../inputs/api-routes-python/api/index.py`,
output: 'api/index.py',
},
{
input:
'../../../../../runtime-traced-files/api/project/[aid]/[bid]/index.py',
input: `../../../../inputs/api-routes-python/api/project/[aid]/[bid]/index.py`,
output: 'api/project/[aid]/[bid]/index.py',
},
{
input: '../../../../../runtime-traced-files/api/users/get.py',
input: `../../../../inputs/api-routes-python/api/users/get.py`,
output: 'api/users/get.py',
},
{
input: '../../../../../runtime-traced-files/api/users/post.py',
input: `../../../../inputs/api-routes-python/api/users/post.py`,
output: 'api/users/post.py',
},
{
input: '../../../../../runtime-traced-files/file.txt',
input: `../../../../inputs/api-routes-python/file.txt`,
output: 'file.txt',
},
{
input: '../../../../../runtime-traced-files/util/date.py',
input: `../../../../inputs/api-routes-python/util/date.py`,
output: 'util/date.py',
},
{
input: '../../../../../runtime-traced-files/util/math.py',
input: `../../../../inputs/api-routes-python/util/math.py`,
output: 'util/math.py',
},
],
@@ -175,36 +195,35 @@ describe('convert-runtime-to-plugin', () => {
version: 1,
files: [
{
input: '../../../../../runtime-traced-files/api/db/[id].py',
input: `../../../../inputs/api-routes-python/api/db/[id].py`,
output: 'api/db/[id].py',
},
{
input: '../../../../../runtime-traced-files/api/index.py',
input: `../../../../inputs/api-routes-python/api/index.py`,
output: 'api/index.py',
},
{
input:
'../../../../../runtime-traced-files/api/project/[aid]/[bid]/index.py',
input: `../../../../inputs/api-routes-python/api/project/[aid]/[bid]/index.py`,
output: 'api/project/[aid]/[bid]/index.py',
},
{
input: '../../../../../runtime-traced-files/api/users/get.py',
input: `../../../../inputs/api-routes-python/api/users/get.py`,
output: 'api/users/get.py',
},
{
input: '../../../../../runtime-traced-files/api/users/post.py',
input: `../../../../inputs/api-routes-python/api/users/post.py`,
output: 'api/users/post.py',
},
{
input: '../../../../../runtime-traced-files/file.txt',
input: `../../../../inputs/api-routes-python/file.txt`,
output: 'file.txt',
},
{
input: '../../../../../runtime-traced-files/util/date.py',
input: `../../../../inputs/api-routes-python/util/date.py`,
output: 'util/date.py',
},
{
input: '../../../../../runtime-traced-files/util/math.py',
input: `../../../../inputs/api-routes-python/util/math.py`,
output: 'util/math.py',
},
],
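The `.nft.json` entries asserted above follow a simple rule: each `input` is resolved relative to the directory that contains the `.nft.json` file itself, while `output` is the path inside the Lambda. A compact sketch of writing one such file (paths and the helper name are illustrative; the real logic lives in `convertRuntimeToPlugin`):

import { join, dirname, relative } from 'path';
import fs from 'fs-extra';
import { normalizePath } from '@vercel/build-utils';

async function writeNftFile(
  workPath: string,
  entrypoint: string, // e.g. 'api/users/get.py'
  tracedFiles: { absolutePath: string; relativePath: string }[]
) {
  const nft = join(workPath, '.output', 'server', 'pages', `${entrypoint}.nft.json`);
  const json = JSON.stringify({
    version: 1,
    files: tracedFiles.map(file => ({
      // Resolved from the directory of the `.nft.json` file, hence the `../../../` prefixes above.
      input: normalizePath(relative(dirname(nft), file.absolutePath)),
      output: normalizePath(file.relativePath),
    })),
  });
  await fs.ensureDir(dirname(nft));
  await fs.writeFile(nft, json);
}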

View File

@@ -1,6 +1,6 @@
{
"name": "vercel",
"version": "23.1.3-canary.43",
"version": "23.1.3-canary.56",
"preferGlobal": true,
"license": "Apache-2.0",
"description": "The command-line interface for Vercel",
@@ -43,14 +43,14 @@
"node": ">= 12"
},
"dependencies": {
"@vercel/build-utils": "2.12.3-canary.25",
"@vercel/build-utils": "2.12.3-canary.34",
"@vercel/go": "1.2.4-canary.4",
"@vercel/node": "1.12.2-canary.7",
"@vercel/python": "2.1.2-canary.0",
"@vercel/ruby": "1.2.8-canary.4",
"@vercel/python": "2.1.2-canary.1",
"@vercel/ruby": "1.2.8-canary.6",
"update-notifier": "4.1.0",
"vercel-plugin-middleware": "0.0.0-canary.7",
"vercel-plugin-node": "1.12.2-canary.16"
"vercel-plugin-middleware": "0.0.0-canary.10",
"vercel-plugin-node": "1.12.2-canary.26"
},
"devDependencies": {
"@next/env": "11.1.2",
@@ -90,7 +90,7 @@
"@types/update-notifier": "5.1.0",
"@types/which": "1.3.2",
"@types/write-json-file": "2.2.1",
"@vercel/frameworks": "0.5.1-canary.15",
"@vercel/frameworks": "0.5.1-canary.16",
"@vercel/ncc": "0.24.0",
"@vercel/nft": "0.17.0",
"@zeit/fun": "0.11.2",

View File

@@ -5,6 +5,7 @@ import {
GlobOptions,
scanParentDirs,
spawnAsync,
glob as buildUtilsGlob,
} from '@vercel/build-utils';
import { nodeFileTrace } from '@vercel/nft';
import Sema from 'async-sema';
@@ -298,6 +299,9 @@ export default async function main(client: Client) {
}
}
// Required for Next.js to produce the correct `.nft.json` files.
spawnOpts.env.NEXT_PRIVATE_OUTPUT_TRACE_ROOT = baseDir;
// Yarn v2 PnP mode may be activated, so force
// "node-modules" linker style
const env = {
@@ -352,8 +356,13 @@ export default async function main(client: Client) {
// We cannot rely on the `framework` alone, as it might be a static export,
// and the current build might use a different project that's not in the settings.
const isNextOutput = Boolean(dotNextDir);
const outputDir = isNextOutput ? OUTPUT_DIR : join(OUTPUT_DIR, 'static');
const nextExport = await getNextExportStatus(dotNextDir);
const outputDir =
isNextOutput && !nextExport ? OUTPUT_DIR : join(OUTPUT_DIR, 'static');
const distDir =
(nextExport?.exportDetail.outDirectory
? relative(cwd, nextExport.exportDetail.outDirectory)
: false) ||
dotNextDir ||
userOutputDirectory ||
(await framework.getFsOutputDir(cwd));
@@ -440,7 +449,53 @@ export default async function main(client: Client) {
}
// Special Next.js processing.
if (isNextOutput) {
if (nextExport) {
client.output.debug('Found `next export` output.');
const htmlFiles = await buildUtilsGlob(
'**/*.html',
join(cwd, OUTPUT_DIR, 'static')
);
if (nextExport.exportDetail.success !== true) {
client.output.error(
`Export of Next.js app failed. Please check your build logs.`
);
process.exit(1);
}
await fs.mkdirp(join(cwd, OUTPUT_DIR, 'server', 'pages'));
await fs.mkdirp(join(cwd, OUTPUT_DIR, 'static'));
await Promise.all(
Object.keys(htmlFiles).map(async fileName => {
await sema.acquire();
const input = join(cwd, OUTPUT_DIR, 'static', fileName);
const target = join(cwd, OUTPUT_DIR, 'server', 'pages', fileName);
await fs.mkdirp(dirname(target));
await fs.promises.rename(input, target).finally(() => {
sema.release();
});
})
);
for (const file of [
'BUILD_ID',
'images-manifest.json',
'routes-manifest.json',
'build-manifest.json',
]) {
const input = join(nextExport.dotNextDir, file);
if (fs.existsSync(input)) {
// Do not use `smartCopy`, since we want to overwrite if they already exist.
await fs.copyFile(input, join(OUTPUT_DIR, file));
}
}
} else if (isNextOutput) {
// The contents of `.output/static` should be placed inside of `.output/static/_next/static`
const tempStatic = '___static';
await fs.rename(
@@ -609,30 +664,37 @@ export default async function main(client: Client) {
}
}
client.output.debug(`Resolve ${param('required-server-files.json')}.`);
const requiredServerFilesPath = join(
OUTPUT_DIR,
'required-server-files.json'
);
const requiredServerFilesJson = await fs.readJSON(
requiredServerFilesPath
);
await fs.writeJSON(requiredServerFilesPath, {
...requiredServerFilesJson,
appDir: '.',
files: requiredServerFilesJson.files.map((i: string) => {
const originalPath = join(dirname(distDir), i);
const relPath = join(OUTPUT_DIR, relative(distDir, originalPath));
const absolutePath = join(cwd, relPath);
const output = relative(baseDir, absolutePath);
if (fs.existsSync(requiredServerFilesPath)) {
client.output.debug(`Resolve ${param('required-server-files.json')}.`);
return {
input: relPath,
output,
};
}),
});
const requiredServerFilesJson = await fs.readJSON(
requiredServerFilesPath
);
await fs.writeJSON(requiredServerFilesPath, {
...requiredServerFilesJson,
appDir: '.',
files: requiredServerFilesJson.files.map((i: string) => {
const originalPath = join(requiredServerFilesJson.appDir, i);
const relPath = join(OUTPUT_DIR, relative(distDir, originalPath));
const absolutePath = join(cwd, relPath);
const output = relative(baseDir, absolutePath);
return relPath === output
? relPath
: {
input: relPath,
output,
};
}),
});
}
}
}
@@ -867,3 +929,53 @@ async function resolveNftToOutput({
files: newFilesList,
});
}
/**
* Files will only exist when `next export` was used.
*/
async function getNextExportStatus(dotNextDir: string | null) {
if (!dotNextDir) {
return null;
}
const exportDetail: {
success: boolean;
outDirectory: string;
} | null = await fs
.readJson(join(dotNextDir, 'export-detail.json'))
.catch(error => {
if (error.code === 'ENOENT') {
return null;
}
throw error;
});
if (!exportDetail) {
return null;
}
const exportMarker: {
version: 1;
exportTrailingSlash: boolean;
hasExportPathMap: boolean;
} | null = await fs
.readJSON(join(dotNextDir, 'export-marker.json'))
.catch(error => {
if (error.code === 'ENOENT') {
return null;
}
throw error;
});
return {
dotNextDir,
exportDetail,
exportMarker: {
trailingSlash: exportMarker?.hasExportPathMap
? exportMarker.exportTrailingSlash
: false,
},
};
}
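Both reads in `getNextExportStatus` use the same tolerant pattern: a missing marker file means `next export` was not used, while any other error is fatal. A generic helper expressing that pattern might look like this (the name `readJsonOrNull` is illustrative, not part of the diff):

import fs from 'fs-extra';

async function readJsonOrNull<T>(filePath: string): Promise<T | null> {
  try {
    return await fs.readJSON(filePath);
  } catch (error: any) {
    // ENOENT just means the file was never produced (e.g. no `next export` ran).
    if (error.code === 'ENOENT') {
      return null;
    }
    throw error;
  }
}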

View File

@@ -968,7 +968,7 @@ export default class DevServer {
socket.destroy();
return;
}
const target = `http://localhost:${this.devProcessPort}`;
const target = `http://127.0.0.1:${this.devProcessPort}`;
this.output.debug(`Detected "upgrade" event, proxying to ${target}`);
this.proxy.ws(req, socket, head, { target });
});
@@ -1663,7 +1663,7 @@ export default class DevServer {
if (!match) {
// If the dev command is started, then proxy to it
if (this.devProcessPort) {
const upstream = `http://localhost:${this.devProcessPort}`;
const upstream = `http://127.0.0.1:${this.devProcessPort}`;
debug(`Proxying to frontend dev server: ${upstream}`);
// Add the Vercel platform proxy request headers
@@ -1810,7 +1810,7 @@ export default class DevServer {
return proxyPass(
req,
res,
`http://localhost:${port}`,
`http://127.0.0.1:${port}`,
this,
requestId,
false
@@ -1847,7 +1847,7 @@ export default class DevServer {
return proxyPass(
req,
res,
`http://localhost:${this.devProcessPort}`,
`http://127.0.0.1:${this.devProcessPort}`,
this,
requestId,
false

View File

@@ -1,6 +1,6 @@
{
"name": "@vercel/client",
"version": "10.2.3-canary.26",
"version": "10.2.3-canary.35",
"main": "dist/index.js",
"typings": "dist/index.d.ts",
"homepage": "https://vercel.com",
@@ -40,7 +40,7 @@
]
},
"dependencies": {
"@vercel/build-utils": "2.12.3-canary.25",
"@vercel/build-utils": "2.12.3-canary.34",
"@zeit/fetch": "5.2.0",
"async-retry": "1.2.3",
"async-sema": "3.0.0",

View File

@@ -0,0 +1,6 @@
<svg viewBox="0 0 800 800" fill="none" xmlns="http://www.w3.org/2000/svg">
<path d="M700 0H100C44.772 0 0 44.772 0 100v600c0 55.228 44.772 100 100 100h600c55.228 0 100-44.772 100-100V100C800 44.772 755.228 0 700 0Z" fill="#212121"/>
<path fill-rule="evenodd" clip-rule="evenodd" d="M587.947 527.768c4.254 54.65 4.254 80.268 4.254 108.232H465.756c0-6.091.109-11.663.219-17.313.342-17.564.699-35.88-2.147-72.868-3.761-54.152-27.08-66.185-69.957-66.185H195v-98.525h204.889c54.16 0 81.241-16.476 81.241-60.098 0-38.357-27.081-61.601-81.241-61.601H195V163h227.456C545.069 163 606 220.912 606 313.42c0 69.193-42.877 114.319-100.799 121.84 48.895 9.777 77.48 37.605 82.746 92.508Z" fill="#fff"/>
<path d="M195 636v-73.447h133.697c22.332 0 27.181 16.563 27.181 26.441V636H195Z" fill="#fff"/>
<path d="M194.5 636v.5h161.878v-47.506c0-5.006-1.226-11.734-5.315-17.224-4.108-5.515-11.059-9.717-22.366-9.717H194.5V636Z" stroke="#fff" stroke-opacity=".8"/>
</svg>


View File

@@ -1,6 +1,6 @@
{
"name": "@vercel/frameworks",
"version": "0.5.1-canary.15",
"version": "0.5.1-canary.16",
"main": "./dist/frameworks.js",
"types": "./dist/frameworks.d.ts",
"files": [

View File

@@ -1,6 +1,6 @@
{
"name": "vercel-plugin-middleware",
"version": "0.0.0-canary.7",
"version": "0.0.0-canary.10",
"license": "MIT",
"main": "./dist/index",
"homepage": "",
@@ -30,6 +30,7 @@
"@types/node-fetch": "^2",
"@types/ua-parser-js": "0.7.36",
"@types/uuid": "8.3.1",
"@vercel/build-utils": "2.12.3-canary.34",
"@vercel/ncc": "0.24.0",
"cookie": "0.4.1",
"formdata-node": "4.3.1",

View File

@@ -5,6 +5,7 @@ import { promises as fsp } from 'fs';
import { IncomingMessage, ServerResponse } from 'http';
import libGlob from 'glob';
import Proxy from 'http-proxy';
import { updateFunctionsManifest } from '@vercel/build-utils';
import { run } from './websandbox';
import type { FetchEventResult } from './websandbox/types';
@@ -73,26 +74,20 @@ export async function build({ workPath }: { workPath: string }) {
await fsp.unlink(entriesPath);
}
// Write middleware manifest
const middlewareManifest = {
version: 1,
sortedMiddleware: ['/'],
middleware: {
'/': {
env: [],
files: ['server/pages/_middleware.js'],
name: 'pages/_middleware',
page: '/',
regexp: '^/.*$',
},
},
const fileName = basename(middlewareFile);
const pages: { [key: string]: any } = {};
pages[fileName] = {
runtime: 'web',
env: [],
files: ['server/pages/_middleware.js'],
name: 'pages/_middleware',
page: '/',
regexp: '^/.*$',
sortingIndex: 1,
};
const middlewareManifestData = JSON.stringify(middlewareManifest, null, 2);
const middlewareManifestPath = join(
workPath,
'.output/server/middleware-manifest.json'
);
await fsp.writeFile(middlewareManifestPath, middlewareManifestData);
await updateFunctionsManifest({ workPath, pages });
}
const stringifyQuery = (req: IncomingMessage, query: ParsedUrlQuery) => {
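Combined with the updated snapshot further below, the `updateFunctionsManifest` call above yields a `functions-manifest.json` along these lines (a sketch of the expected shape, not captured output):

// Approximate shape of `.output/functions-manifest.json` after the middleware build step.
const expectedManifest = {
  version: 1,
  pages: {
    '_middleware.js': {
      runtime: 'web',
      env: [],
      files: ['server/pages/_middleware.js'],
      name: 'pages/_middleware',
      page: '/',
      regexp: '^/.*$',
      sortingIndex: 1,
    },
  },
};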

View File

@@ -2,8 +2,8 @@
exports[`build() should build simple middleware 1`] = `
Object {
"middleware": Object {
"/": Object {
"pages": Object {
"_middleware.js": Object {
"env": Array [],
"files": Array [
"server/pages/_middleware.js",
@@ -11,11 +11,10 @@ Object {
"name": "pages/_middleware",
"page": "/",
"regexp": "^/.*$",
"runtime": "web",
"sortingIndex": 1,
},
},
"sortedMiddleware": Array [
"/",
],
"version": 1,
}
`;

View File

@@ -22,7 +22,7 @@ describe('build()', () => {
const middlewareManifest = JSON.parse(
await fsp.readFile(
join(fixture, '.output/server/middleware-manifest.json'),
join(fixture, '.output/functions-manifest.json'),
'utf8'
)
);

View File

@@ -1,7 +1,7 @@
{
"private": false,
"name": "vercel-plugin-go",
"version": "1.0.0-canary.10",
"version": "1.0.0-canary.22",
"main": "dist/index.js",
"license": "MIT",
"files": [
@@ -17,7 +17,7 @@
"prepublishOnly": "tsc"
},
"dependencies": {
"@vercel/build-utils": "2.12.3-canary.25",
"@vercel/build-utils": "2.12.3-canary.34",
"@vercel/go": "1.2.4-canary.4"
},
"devDependencies": {

View File

@@ -1,6 +1,6 @@
import { convertRuntimeToPlugin } from '@vercel/build-utils';
import * as go from '@vercel/go';
export const build = convertRuntimeToPlugin(go.build, '.go');
export const build = convertRuntimeToPlugin(go.build, 'vercel-plugin-go', '.go');
export const startDevServer = go.startDevServer;

View File

@@ -1,6 +1,6 @@
{
"name": "vercel-plugin-node",
"version": "1.12.2-canary.16",
"version": "1.12.2-canary.26",
"license": "MIT",
"main": "./dist/index",
"homepage": "https://vercel.com/docs/runtimes#official-runtimes/node-js",
@@ -34,12 +34,12 @@
"@types/node-fetch": "2",
"@types/test-listen": "1.1.0",
"@types/yazl": "2.4.2",
"@vercel/build-utils": "2.12.3-canary.25",
"@vercel/build-utils": "2.12.3-canary.34",
"@vercel/fun": "1.0.3",
"@vercel/ncc": "0.24.0",
"@vercel/nft": "0.14.0",
"@vercel/node-bridge": "2.1.1-canary.2",
"@vercel/static-config": "0.0.1-canary.0",
"@vercel/static-config": "0.0.1-canary.1",
"abort-controller": "3.0.0",
"content-type": "1.0.4",
"cookie": "0.4.0",

View File

@@ -40,6 +40,7 @@ import {
walkParentDirs,
normalizePath,
runPackageJsonScript,
getInputHash,
} from '@vercel/build-utils';
import { FromSchema } from 'json-schema-to-ts';
import { getConfig, BaseFunctionConfigSchema } from '@vercel/static-config';
@@ -47,8 +48,6 @@ import { AbortController } from 'abort-controller';
import { Register, register } from './typescript';
import { pageToRoute } from './router/page-to-route';
import { isDynamicRoute } from './router/is-dynamic';
import crypto from 'crypto';
import type { VercelConfig } from '@vercel/client';
export { shouldServe };
export {
@@ -380,13 +379,7 @@ function getAWSLambdaHandler(entrypoint: string, config: FunctionConfig) {
}
// TODO NATE: turn this into a `@vercel/plugin-utils` helper function?
export async function build({
vercelConfig,
workPath,
}: {
vercelConfig: VercelConfig;
workPath: string;
}) {
export async function build({ workPath }: { workPath: string }) {
const project = new Project();
const entrypoints = await glob('api/**/*.[jt]s', workPath);
const installedPaths = new Set<string>();
@@ -408,14 +401,13 @@ export async function build({
getConfig(project, absEntrypoint, FunctionConfigSchema) || {};
// No config exported means "node", but if there is a config
// and "runtime" is defined, but it is not "node" then don't
// and "use" is defined, but it is not "node" then don't
// compile this file.
if (config.runtime && config.runtime !== 'node') {
if (config.use && config.use !== 'node') {
continue;
}
await buildEntrypoint({
vercelConfig,
workPath,
entrypoint,
config,
@@ -425,23 +417,18 @@ export async function build({
}
export async function buildEntrypoint({
vercelConfig,
workPath,
entrypoint,
config,
installedPaths,
}: {
vercelConfig: VercelConfig;
workPath: string;
entrypoint: string;
config: FunctionConfig;
installedPaths?: Set<string>;
}) {
// Unique hash that will be used as directory name for `.output`.
const entrypointHash = crypto
.createHash('sha256')
.update(entrypoint)
.digest('hex');
const entrypointHash = 'api-routes-node-' + getInputHash(entrypoint);
const outputDirPath = join(workPath, '.output');
const { dir, name } = parsePath(entrypoint);
@@ -561,7 +548,7 @@ export async function buildEntrypoint({
runtime: nodeVersion.runtime,
},
};
await updateFunctionsManifest({ vercelConfig, workPath, pages });
await updateFunctionsManifest({ workPath, pages });
// Update the `routes-manifest.json` file with the wildcard route
// when the entrypoint is dynamic (i.e. `/api/[id].ts`).
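From the user's side, the renamed key would be declared in the route file itself; a hedged sketch, assuming the static `config` export that `@vercel/static-config` parses (the file name and handler are hypothetical):

// api/hello.ts
import type { IncomingMessage, ServerResponse } from 'http';

// Formerly `{ runtime: 'node' }`; after this rename the key is `use`.
// Any value other than 'node' makes vercel-plugin-node skip compiling this file.
export const config = {
  use: 'node',
};

export default function handler(_req: IncomingMessage, res: ServerResponse) {
  res.end('hello');
}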

View File

@@ -143,16 +143,7 @@ function withFixture<T>(
await runNpmInstall(fixture);
}
let vercelConfig = {};
try {
vercelConfig = JSON.parse(
await fsp.readFile(path.join(fixture, 'vercel.json'), 'utf8')
);
} catch (e) {
// Consume error
}
await build({ vercelConfig, workPath: fixture });
await build({ workPath: fixture });
try {
return await t({ fixture, fetch });

View File

@@ -1,7 +1,7 @@
{
"private": false,
"name": "vercel-plugin-python",
"version": "1.0.0-canary.11",
"version": "1.0.0-canary.23",
"main": "dist/index.js",
"license": "MIT",
"files": [
@@ -17,8 +17,8 @@
"prepublishOnly": "tsc"
},
"dependencies": {
"@vercel/build-utils": "2.12.3-canary.25",
"@vercel/python": "2.1.2-canary.0"
"@vercel/build-utils": "2.12.3-canary.34",
"@vercel/python": "2.1.2-canary.1"
},
"devDependencies": {
"@types/node": "*",

View File

@@ -1,6 +1,6 @@
import { convertRuntimeToPlugin } from '@vercel/build-utils';
import * as python from '@vercel/python';
export const build = convertRuntimeToPlugin(python.build, '.py');
export const build = convertRuntimeToPlugin(python.build, 'vercel-plugin-python', '.py');
//export const startDevServer = python.startDevServer;
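The plugin entrypoints now pass their own package name as the second argument to `convertRuntimeToPlugin`, as the Go and Ruby plugins do elsewhere in this diff. A minimal sketch of the call pattern for some other runtime, based only on the three-argument signature visible here ('@vercel/example-runtime', 'vercel-plugin-example', and '.ex' are hypothetical placeholders):

  // index.ts of a hypothetical CLI plugin package
  import { convertRuntimeToPlugin } from '@vercel/build-utils';
  import * as example from '@vercel/example-runtime';

  // convertRuntimeToPlugin(legacyBuild, pluginPackageName, fileExtension) wires
  // a legacy Runtime's build() into the plugin interface used by `vercel build`.
  export const build = convertRuntimeToPlugin(
    example.build,
    'vercel-plugin-example',
    '.ex'
  );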

View File

@@ -1,7 +1,7 @@
{
"private": false,
"name": "vercel-plugin-ruby",
"version": "1.0.0-canary.9",
"version": "1.0.0-canary.21",
"main": "dist/index.js",
"license": "MIT",
"files": [
@@ -17,8 +17,8 @@
"prepublishOnly": "tsc"
},
"dependencies": {
"@vercel/build-utils": "2.12.3-canary.25",
"@vercel/ruby": "1.2.8-canary.4"
"@vercel/build-utils": "2.12.3-canary.34",
"@vercel/ruby": "1.2.8-canary.6"
},
"devDependencies": {
"@types/node": "*",

View File

@@ -1,6 +1,6 @@
import { convertRuntimeToPlugin } from '@vercel/build-utils';
import * as ruby from '@vercel/ruby';
export const build = convertRuntimeToPlugin(ruby.build, '.rb');
export const build = convertRuntimeToPlugin(ruby.build, 'vercel-plugin-ruby', '.rb');
//export const startDevServer = ruby.startDevServer;

View File

@@ -1,6 +1,6 @@
{
"name": "@vercel/python",
"version": "2.1.2-canary.0",
"version": "2.1.2-canary.1",
"main": "./dist/index.js",
"license": "MIT",
"homepage": "https://vercel.com/docs/runtimes#official-runtimes/python",

View File

@@ -1,3 +1,4 @@
import { relative, basename } from 'path';
import execa from 'execa';
import { Meta, debug } from '@vercel/build-utils';
@@ -135,6 +136,18 @@ export async function installRequirementsFile({
meta,
args = [],
}: InstallRequirementsFileArg) {
const fileAtRoot = relative(workPath, filePath) === basename(filePath);
// If the `requirements.txt` file is located in the Root Directory of the project and
// the new File System API is used (`avoidTopLevelInstall`), the Install Command
// will have already installed its dependencies, so we don't need to do it again.
if (meta.avoidTopLevelInstall && fileAtRoot) {
debug(
`Skipping requirements file installation, already installed by Install Command`
);
return;
}
if (
meta.isDev &&
(await areRequirementsInstalled(pythonPath, filePath, workPath))
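The new root-directory check above rests on a simple path identity: a file sits in the project's Root Directory exactly when its path relative to `workPath` is just its base name. A short illustration (the paths are hypothetical):

  import { relative, basename } from 'path';

  const workPath = '/vercel/workpath0';
  const atRoot = '/vercel/workpath0/requirements.txt';
  const nested = '/vercel/workpath0/api/requirements.txt';

  // true  -> Install Command already handled it, so pip install is skipped
  console.log(relative(workPath, atRoot) === basename(atRoot));
  // false -> nested requirements file, installed as before
  console.log(relative(workPath, nested) === basename(nested));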

View File

@@ -1,4 +1,4 @@
import { join, dirname } from 'path';
import { join, dirname, relative } from 'path';
import execa from 'execa';
import {
ensureDir,
@@ -85,10 +85,12 @@ export async function build({
}: BuildOptions) {
await download(files, workPath, meta);
const entrypointFsDirname = join(workPath, dirname(entrypoint));
const gemfileName = 'Gemfile';
const gemfilePath = await walkParentDirs({
base: workPath,
start: entrypointFsDirname,
filename: 'Gemfile',
filename: gemfileName,
});
const gemfileContents = gemfilePath
? await readFile(gemfilePath, 'utf8')
@@ -130,15 +132,24 @@ export async function build({
'did not find a vendor directory but found a Gemfile, bundling gems...'
);
// try installing. this won't work if native extensions are required.
// if that's the case, gems should be vendored locally before deploying.
try {
await bundleInstall(bundlerPath, bundleDir, gemfilePath);
} catch (err) {
debug(
'unable to build gems from Gemfile. vendor the gems locally with "bundle install --deployment" and retry.'
);
throw err;
const fileAtRoot = relative(workPath, gemfilePath) === gemfileName;
// If the `Gemfile` is located in the Root Directory of the project and
// the new File System API is used (`avoidTopLevelInstall`), the Install Command
// will have already installed its dependencies, so we don't need to do it again.
if (meta.avoidTopLevelInstall && fileAtRoot) {
debug('Skipping `bundle install` — already handled by Install Command');
} else {
// try installing. this won't work if native extensions are required.
// if that's the case, gems should be vendored locally before deploying.
try {
await bundleInstall(bundlerPath, bundleDir, gemfilePath);
} catch (err) {
debug(
'unable to build gems from Gemfile. vendor the gems locally with "bundle install --deployment" and retry.'
);
throw err;
}
}
}
} else {
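The Ruby builder now applies the same guard as the Python one above: when `avoidTopLevelInstall` is set and the `Gemfile` lives in the Root Directory, `bundle install` is skipped because the Install Command already handled dependencies. A hedged sketch of that shared pattern as a standalone helper, assuming `Meta` exposes `avoidTopLevelInstall` as used in these hunks (the helper itself is hypothetical; neither runtime extracts one):

  import { relative, basename } from 'path';
  import type { Meta } from '@vercel/build-utils';

  // Skip a per-runtime install step when the new File System API is in use and
  // the dependency manifest (requirements.txt, Gemfile, ...) sits at the root.
  export function shouldSkipTopLevelInstall(
    meta: Meta,
    workPath: string,
    manifestPath: string
  ): boolean {
    const fileAtRoot = relative(workPath, manifestPath) === basename(manifestPath);
    return Boolean(meta.avoidTopLevelInstall) && fileAtRoot;
  }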

View File

@@ -66,6 +66,23 @@ export async function installBundler(meta: Meta, gemfileContents: string) {
gemfileContents
);
// If the new File System API is used (`avoidTopLevelInstall`), the Install Command
// will have already installed the dependencies, so we don't need to do it again.
if (meta.avoidTopLevelInstall) {
debug(
`Skipping bundler installation, already installed by Install Command`
);
return {
gemHome,
rubyPath,
gemPath,
vendorPath,
runtime,
bundlerPath: join(gemHome, 'bin', 'bundler'),
};
}
debug('installing bundler...');
await execa(gemPath, ['install', 'bundler', '--no-document'], {
stdio: 'pipe',

View File

@@ -1,7 +1,7 @@
{
"name": "@vercel/ruby",
"author": "Nathan Cahill <nathan@nathancahill.com>",
"version": "1.2.8-canary.4",
"version": "1.2.8-canary.6",
"license": "MIT",
"main": "./dist/index",
"homepage": "https://vercel.com/docs/runtimes#official-runtimes/ruby",

View File

@@ -0,0 +1,2 @@
---
BUNDLE_PATH: "vendor/bundle"

View File

@@ -2,6 +2,6 @@
source "https://rubygems.org"
ruby "~> 2.5.0"
ruby "~> 2.7.0"
gem "cowsay", "~> 0.3.0"

View File

@@ -0,0 +1,16 @@
GEM
remote: https://rubygems.org/
specs:
cowsay (0.3.0)
PLATFORMS
x86_64-darwin-21
DEPENDENCIES
cowsay (~> 0.3.0)
RUBY VERSION
ruby 2.7.5p203
BUNDLED WITH
2.2.22

View File

@@ -1,6 +1,6 @@
{
"version": 2,
"builds": [{ "src": "index.rb", "use": "@vercel/ruby" }],
"build": { "env": { "RUBY_VERSION": "2.5.x" } },
"build": { "env": { "RUBY_VERSION": "2.7.x" } },
"probes": [{ "path": "/", "mustContain": "gem:RANDOMNESS_PLACEHOLDER" }]
}

View File

@@ -14,19 +14,17 @@ Gem::Specification.new do |s|
s.executables = ["cowsay".freeze]
s.files = ["bin/cowsay".freeze]
s.homepage = "https://github.com/moneydesktop/cowsay".freeze
s.rubygems_version = "3.0.3".freeze
s.rubygems_version = "3.2.22".freeze
s.summary = "ASCII art avatars emote your messages".freeze
s.installed_by_version = "3.0.3" if s.respond_to? :installed_by_version
s.installed_by_version = "3.2.22" if s.respond_to? :installed_by_version
if s.respond_to? :specification_version then
s.specification_version = 4
end
if Gem::Version.new(Gem::VERSION) >= Gem::Version.new('1.2.0') then
s.add_development_dependency(%q<rake>.freeze, [">= 0"])
else
s.add_dependency(%q<rake>.freeze, [">= 0"])
end
if s.respond_to? :add_runtime_dependency then
s.add_development_dependency(%q<rake>.freeze, [">= 0"])
else
s.add_dependency(%q<rake>.freeze, [">= 0"])
end

View File

@@ -0,0 +1,2 @@
---
BUNDLE_PATH: "vendor/bundle"

View File

@@ -1,7 +1,7 @@
{
"version": 2,
"builds": [{ "src": "project/index.rb", "use": "@vercel/ruby" }],
"build": { "env": { "RUBY_VERSION": "2.5.x" } },
"build": { "env": { "RUBY_VERSION": "2.7.x" } },
"probes": [
{ "path": "/project/", "mustContain": "gem:RANDOMNESS_PLACEHOLDER" }
]

View File

@@ -0,0 +1,2 @@
---
BUNDLE_PATH: "vendor/bundle"

View File

@@ -2,6 +2,6 @@
source "https://rubygems.org"
ruby "~> 2.5.0"
ruby "~> 2.7.0"
gem "cowsay", "~> 0.3.0"

View File

@@ -0,0 +1,16 @@
GEM
remote: https://rubygems.org/
specs:
cowsay (0.3.0)
PLATFORMS
x86_64-darwin-21
DEPENDENCIES
cowsay (~> 0.3.0)
RUBY VERSION
ruby 2.7.5p203
BUNDLED WITH
2.2.22

Some files were not shown because too many files have changed in this diff.