Compare commits

..

25 Commits

Author SHA1 Message Date
Steven
01ea7b4b2a Publish Canary
- @vercel/build-utils@2.12.3-canary.43
 - vercel@23.1.3-canary.68
 - @vercel/client@10.2.3-canary.46
 - vercel-plugin-middleware@0.0.0-canary.20
 - vercel-plugin-go@1.0.0-canary.31
 - vercel-plugin-node@1.12.2-canary.35
 - vercel-plugin-python@1.0.0-canary.32
 - vercel-plugin-ruby@1.0.0-canary.31
2021-12-09 19:56:30 -05:00
Steven
475a227ba9 [build-utils][cli] Detect file system api usage, abort on "Exclusion Conditions" (#7158) 2021-12-09 19:54:39 -05:00
Nathan Rajlich
40ca92f2e9 [client] Use a Set to prevent duplicates in hashes() function (#7159)
Follow-up to #7150 to use a Set instead of assuming the conflicting entry is at index 0, which is not always the case.
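For illustration only (hypothetical shape, not the actual `@vercel/client` implementation), the pattern amounts to letting a `Set` absorb duplicate names for a given hash rather than checking a fixed array index:

```ts
// Illustrative sketch: track the file paths that share a content hash in a
// Set so duplicates are dropped automatically.
interface HashEntry {
  names: Set<string>; // paths whose contents hash to the same SHA1
}

function addName(hashes: Map<string, HashEntry>, sha: string, name: string): void {
  const entry = hashes.get(sha) ?? { names: new Set<string>() };
  entry.names.add(name); // repeated additions of the same path are no-ops
  hashes.set(sha, entry);
}
```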
2021-12-09 03:10:48 +00:00
Nathan Rajlich
465129e62e [cli] Remove support for single file deployments (#6652)
Deploying a single file has printed a deprecation warning for a long time. Let's finally remove that behavior.
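A minimal sketch of the resulting behavior (hypothetical helper, not the actual CLI code):

```ts
import { lstatSync } from 'fs';

// Sketch: refuse to deploy when the given path is a file rather than a directory.
function assertDeployableDirectory(path: string): void {
  if (lstatSync(path).isFile()) {
    throw new Error(
      'Deploying a single file is no longer supported; point the command at a directory instead.'
    );
  }
}
```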
2021-12-08 19:45:06 +00:00
Steven
bf0d5a7f29 [cli] Add @vercel/client as a devDependency (#7154) 2021-12-08 10:17:20 -05:00
Leo Lamprecht
d3ef240f6e Publish Canary
- @vercel/build-utils@2.12.3-canary.42
 - vercel@23.1.3-canary.67
 - @vercel/client@10.2.3-canary.45
 - vercel-plugin-middleware@0.0.0-canary.19
 - vercel-plugin-go@1.0.0-canary.30
 - vercel-plugin-node@1.12.2-canary.34
 - vercel-plugin-python@1.0.0-canary.31
 - vercel-plugin-ruby@1.0.0-canary.30
 - @vercel/python@2.1.2-canary.2
2021-12-08 15:53:14 +01:00
Leo Lamprecht
5b26ebc7b8 Make Python CLI Plugin work (#7155) 2021-12-08 15:52:43 +01:00
Leo Lamprecht
3427ad6ce0 Publish Canary
- vercel@23.1.3-canary.66
2021-12-08 12:50:58 +01:00
Leo Lamprecht
4ab5e4326b Improved Vercel CLI link (#7151) 2021-12-08 12:50:27 +01:00
Leo Lamprecht
d24a3ce3ab Publish Canary
- @vercel/client@10.2.3-canary.44
2021-12-08 12:10:44 +01:00
Steven
29a44db8d9 [client] Fix duplicate files when analyzing nft.json (#7150)
This PR fixes a regression from #7144 where a duplicate file was added if `nft.json` referenced an existing file.

I also refactored the prepared-files logic to avoid cloning the array on every loop iteration.
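A rough sketch of that refactor (hypothetical names, not the real client code): append to one shared array instead of producing a `[...prepared, file]` clone on every iteration, which also avoids re-adding files that are already present.

```ts
// Illustrative sketch: extend the prepared-files list in place.
function addTracedFiles(prepared: string[], traced: string[]): string[] {
  for (const file of traced) {
    if (!prepared.includes(file)) {
      prepared.push(file); // mutate in place; no per-iteration copy
    }
  }
  return prepared;
}
```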

Reviewing [without whitespace](https://github.com/vercel/vercel/pull/7150/files?diff=split&w=1) will make it easier to understand.

- Related to https://github.com/vercel/runtimes/issues/304
2021-12-08 01:40:49 +00:00
Steven
695f3a9212 Publish Canary
- vercel@23.1.3-canary.65
 - @vercel/client@10.2.3-canary.43
 - vercel-plugin-middleware@0.0.0-canary.18
2021-12-07 18:25:02 -05:00
Steven
3ff777b8ed [client] Resolve .nft.json files when vc deploy --prebuilt (#7144)
This ensures that `vc deploy --prebuilt` also uploads any files that `.output/**/.nft.json` points to, and that the Root Directory is handled correctly, since `vc build` emits `rootdir/.output`.
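A rough sketch of the idea (hypothetical helper, not the actual client code): read each `.nft.json` under `.output`, resolve its `files` entries relative to the trace file, and add them to the set of files to upload.

```ts
import { promises as fs } from 'fs';
import { dirname, resolve } from 'path';

// Sketch: collect the extra files referenced by .nft.json traces so they can
// be uploaded alongside the prebuilt `.output` directory.
async function collectNftReferences(nftPaths: string[]): Promise<Set<string>> {
  const referenced = new Set<string>();
  for (const nftPath of nftPaths) {
    const { files = [] } = JSON.parse(await fs.readFile(nftPath, 'utf8')) as {
      files?: (string | { input: string })[];
    };
    for (const entry of files) {
      const rel = typeof entry === 'string' ? entry : entry.input;
      referenced.add(resolve(dirname(nftPath), rel)); // entries are relative to the .nft.json
    }
  }
  return referenced;
}
```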

- Related to https://github.com/vercel/runtimes/issues/304.
2021-12-07 18:17:58 -05:00
Tommaso De Rossi
d94b9806ab [middleware] Define env vars when building _middleware.js with esbuild (#7087)
### Related Issues

> Fixes #7086
> Related to #7086

### 📋 Checklist

<!--
  Please keep your PR as a Draft until the checklist is complete
-->

#### Tests

- [x] The code changed/added as part of this PR has been covered with tests
- [x] All tests pass locally with `yarn test-unit`

#### Code Review

- [x] This PR has a concise title and thorough description useful to a reviewer
- [ ] Issue from task tracker has a link to this PR
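For reference, the underlying esbuild feature looks roughly like this (illustrative values only, not the exact plugin code): environment variables are inlined into the bundled `_middleware.js` through esbuild's `define` option.

```ts
import { build } from 'esbuild';

// Sketch: replace `process.env.X` references with literal values at bundle time.
async function bundleMiddleware(entry: string, outfile: string): Promise<void> {
  const define: Record<string, string> = {};
  for (const [key, value] of Object.entries(process.env)) {
    if (value !== undefined) {
      define[`process.env.${key}`] = JSON.stringify(value);
    }
  }
  await build({ entryPoints: [entry], bundle: true, outfile, define });
}
```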
2021-12-07 22:16:46 +00:00
Leo Lamprecht
35c8fc2729 Publish Canary
- @vercel/build-utils@2.12.3-canary.41
 - vercel@23.1.3-canary.64
 - @vercel/client@10.2.3-canary.42
 - vercel-plugin-middleware@0.0.0-canary.17
 - vercel-plugin-go@1.0.0-canary.29
 - vercel-plugin-node@1.12.2-canary.33
 - vercel-plugin-python@1.0.0-canary.30
 - vercel-plugin-ruby@1.0.0-canary.29
2021-12-07 21:13:41 +01:00
Leo Lamprecht
0a468fd6d7 Correctly clean up files for CLI Plugins (#7149)
* Correctly clean up files for CLI Plugins

* Cleaned up the code
2021-12-07 21:13:29 +01:00
Leo Lamprecht
d31ebbabe4 Publish Canary
- @vercel/build-utils@2.12.3-canary.40
 - vercel@23.1.3-canary.63
 - @vercel/client@10.2.3-canary.41
 - vercel-plugin-middleware@0.0.0-canary.16
 - vercel-plugin-go@1.0.0-canary.28
 - vercel-plugin-node@1.12.2-canary.32
 - vercel-plugin-python@1.0.0-canary.29
 - vercel-plugin-ruby@1.0.0-canary.28
2021-12-07 17:46:08 +01:00
Leo Lamprecht
09c9b71adb Adjust import statements inside Runtime launchers (#7148)
* Added basic logic

* Polished basic logic

* Made logic actually replace content

* Perfected the logic

* Added comment

* Simplified logic

* Added another comment

* Added debug log

* More detailed debug log

* Update packages/build-utils/src/convert-runtime-to-plugin.ts

Co-authored-by: Andy <AndyBitz@users.noreply.github.com>

* Update packages/build-utils/src/convert-runtime-to-plugin.ts

Co-authored-by: Steven <steven@ceriously.com>

* Simpler logic

Co-authored-by: Andy <AndyBitz@users.noreply.github.com>
Co-authored-by: Steven <steven@ceriously.com>
2021-12-07 17:44:59 +01:00
Leo Lamprecht
5975db4d66 Fixed middleware tests (#7146) 2021-12-07 01:10:58 +01:00
Leo Lamprecht
2c86ac654c Publish Canary
- @vercel/build-utils@2.12.3-canary.39
 - vercel@23.1.3-canary.62
 - @vercel/client@10.2.3-canary.40
 - vercel-plugin-middleware@0.0.0-canary.15
 - vercel-plugin-go@1.0.0-canary.27
 - vercel-plugin-node@1.12.2-canary.31
 - vercel-plugin-python@1.0.0-canary.28
 - vercel-plugin-ruby@1.0.0-canary.27
2021-12-06 23:37:05 +01:00
Leo Lamprecht
ca5f066eb9 Simplify NFT output logic for CLI and CLI Plugins (#7143)
* Simplify NFT output logic for CLI Plugins

* Made tests pass

* Remove useless logic from Vercel CLI

* Update packages/build-utils/src/convert-runtime-to-plugin.ts

Co-authored-by: Steven <steven@ceriously.com>

* Simplified CLI code

* Removed useless file

Co-authored-by: Steven <steven@ceriously.com>
2021-12-06 23:35:07 +01:00
Leo Lamprecht
410ef86102 Support nested API Routes and fix handler for CLI Plugins (#7141)
We have identified that the `handler` for Lambdas does not support a dot-preceded path for Ruby, Python, and probably other languages, so we're adjusting the File System API to change `.output` inside the Lambda to something else, which requires version `2` of `functions-manifest.json`.

Furthermore, we're also bumping the `.nft.json` files to version `2`, which allows `output` to be relative to the NFT file itself, so that, inside the Lambda, the behavior mentioned at the top can be applied by the File System API.

As a nice side effect, this will also support nested API Routes, because it'll place all the dependencies next to every API Route, meaning that the launcher will have access to all of them (bundling multiple API Routes or Pages into the same Lambda currently doesn't work for non-Next.js anyway, because of https://github.com/vercel/runtimes/issues/305).
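To illustrate the version `2` trace format described above (paths taken from the test fixtures in this compare, written here with a hypothetical helper), entries are listed relative to the `.nft.json` file itself:

```ts
import { promises as fs } from 'fs';

// Sketch: a version 2 .nft.json placed next to the launcher lists its traced
// files relative to the trace file, so the File System API can relocate them
// inside the Lambda.
async function writeTrace(): Promise<void> {
  const trace = {
    version: 2,
    files: [
      '../../../../../api/users/get.py', // source of the API Route
      '../../../../../util/date.py',     // a shared dependency
    ],
  };
  await fs.writeFile(
    '.output/server/pages/api/users/get.py.nft.json',
    JSON.stringify(trace)
  );
}
```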

#### Tests

- [x] The code changed/added as part of this PR has been covered with tests
- [x] All tests pass locally with `yarn test-unit`

#### Code Review

- [x] This PR has a concise title and thorough description useful to a reviewer
- [x] Issue from task tracker has a link to this PR
2021-12-06 20:53:07 +00:00
Steven
6792edf32a Publish Canary
- @vercel/build-utils@2.12.3-canary.38
 - vercel@23.1.3-canary.61
 - @vercel/client@10.2.3-canary.39
 - @vercel/frameworks@0.5.1-canary.17
 - vercel-plugin-middleware@0.0.0-canary.14
 - vercel-plugin-go@1.0.0-canary.26
 - vercel-plugin-node@1.12.2-canary.30
 - vercel-plugin-python@1.0.0-canary.27
 - vercel-plugin-ruby@1.0.0-canary.26
2021-12-06 14:36:06 -05:00
Steven
67de167a7e [frameworks][cli] Remove duplicate getFsOutputDir() definitions (#7124) 2021-12-06 14:34:30 -05:00
Leo Lamprecht
0c5c05d90b Prevent CLI Plugins from overwriting files (#7140)
* Prevent CLI Plugins from overwriting files

* Revert "Temporarily remove CLI Plugin linking (#7138)"

This reverts commit d6a5aa4f6d.
2021-12-06 17:22:55 +01:00
62 changed files with 1356 additions and 756 deletions

View File

@@ -0,0 +1,9 @@
# No Single File Deployments
#### Why This Error Occurred
You attempted to create a Vercel deployment where the input is a file rather than a directory. Previously, this was allowed; however, this behavior has been removed as of Vercel CLI v24.0.0 because it posed a potential security risk if the user accidentally created a deployment from a sensitive file.
#### Possible Ways to Fix It
- Run the `vercel deploy` command against a directory, instead of a file.

View File

@@ -1,6 +1,6 @@
{
"name": "@vercel/build-utils",
"version": "2.12.3-canary.37",
"version": "2.12.3-canary.43",
"license": "MIT",
"main": "./dist/index.js",
"types": "./dist/index.d.js",
@@ -30,7 +30,7 @@
"@types/node-fetch": "^2.1.6",
"@types/semver": "6.0.0",
"@types/yazl": "^2.4.1",
"@vercel/frameworks": "0.5.1-canary.16",
"@vercel/frameworks": "0.5.1-canary.17",
"@vercel/ncc": "0.24.0",
"aggregate-error": "3.0.1",
"async-retry": "1.2.3",

View File

@@ -3,7 +3,6 @@ import { join, parse, relative, dirname, basename, extname } from 'path';
import glob from './fs/glob';
import { normalizePath } from './fs/normalize-path';
import { FILES_SYMBOL, Lambda } from './lambda';
import type FileBlob from './file-blob';
import type { BuildOptions, Files } from './types';
import { debug, getIgnoreFilter } from '.';
@@ -87,10 +86,10 @@ export function convertRuntimeToPlugin(
const pages: { [key: string]: any } = {};
const pluginName = packageName.replace('vercel-plugin-', '');
const outputPath = join(workPath, '.output');
const traceDir = join(
workPath,
`.output`,
outputPath,
`inputs`,
// Legacy Runtimes can only provide API Routes, so that's
// why we can use this prefix for all of them. Here, we have to
@@ -101,11 +100,7 @@ export function convertRuntimeToPlugin(
await fs.ensureDir(traceDir);
let newPathsRuntime: Set<string> = new Set();
let linkersRuntime: Array<Promise<void>> = [];
const entryDir = join('.output', 'server', 'pages');
const entryRoot = join(workPath, entryDir);
const entryRoot = join(outputPath, 'server', 'pages');
for (const entrypoint of Object.keys(entrypoints)) {
const { output } = await buildRuntime({
@@ -117,20 +112,10 @@ export function convertRuntimeToPlugin(
},
meta: {
avoidTopLevelInstall: true,
skipDownload: true,
},
});
// Legacy Runtimes tend to pollute the `workPath` with compiled results,
// because the `workPath` used to be a place where they could
// just put anything, but nowadays it's the working directory of the `vercel build`
// command, which is the place where the developer keeps their source files,
// so we don't want to pollute this space unnecessarily. That means we have to clean
// up files that were created by the build, which is done further below.
const sourceFilesAfterBuild = await getSourceFiles(
workPath,
ignoreFilter
);
// @ts-ignore This symbol is a private API
const lambdaFiles: Files = output[FILES_SYMBOL];
@@ -146,6 +131,7 @@ export function convertRuntimeToPlugin(
let handlerFileBase = output.handler;
let handlerFile = lambdaFiles[handlerFileBase];
let handlerHasImport = false;
const { handler } = output;
const handlerMethod = handler.split('.').pop();
@@ -159,6 +145,7 @@ export function convertRuntimeToPlugin(
if (!handlerFile) {
handlerFileBase = handlerFileName + ext;
handlerFile = lambdaFiles[handlerFileBase];
handlerHasImport = true;
}
if (!handlerFile || !handlerFile.fsPath) {
@@ -173,143 +160,115 @@ export function convertRuntimeToPlugin(
const entryPath = join(dirname(entrypoint), entryBase);
const entry = join(entryRoot, entryPath);
// We never want to link here, only copy, because the launcher
// file often has the same name for every entrypoint, which means that
// every build for every entrypoint overwrites the launcher of the previous
// one, so linking would end with a broken reference.
// Create the parent directory of the API Route that will be created
// for the current entrypoint inside of `.output/server/pages/api`.
await fs.ensureDir(dirname(entry));
await fs.copy(handlerFile.fsPath, entry);
const newFilesEntrypoint: Array<string> = [];
const newDirectoriesEntrypoint: Array<string> = [];
// For compiled languages, the launcher file will be binary and therefore
// won't try to import a user-provided request handler (instead, it will
// contain it). But for interpreted languages, the launcher might try to
// load a user-provided request handler from the source file instead of bundling
// it, so we have to adjust the import statement inside the launcher to point
// to the respective source file. Previously, Legacy Runtimes simply expected
// the user-provided request-handler to be copied right next to the launcher,
// but with the new File System API, files won't be moved around unnecessarily.
if (handlerHasImport) {
const { fsPath } = handlerFile;
const encoding = 'utf-8';
const preBuildFiles = Object.values(sourceFilesPreBuild).map(file => {
return file.fsPath;
});
// This is the true directory of the user-provided request handler in the
// source files, so that's what we will use as an import path in the launcher.
const locationPrefix = relative(entry, outputPath);
// Generate a list of directories and files that weren't present
// before the entrypoint was processed by the Legacy Runtime, so
// that we can perform a cleanup later. We need to divide into files
// and directories because only cleaning up files might leave empty
// directories, and listing directories separately also speeds up the
// build because we can just delete them, which wipes all of their nested
// paths, instead of iterating through all files that should be deleted.
for (const file in sourceFilesAfterBuild) {
if (!sourceFilesPreBuild[file]) {
const path = sourceFilesAfterBuild[file].fsPath;
const dirPath = dirname(path);
let handlerContent = await fs.readFile(fsPath, encoding);
// If none of the files that were present before the entrypoint
// was processed are contained within the directory we're looking
// at right now, then we know it's a newly added directory
// and it can therefore be removed later on.
const isNewDir = !preBuildFiles.some(filePath => {
return dirname(filePath).startsWith(dirPath);
});
const importPaths = [
// This is the full entrypoint path, like `./api/test.py`. In our tests
// Python didn't support importing from a parent directory without using different
// code in the launcher that registers it as a location for modules and then changing
// the importing syntax, but continuing to import it like before seems to work. If
// other languages need this, we should consider excluding Python explicitly.
// `./${entrypoint}`,
// Check out the list of tracked directories that were
// newly added and see if one of them contains the path
// we're looking at.
const hasParentDir = newDirectoriesEntrypoint.some(dir => {
return path.startsWith(dir);
});
// This is the entrypoint path without extension, like `api/test`
entrypoint.slice(0, -ext.length),
];
// If we have already tracked a directory that was newly
// added that sits above the file or directory that we're
// looking at, we don't need to add more entries to the list
// because when the parent will get removed in the future,
// all of its children (and therefore the path we're looking at)
// will automatically get removed anyways.
if (hasParentDir) {
continue;
}
// Generate a list of regular expressions that we can use for
// finding matches, but only allow matches if the import path is
// wrapped inside single (') or double quotes (").
const patterns = importPaths.map(path => {
// eslint-disable-next-line no-useless-escape
return new RegExp(`('|")(${path.replace(/\./g, '\\.')})('|")`, 'g');
});
if (isNewDir) {
newDirectoriesEntrypoint.push(dirPath);
} else {
newFilesEntrypoint.push(path);
}
}
}
let replacedMatch = null;
const tracedFiles: {
absolutePath: string;
relativePath: string;
}[] = [];
for (const pattern of patterns) {
const newContent = handlerContent.replace(
pattern,
(_, p1, p2, p3) => {
return `${p1}${join(locationPrefix, p2)}${p3}`;
}
);
const linkers = Object.entries(lambdaFiles).map(
async ([relPath, file]) => {
const newPath = join(traceDir, relPath);
// The handler was already moved into position above.
if (relPath === handlerFileBase) {
return;
}
tracedFiles.push({ absolutePath: newPath, relativePath: relPath });
const { fsPath, type } = file;
if (fsPath) {
await fs.ensureDir(dirname(newPath));
const isNewFile = newFilesEntrypoint.includes(fsPath);
const isInsideNewDirectory = newDirectoriesEntrypoint.some(
dirPath => {
return fsPath.startsWith(dirPath);
}
if (newContent !== handlerContent) {
debug(
`Replaced "${pattern}" inside "${entry}" to ensure correct import of user-provided request handler`
);
// With this, we're making sure that files in the `workPath` that existed
// before the Legacy Runtime was invoked (source files) are linked from
// `.output` instead of copying there (the latter only happens if linking fails),
// which is the fastest solution. However, files that are created fresh
// by the Legacy Runtimes are always copied, because their link destinations
// are likely to be overwritten every time an entrypoint is processed by
// the Legacy Runtime. This is likely to overwrite the destination on subsequent
// runs, but that's also how `workPath` used to work originally, without
// the File System API (meaning that there was one `workPath` for all entrypoints).
if (isNewFile || isInsideNewDirectory) {
debug(`Copying from ${fsPath} to ${newPath}`);
await fs.copy(fsPath, newPath);
} else {
await linkOrCopy(fsPath, newPath);
}
} else if (type === 'FileBlob') {
const { data, mode } = file as FileBlob;
await fs.writeFile(newPath, data, { mode });
} else {
throw new Error(`Unknown file type: ${type}`);
handlerContent = newContent;
replacedMatch = true;
}
}
);
linkersRuntime = linkersRuntime.concat(linkers);
if (!replacedMatch) {
new Error(
`No replaceable matches for "${importPaths[0]}" or "${importPaths[1]}" found in "${fsPath}"`
);
}
await fs.writeFile(entry, handlerContent, encoding);
} else {
await fs.copy(handlerFile.fsPath, entry);
}
// Legacy Runtimes based on interpreted languages will create a new launcher file
// for every entrypoint, but they will create each one inside `workPath`, which means that
// the launcher for one entrypoint will overwrite the launcher provided for the previous
// entrypoint. That's why, above, we copy the file contents into the new destination (and
// optionally transform them along the way), instead of linking. We then also want to remove
// the copy origin right here, so that the `workPath` doesn't contain a useless launcher file
// once the build has finished running.
await fs.remove(handlerFile.fsPath);
debug(`Removed temporary file "${handlerFile.fsPath}"`);
const nft = `${entry}.nft.json`;
const json = JSON.stringify({
version: 1,
files: tracedFiles.map(file => ({
input: normalizePath(relative(dirname(nft), file.absolutePath)),
// We'd like to place all the dependency files right next
// to the final launcher file inside of the Lambda.
output: normalizePath(join(entryDir, 'api', file.relativePath)),
})),
version: 2,
files: Object.keys(lambdaFiles)
.map(file => {
const { fsPath } = lambdaFiles[file];
if (!fsPath) {
throw new Error(
`File "${file}" is missing valid \`fsPath\` property`
);
}
// The handler was already moved into position above.
if (file === handlerFileBase) {
return;
}
return normalizePath(relative(dirname(nft), fsPath));
})
.filter(Boolean),
});
await fs.ensureDir(dirname(nft));
await fs.writeFile(nft, json);
// Extend the list of directories and files that were created by the
// Legacy Runtime with the list of directories and files that were
// created for the entrypoint that was just processed above.
newPathsRuntime = new Set([
...newPathsRuntime,
...newFilesEntrypoint,
...newDirectoriesEntrypoint,
]);
// Add an entry that will later on be added to the `functions-manifest.json`
// file that is placed inside of the `.output` directory.
pages[normalizePath(entryPath)] = {
@@ -326,36 +285,12 @@ export function convertRuntimeToPlugin(
};
}
// Instead of waiting for all of the linking to be done for every
// entrypoint before processing the next one, we immediately handle all
// of them one after the other, while then waiting for the linking
// to finish right here, before we clean up newly created files below.
await Promise.all(linkersRuntime);
// A list of all the files that were created by the Legacy Runtime,
// which we'd like to remove from the File System.
const toRemove = Array.from(newPathsRuntime).map(path => {
debug(`Removing ${path} as part of cleanup`);
return fs.remove(path);
});
// Once all the entrypoints have been processed, we'd like to
// remove all the files from `workPath` that originally weren't present
// before the Legacy Runtime began running, because the `workPath`
// is nowadays the directory in which the user keeps their source code, since
// we're no longer running separate parallel builds for every Legacy Runtime.
await Promise.all(toRemove);
// Add any Serverless Functions that were exposed by the Legacy Runtime
// to the `functions-manifest.json` file provided in `.output`.
await updateFunctionsManifest({ workPath, pages });
};
}
async function linkOrCopy(existingPath: string, newPath: string) {
await fs.copyFile(existingPath, newPath);
}
async function readJson(filePath: string): Promise<{ [key: string]: any }> {
try {
const str = await fs.readFile(filePath, 'utf8');
@@ -386,7 +321,7 @@ export async function updateFunctionsManifest({
);
const functionsManifest = await readJson(functionsManifestPath);
if (!functionsManifest.version) functionsManifest.version = 1;
if (!functionsManifest.version) functionsManifest.version = 2;
if (!functionsManifest.pages) functionsManifest.pages = {};
for (const [pageKey, pageConfig] of Object.entries(pages)) {

View File

@@ -3,7 +3,13 @@ import { valid as validSemver } from 'semver';
import { parse as parsePath, extname } from 'path';
import { Route, Source } from '@vercel/routing-utils';
import frameworkList, { Framework } from '@vercel/frameworks';
import { PackageJson, Builder, Config, BuilderFunctions } from './types';
import {
PackageJson,
Builder,
Config,
BuilderFunctions,
ProjectSettings,
} from './types';
import { isOfficialRuntime } from './';
const slugToFramework = new Map<string | null, Framework>(
frameworkList.map(f => [f.slug, f])
@@ -20,14 +26,7 @@ interface Options {
tag?: 'canary' | 'latest' | string;
functions?: BuilderFunctions;
ignoreBuildScript?: boolean;
projectSettings?: {
framework?: string | null;
devCommand?: string | null;
installCommand?: string | null;
buildCommand?: string | null;
outputDirectory?: string | null;
createdAt?: number;
};
projectSettings?: ProjectSettings;
cleanUrls?: boolean;
trailingSlash?: boolean;
featHandleMiss?: boolean;

View File

@@ -0,0 +1,216 @@
import semver from 'semver';
import { isOfficialRuntime } from './';
import type {
Builder,
BuilderFunctions,
PackageJson,
ProjectSettings,
} from './types';
interface Metadata {
plugins: string[];
hasDotOutput: boolean;
hasMiddleware: boolean;
}
const enableFileSystemApiFrameworks = new Set(['solidstart']);
/**
* If the Deployment can be built with the new File System API,
* we'll return the new Builder here, otherwise return `null`.
*/
export async function detectFileSystemAPI({
files,
projectSettings,
builders,
vercelConfig,
pkg,
tag,
enableFlag = false,
}: {
files: { [relPath: string]: any };
projectSettings: ProjectSettings;
builders: Builder[];
vercelConfig:
| { builds?: Builder[]; functions?: BuilderFunctions }
| null
| undefined;
pkg: PackageJson | null | undefined;
tag: string | undefined;
enableFlag: boolean | undefined;
}): Promise<
| { metadata: Metadata; fsApiBuilder: Builder; reason: null }
| { metadata: Metadata; fsApiBuilder: null; reason: string }
> {
const framework = projectSettings.framework || '';
const deps = Object.assign({}, pkg?.dependencies, pkg?.devDependencies);
const plugins = Object.keys(deps).filter(dep =>
dep.startsWith('vercel-plugin-')
);
const hasDotOutput = Object.keys(files).some(file =>
file.startsWith('.output/')
);
const hasMiddleware = Boolean(
files['_middleware.js'] || files['_middleware.ts']
);
const metadata: Metadata = {
plugins,
hasDotOutput,
hasMiddleware,
};
const isEnabled =
enableFlag ||
hasMiddleware ||
hasDotOutput ||
enableFileSystemApiFrameworks.has(framework);
if (!isEnabled) {
return { metadata, fsApiBuilder: null, reason: 'Flag not enabled.' };
}
if (vercelConfig?.builds && vercelConfig.builds.length > 0) {
return {
metadata,
fsApiBuilder: null,
reason:
'Detected `builds` in vercel.json. Please remove it in favor of CLI plugins.',
};
}
if (Object.values(vercelConfig?.functions || {}).some(fn => !!fn.runtime)) {
return {
metadata,
fsApiBuilder: null,
reason:
'Detected `functions.runtime` in vercel.json. Please remove it in favor of CLI plugins.',
};
}
if (process.env.HUGO_VERSION) {
return {
metadata,
fsApiBuilder: null,
reason: 'Detected `HUGO_VERSION` environment variable. Please remove it.',
};
}
if (process.env.ZOLA_VERSION) {
return {
metadata,
fsApiBuilder: null,
reason: 'Detected `ZOLA_VERSION` environment variable. Please remove it.',
};
}
if (process.env.GUTENBERG_VERSION) {
return {
metadata,
fsApiBuilder: null,
reason:
'Detected `GUTENBERG_VERSION` environment variable. Please remove it.',
};
}
const invalidBuilder = builders.find(({ use }) => {
const valid =
isOfficialRuntime('go', use) ||
isOfficialRuntime('python', use) ||
isOfficialRuntime('ruby', use) ||
isOfficialRuntime('node', use) ||
isOfficialRuntime('next', use) ||
isOfficialRuntime('static', use) ||
isOfficialRuntime('static-build', use);
return !valid;
});
if (invalidBuilder) {
return {
metadata,
fsApiBuilder: null,
reason: `Detected \`${invalidBuilder.use}\` in vercel.json. Please remove it in favor of CLI plugins.`,
};
}
for (const lang of ['go', 'python', 'ruby']) {
for (const { use } of builders) {
const plugin = 'vercel-plugin-' + lang;
if (isOfficialRuntime(lang, use) && !deps[plugin]) {
return {
metadata,
fsApiBuilder: null,
reason: `Detected \`${lang}\` Serverless Function usage without plugin \`${plugin}\`. Please run \`npm i ${plugin}\`.`,
};
}
}
}
if (
framework === 'nuxtjs' ||
framework === 'sveltekit' ||
framework === 'redwoodjs'
) {
return {
metadata,
fsApiBuilder: null,
reason: `Detected framework \`${framework}\` that only supports legacy File System API. Please contact the framework author.`,
};
}
if (framework === 'nextjs' && !hasDotOutput) {
// Use the old pipeline if a custom output directory was specified for Next.js
// because `vercel build` cannot ensure that the directory will be in the same
// location as `.output`, which can break imports (not just nft.json files).
if (projectSettings?.outputDirectory) {
return {
metadata,
fsApiBuilder: null,
reason: `Detected Next.js with Output Directory \`${projectSettings.outputDirectory}\` override. Please change it back to the default.`,
};
}
const versionRange = deps['next'];
if (!versionRange) {
return {
metadata,
fsApiBuilder: null,
reason: `Detected Next.js in Project Settings but missing \`next\` package.json dependencies. Please run \`npm i next\`.`,
};
}
// TODO: We'll need to check the lockfile if one is present.
if (versionRange !== 'latest' && versionRange !== 'canary') {
const fixedVersion = semver.valid(semver.coerce(versionRange) || '');
if (!fixedVersion || !semver.gte(fixedVersion, '12.0.0')) {
return {
metadata,
fsApiBuilder: null,
reason: `Detected legacy Next.js version "${versionRange}" in package.json. Please run \`npm i next@latest\` to upgrade.`,
};
}
}
}
const frontendBuilder = builders.find(
({ use }) =>
isOfficialRuntime('next', use) ||
isOfficialRuntime('static', use) ||
isOfficialRuntime('static-build', use)
);
const config = frontendBuilder?.config || {};
const withTag = tag ? `@${tag}` : '';
const fsApiBuilder = {
use: `@vercelruntimes/file-system-api${withTag}`,
src: '**',
config: {
...config,
fileSystemAPI: true,
framework: config.framework || framework || null,
projectSettings,
hasMiddleware,
hasDotOutput,
},
};
return { metadata, fsApiBuilder, reason: null };
}

View File

@@ -81,6 +81,7 @@ export {
detectApiDirectory,
detectApiExtensions,
} from './detect-builders';
export { detectFileSystemAPI } from './detect-file-system-api';
export { detectFramework } from './detect-framework';
export { DetectorFilesystem } from './detectors/filesystem';
export { readConfigFile } from './fs/read-config-file';

View File

@@ -29,7 +29,9 @@ export interface Config {
| number
| { [key: string]: string }
| BuilderFunctions
| undefined;
| ProjectSettings
| undefined
| null;
maxLambdaSize?: string;
includeFiles?: string | string[];
excludeFiles?: string | string[];
@@ -41,11 +43,12 @@ export interface Config {
zeroConfig?: boolean;
import?: { [key: string]: string };
functions?: BuilderFunctions;
projectSettings?: ProjectSettings;
outputDirectory?: string;
installCommand?: string;
buildCommand?: string;
devCommand?: string;
framework?: string;
framework?: string | null;
nodeVersion?: string;
}
@@ -351,3 +354,17 @@ export interface BuilderFunctions {
excludeFiles?: string;
};
}
export interface ProjectSettings {
framework?: string | null;
devCommand?: string | null;
installCommand?: string | null;
buildCommand?: string | null;
outputDirectory?: string | null;
rootDirectory?: string | null;
createdAt?: number;
autoExposeSystemEnvs?: boolean;
sourceFilesOutsideRootDirectory?: boolean;
directoryListing?: boolean;
gitForkProtection?: boolean;
}

View File

@@ -62,7 +62,6 @@ describe('convert-runtime-to-plugin', () => {
return { output: lambda };
};
const lambdaFiles = await fsToJson(workPath);
const packageName = 'vercel-plugin-python';
const build = await convertRuntimeToPlugin(buildRuntime, packageName, ext);
@@ -70,14 +69,8 @@ describe('convert-runtime-to-plugin', () => {
const output = await fsToJson(join(workPath, '.output'));
delete lambdaFiles['vercel.json'];
delete lambdaFiles['vc__handler__python.py'];
expect(output).toMatchObject({
'functions-manifest.json': expect.stringContaining('{'),
inputs: {
'api-routes-python': lambdaFiles,
},
server: {
pages: {
api: {
@@ -96,7 +89,7 @@ describe('convert-runtime-to-plugin', () => {
const funcManifest = JSON.parse(output['functions-manifest.json']);
expect(funcManifest).toMatchObject({
version: 1,
version: 2,
pages: {
'api/index.py': { ...lambdaOptions, handler: 'index.vc_handler' },
'api/users/get.py': { ...lambdaOptions, handler: 'get.vc_handler' },
@@ -110,40 +103,16 @@ describe('convert-runtime-to-plugin', () => {
const indexJson = JSON.parse(output.server.pages.api['index.py.nft.json']);
expect(indexJson).toMatchObject({
version: 1,
version: 2,
files: [
{
input: `../../../inputs/api-routes-python/api/db/[id].py`,
output: '.output/server/pages/api/api/db/[id].py',
},
{
input: `../../../inputs/api-routes-python/api/index.py`,
output: '.output/server/pages/api/api/index.py',
},
{
input: `../../../inputs/api-routes-python/api/project/[aid]/[bid]/index.py`,
output: '.output/server/pages/api/api/project/[aid]/[bid]/index.py',
},
{
input: `../../../inputs/api-routes-python/api/users/get.py`,
output: '.output/server/pages/api/api/users/get.py',
},
{
input: `../../../inputs/api-routes-python/api/users/post.py`,
output: '.output/server/pages/api/api/users/post.py',
},
{
input: `../../../inputs/api-routes-python/file.txt`,
output: '.output/server/pages/api/file.txt',
},
{
input: `../../../inputs/api-routes-python/util/date.py`,
output: '.output/server/pages/api/util/date.py',
},
{
input: `../../../inputs/api-routes-python/util/math.py`,
output: '.output/server/pages/api/util/math.py',
},
'../../../../api/db/[id].py',
'../../../../api/index.py',
'../../../../api/project/[aid]/[bid]/index.py',
'../../../../api/users/get.py',
'../../../../api/users/post.py',
'../../../../file.txt',
'../../../../util/date.py',
'../../../../util/math.py',
],
});
@@ -151,40 +120,16 @@ describe('convert-runtime-to-plugin', () => {
output.server.pages.api.users['get.py.nft.json']
);
expect(getJson).toMatchObject({
version: 1,
version: 2,
files: [
{
input: `../../../../inputs/api-routes-python/api/db/[id].py`,
output: '.output/server/pages/api/api/db/[id].py',
},
{
input: `../../../../inputs/api-routes-python/api/index.py`,
output: '.output/server/pages/api/api/index.py',
},
{
input: `../../../../inputs/api-routes-python/api/project/[aid]/[bid]/index.py`,
output: '.output/server/pages/api/api/project/[aid]/[bid]/index.py',
},
{
input: `../../../../inputs/api-routes-python/api/users/get.py`,
output: '.output/server/pages/api/api/users/get.py',
},
{
input: `../../../../inputs/api-routes-python/api/users/post.py`,
output: '.output/server/pages/api/api/users/post.py',
},
{
input: `../../../../inputs/api-routes-python/file.txt`,
output: '.output/server/pages/api/file.txt',
},
{
input: `../../../../inputs/api-routes-python/util/date.py`,
output: '.output/server/pages/api/util/date.py',
},
{
input: `../../../../inputs/api-routes-python/util/math.py`,
output: '.output/server/pages/api/util/math.py',
},
'../../../../../api/db/[id].py',
'../../../../../api/index.py',
'../../../../../api/project/[aid]/[bid]/index.py',
'../../../../../api/users/get.py',
'../../../../../api/users/post.py',
'../../../../../file.txt',
'../../../../../util/date.py',
'../../../../../util/math.py',
],
});
@@ -192,40 +137,16 @@ describe('convert-runtime-to-plugin', () => {
output.server.pages.api.users['post.py.nft.json']
);
expect(postJson).toMatchObject({
version: 1,
version: 2,
files: [
{
input: `../../../../inputs/api-routes-python/api/db/[id].py`,
output: '.output/server/pages/api/api/db/[id].py',
},
{
input: `../../../../inputs/api-routes-python/api/index.py`,
output: '.output/server/pages/api/api/index.py',
},
{
input: `../../../../inputs/api-routes-python/api/project/[aid]/[bid]/index.py`,
output: '.output/server/pages/api/api/project/[aid]/[bid]/index.py',
},
{
input: `../../../../inputs/api-routes-python/api/users/get.py`,
output: '.output/server/pages/api/api/users/get.py',
},
{
input: `../../../../inputs/api-routes-python/api/users/post.py`,
output: '.output/server/pages/api/api/users/post.py',
},
{
input: `../../../../inputs/api-routes-python/file.txt`,
output: '.output/server/pages/api/file.txt',
},
{
input: `../../../../inputs/api-routes-python/util/date.py`,
output: '.output/server/pages/api/util/date.py',
},
{
input: `../../../../inputs/api-routes-python/util/math.py`,
output: '.output/server/pages/api/util/math.py',
},
'../../../../../api/db/[id].py',
'../../../../../api/index.py',
'../../../../../api/project/[aid]/[bid]/index.py',
'../../../../../api/users/get.py',
'../../../../../api/users/post.py',
'../../../../../file.txt',
'../../../../../util/date.py',
'../../../../../util/math.py',
],
});

View File

@@ -0,0 +1,427 @@
import { detectFileSystemAPI } from '../src/detect-file-system-api';
describe('Test `detectFileSystemAPI`', () => {
it('should error when builds in vercel.json', async () => {
const vercelConfig = {
builds: [{ use: '@vercel/node', src: 'api/**/*.js' }],
};
const files = {
'vercel.json': JSON.stringify(vercelConfig),
'api/foo.js': 'console.log("foo")',
};
const result = await detectFileSystemAPI({
files,
projectSettings: {},
builders: vercelConfig.builds,
vercelConfig,
pkg: null,
tag: '',
enableFlag: true,
});
expect(result).toEqual({
fsApiBuilder: null,
reason:
'Detected `builds` in vercel.json. Please remove it in favor of CLI plugins.',
metadata: { hasDotOutput: false, hasMiddleware: false, plugins: [] },
});
});
it('should error when functions.runtimes in vercel.json', async () => {
const vercelConfig = {
functions: {
'api/**/*.rs': {
runtime: 'vercel-rust@latest',
},
},
};
const files = {
'vercel.json': JSON.stringify(vercelConfig),
'api/foo.rs': 'println!("foo")',
};
const result = await detectFileSystemAPI({
files,
projectSettings: {},
builders: [],
vercelConfig,
pkg: null,
tag: '',
enableFlag: true,
});
expect(result).toEqual({
fsApiBuilder: null,
reason:
'Detected `functions.runtime` in vercel.json. Please remove it in favor of CLI plugins.',
metadata: { hasDotOutput: false, hasMiddleware: false, plugins: [] },
});
});
it('should error when HUGO_VERSION env var used', async () => {
process.env.HUGO_VERSION = 'v0.58.2';
const files = { 'foo.html': '<h1>Foo</h1>' };
const result = await detectFileSystemAPI({
files,
projectSettings: {},
builders: [],
vercelConfig: null,
pkg: null,
tag: '',
enableFlag: true,
});
expect(result).toEqual({
fsApiBuilder: null,
reason: 'Detected `HUGO_VERSION` environment variable. Please remove it.',
metadata: { hasDotOutput: false, hasMiddleware: false, plugins: [] },
});
delete process.env.HUGO_VERSION;
});
it('should error when ZOLA_VERSION env var used', async () => {
process.env.ZOLA_VERSION = 'v0.0.1';
const files = { 'foo.html': '<h1>Foo</h1>' };
const result = await detectFileSystemAPI({
files,
projectSettings: {},
builders: [],
vercelConfig: null,
pkg: null,
tag: '',
enableFlag: true,
});
expect(result).toEqual({
fsApiBuilder: null,
reason: 'Detected `ZOLA_VERSION` environment variable. Please remove it.',
metadata: { hasDotOutput: false, hasMiddleware: false, plugins: [] },
});
delete process.env.ZOLA_VERSION;
});
it('should error when GUTENBERG_VERSION env var used', async () => {
process.env.GUTENBERG_VERSION = 'v0.0.1';
const files = { 'foo.html': '<h1>Foo</h1>' };
const result = await detectFileSystemAPI({
files,
projectSettings: {},
builders: [],
vercelConfig: null,
pkg: null,
tag: '',
enableFlag: true,
});
expect(result).toEqual({
fsApiBuilder: null,
reason:
'Detected `GUTENBERG_VERSION` environment variable. Please remove it.',
metadata: { hasDotOutput: false, hasMiddleware: false, plugins: [] },
});
delete process.env.GUTENBERG_VERSION;
});
it('should error when Go detected without corresponding plugin', async () => {
const result = await detectFileSystemAPI({
files: { 'api/foo.go': 'print("foo")' },
projectSettings: {},
builders: [{ use: '@vercel/go', src: 'api/**/*.go' }],
vercelConfig: null,
pkg: null,
tag: '',
enableFlag: true,
});
expect(result).toEqual({
fsApiBuilder: null,
reason:
'Detected `go` Serverless Function usage without plugin `vercel-plugin-go`. Please run `npm i vercel-plugin-go`.',
metadata: { hasDotOutput: false, hasMiddleware: false, plugins: [] },
});
});
it('should error when Python detected without corresponding plugin', async () => {
const result = await detectFileSystemAPI({
files: { 'api/foo.py': 'print("foo")' },
projectSettings: {},
builders: [{ use: '@vercel/python', src: 'api/**/*.py' }],
vercelConfig: null,
pkg: null,
tag: '',
enableFlag: true,
});
expect(result).toEqual({
fsApiBuilder: null,
reason:
'Detected `python` Serverless Function usage without plugin `vercel-plugin-python`. Please run `npm i vercel-plugin-python`.',
metadata: { hasDotOutput: false, hasMiddleware: false, plugins: [] },
});
});
it('should error when Ruby detected without corresponding plugin', async () => {
const result = await detectFileSystemAPI({
files: { 'api/foo.rb': 'print("foo")' },
projectSettings: {},
builders: [{ use: '@vercel/ruby', src: 'api/**/*.rb' }],
vercelConfig: null,
pkg: null,
tag: '',
enableFlag: true,
});
expect(result).toEqual({
fsApiBuilder: null,
reason:
'Detected `ruby` Serverless Function usage without plugin `vercel-plugin-ruby`. Please run `npm i vercel-plugin-ruby`.',
metadata: { hasDotOutput: false, hasMiddleware: false, plugins: [] },
});
});
it('should succeed when Go detected with corresponding plugin', async () => {
const result = await detectFileSystemAPI({
files: { 'api/foo.go': 'print("foo")' },
projectSettings: {},
builders: [{ use: '@vercel/go', src: 'api/**/*.go' }],
vercelConfig: null,
pkg: { dependencies: { 'vercel-plugin-go': '^1.0.0' } },
tag: '',
enableFlag: true,
});
expect(result).toEqual({
fsApiBuilder: {
use: '@vercelruntimes/file-system-api',
src: '**',
config: {
fileSystemAPI: true,
framework: null,
hasDotOutput: false,
hasMiddleware: false,
projectSettings: {},
},
},
reason: null,
metadata: {
hasDotOutput: false,
hasMiddleware: false,
plugins: ['vercel-plugin-go'],
},
});
});
it('should succeed when Python detected with corresponding plugin', async () => {
const result = await detectFileSystemAPI({
files: { 'api/foo.py': 'print("foo")' },
projectSettings: {},
builders: [{ use: '@vercel/python', src: 'api/**/*.py' }],
vercelConfig: null,
pkg: { dependencies: { 'vercel-plugin-python': '^1.0.0' } },
tag: '',
enableFlag: true,
});
expect(result).toEqual({
fsApiBuilder: {
use: '@vercelruntimes/file-system-api',
src: '**',
config: {
fileSystemAPI: true,
framework: null,
hasDotOutput: false,
hasMiddleware: false,
projectSettings: {},
},
},
reason: null,
metadata: {
hasDotOutput: false,
hasMiddleware: false,
plugins: ['vercel-plugin-python'],
},
});
});
it('should succeed when Ruby detected with corresponding plugin', async () => {
const result = await detectFileSystemAPI({
files: { 'api/foo.rb': 'print("foo")' },
projectSettings: {},
builders: [{ use: '@vercel/ruby', src: 'api/**/*.rb' }],
vercelConfig: null,
pkg: { dependencies: { 'vercel-plugin-ruby': '^1.0.0' } },
tag: '',
enableFlag: true,
});
expect(result).toEqual({
fsApiBuilder: {
use: '@vercelruntimes/file-system-api',
src: '**',
config: {
fileSystemAPI: true,
framework: null,
hasDotOutput: false,
hasMiddleware: false,
projectSettings: {},
},
},
reason: null,
metadata: {
hasDotOutput: false,
hasMiddleware: false,
plugins: ['vercel-plugin-ruby'],
},
});
});
it('should error when framework is nuxtjs', async () => {
const result = await detectFileSystemAPI({
files: { 'api/foo.js': 'console.log("foo")' },
projectSettings: { framework: 'nuxtjs' },
builders: [{ use: '@vercel/node', src: 'api/**/*.js' }],
vercelConfig: null,
pkg: null,
tag: '',
enableFlag: true,
});
expect(result).toEqual({
fsApiBuilder: null,
reason:
'Detected framework `nuxtjs` that only supports legacy File System API. Please contact the framework author.',
metadata: { hasDotOutput: false, hasMiddleware: false, plugins: [] },
});
});
it('should error when framework is sveltekit', async () => {
const result = await detectFileSystemAPI({
files: { 'api/foo.js': 'console.log("foo")' },
projectSettings: { framework: 'sveltekit' },
builders: [{ use: '@vercel/node', src: 'api/**/*.js' }],
vercelConfig: null,
pkg: null,
tag: '',
enableFlag: true,
});
expect(result).toEqual({
fsApiBuilder: null,
reason:
'Detected framework `sveltekit` that only supports legacy File System API. Please contact the framework author.',
metadata: { hasDotOutput: false, hasMiddleware: false, plugins: [] },
});
});
it('should error when framework is redwoodjs', async () => {
const result = await detectFileSystemAPI({
files: { 'api/foo.js': 'console.log("foo")' },
projectSettings: { framework: 'redwoodjs' },
builders: [{ use: '@vercel/node', src: 'api/**/*.js' }],
vercelConfig: null,
pkg: null,
tag: '',
enableFlag: true,
});
expect(result).toEqual({
fsApiBuilder: null,
reason:
'Detected framework `redwoodjs` that only supports legacy File System API. Please contact the framework author.',
metadata: { hasDotOutput: false, hasMiddleware: false, plugins: [] },
});
});
it('should error when framework is nextjs and has output dir', async () => {
const result = await detectFileSystemAPI({
files: { 'pages/foo.js': 'console.log("foo")' },
projectSettings: { framework: 'nextjs', outputDirectory: 'dist' },
builders: [{ use: '@vercel/next', src: 'package.json' }],
vercelConfig: null,
pkg: { dependencies: { next: '^12.0.0' } },
tag: '',
enableFlag: true,
});
expect(result).toEqual({
fsApiBuilder: null,
reason:
'Detected Next.js with Output Directory `dist` override. Please change it back to the default.',
metadata: { hasDotOutput: false, hasMiddleware: false, plugins: [] },
});
});
it('should error when framework is nextjs but missing from dependencies', async () => {
const result = await detectFileSystemAPI({
files: { 'pages/foo.js': 'console.log("foo")' },
projectSettings: { framework: 'nextjs' },
builders: [{ use: '@vercel/next', src: 'package.json' }],
vercelConfig: null,
pkg: { dependencies: { 'not-next': '^12.0.0' } },
tag: '',
enableFlag: true,
});
expect(result).toEqual({
fsApiBuilder: null,
reason:
'Detected Next.js in Project Settings but missing `next` package.json dependencies. Please run `npm i next`.',
metadata: { hasDotOutput: false, hasMiddleware: false, plugins: [] },
});
});
it('should error when framework is nextjs but dependency is older version', async () => {
const result = await detectFileSystemAPI({
files: { 'pages/foo.js': 'console.log("foo")' },
projectSettings: { framework: 'nextjs' },
builders: [{ use: '@vercel/next', src: 'package.json' }],
vercelConfig: null,
pkg: { dependencies: { next: '^9.0.0' } },
tag: '',
enableFlag: true,
});
expect(result).toEqual({
fsApiBuilder: null,
reason:
'Detected legacy Next.js version "^9.0.0" in package.json. Please run `npm i next@latest` to upgrade.',
metadata: { hasDotOutput: false, hasMiddleware: false, plugins: [] },
});
});
it('should succeed when middleware detected', async () => {
const result = await detectFileSystemAPI({
files: { '_middleware.js': 'print("foo")' },
projectSettings: {},
builders: [{ use: '@vercel/static-build', src: 'package.json' }],
vercelConfig: null,
pkg: null,
tag: '',
enableFlag: false,
});
expect(result).toEqual({
fsApiBuilder: {
use: '@vercelruntimes/file-system-api',
src: '**',
config: {
fileSystemAPI: true,
framework: null,
hasDotOutput: false,
hasMiddleware: true,
projectSettings: {},
},
},
reason: null,
metadata: { hasDotOutput: false, hasMiddleware: true, plugins: [] },
});
});
it('should succeed when .output detected', async () => {
const result = await detectFileSystemAPI({
files: { '.output/routes-manifest.json': '{}' },
projectSettings: { framework: 'remix' },
builders: [{ use: '@vercel/static-build', src: 'package.json' }],
vercelConfig: null,
pkg: null,
tag: '',
enableFlag: false,
});
expect(result).toEqual({
fsApiBuilder: {
use: '@vercelruntimes/file-system-api',
src: '**',
config: {
fileSystemAPI: true,
framework: 'remix',
hasDotOutput: true,
hasMiddleware: false,
projectSettings: { framework: 'remix' },
},
},
reason: null,
metadata: { hasDotOutput: true, hasMiddleware: false, plugins: [] },
});
});
});

View File

@@ -34,7 +34,7 @@ Finally, [connect your Git repository to Vercel](https://vercel.com/docs/git) an
## Documentation
For details on how to use Vercel CLI, check out our [documentation](https://vercel.com/docs).
For details on how to use Vercel CLI, check out our [documentation](https://vercel.com/docs/cli).
## Local Development

View File

@@ -1,6 +1,6 @@
{
"name": "vercel",
"version": "23.1.3-canary.60",
"version": "23.1.3-canary.68",
"preferGlobal": true,
"license": "Apache-2.0",
"description": "The command-line interface for Vercel",
@@ -43,14 +43,14 @@
"node": ">= 12"
},
"dependencies": {
"@vercel/build-utils": "2.12.3-canary.37",
"@vercel/build-utils": "2.12.3-canary.43",
"@vercel/go": "1.2.4-canary.4",
"@vercel/node": "1.12.2-canary.7",
"@vercel/python": "2.1.2-canary.1",
"@vercel/python": "2.1.2-canary.2",
"@vercel/ruby": "1.2.10-canary.0",
"update-notifier": "4.1.0",
"vercel-plugin-middleware": "0.0.0-canary.13",
"vercel-plugin-node": "1.12.2-canary.29"
"vercel-plugin-middleware": "0.0.0-canary.20",
"vercel-plugin-node": "1.12.2-canary.35"
},
"devDependencies": {
"@next/env": "11.1.2",
@@ -90,7 +90,8 @@
"@types/update-notifier": "5.1.0",
"@types/which": "1.3.2",
"@types/write-json-file": "2.2.1",
"@vercel/frameworks": "0.5.1-canary.16",
"@vercel/client": "10.2.3-canary.46",
"@vercel/frameworks": "0.5.1-canary.17",
"@vercel/ncc": "0.24.0",
"@vercel/nft": "0.17.0",
"@zeit/fun": "0.11.2",

View File

@@ -6,20 +6,23 @@ import {
scanParentDirs,
spawnAsync,
glob as buildUtilsGlob,
detectFileSystemAPI,
detectBuilders,
PackageJson,
} from '@vercel/build-utils';
import { nodeFileTrace } from '@vercel/nft';
import Sema from 'async-sema';
import chalk from 'chalk';
import { SpawnOptions } from 'child_process';
import { assert } from 'console';
import { createHash } from 'crypto';
import fs from 'fs-extra';
import ogGlob from 'glob';
import { dirname, isAbsolute, join, parse, relative, resolve } from 'path';
import { dirname, isAbsolute, join, parse, relative } from 'path';
import pluralize from 'pluralize';
import Client from '../util/client';
import { VercelConfig } from '../util/dev/types';
import { emoji, prependEmoji } from '../util/emoji';
import { CantParseJSONFile } from '../util/errors-ts';
import getArgs from '../util/get-args';
import handleError from '../util/handle-error';
import confirm from '../util/input/confirm';
@@ -33,6 +36,7 @@ import { loadCliPlugins } from '../util/plugins';
import { findFramework } from '../util/projects/find-framework';
import { VERCEL_DIR } from '../util/projects/link';
import { readProjectSettings } from '../util/projects/project-settings';
import readJSONFile from '../util/read-json-file';
import pull from './pull';
const sema = new Sema(16, {
@@ -147,6 +151,36 @@ export default async function main(client: Client) {
process.chdir(cwd);
const pkg = await readJSONFile<PackageJson>('./package.json');
if (pkg instanceof CantParseJSONFile) {
throw pkg;
}
const vercelConfig = await readJSONFile<VercelConfig>('./vercel.json');
if (vercelConfig instanceof CantParseJSONFile) {
throw vercelConfig;
}
if (!process.env.NOW_BUILDER) {
// This validation is only necessary when
// a user runs `vercel build` locally.
const globFiles = await buildUtilsGlob('**', { cwd });
const zeroConfig = await detectBuilders(Object.keys(globFiles), pkg);
const { reason } = await detectFileSystemAPI({
files: globFiles,
projectSettings: project.settings,
builders: zeroConfig.builders || [],
pkg,
vercelConfig,
tag: '',
enableFlag: true,
});
if (reason) {
client.output.error(`${cmd(`${getPkgName()} build`)} failed: ${reason}`);
return 1;
}
}
const framework = findFramework(project.settings.framework);
// If this is undefined, we bail. If it is null, then findFramework should return "Other",
// so this should really never happen, but just in case....
@@ -354,18 +388,19 @@ export default async function main(client: Client) {
}
// We cannot rely on the `framework` alone, as it might be a static export,
// and the current build might use a differnt project that's not in the settings.
// and the current build might use a different project that's not in the settings.
const isNextOutput = Boolean(dotNextDir);
const nextExport = await getNextExportStatus(dotNextDir);
const outputDir =
isNextOutput && !nextExport ? OUTPUT_DIR : join(OUTPUT_DIR, 'static');
const getDistDir = framework.getFsOutputDir || framework.getOutputDirName;
const distDir =
(nextExport?.exportDetail.outDirectory
? relative(cwd, nextExport.exportDetail.outDirectory)
: false) ||
dotNextDir ||
userOutputDirectory ||
(await framework.getFsOutputDir(cwd));
(await getDistDir(cwd));
await fs.ensureDir(join(cwd, outputDir));
@@ -636,30 +671,15 @@ export default async function main(client: Client) {
],
});
fileList.delete(relative(cwd, f));
await resolveNftToOutput({
client,
baseDir,
outputDir: OUTPUT_DIR,
nftFileName: f.replace(ext, '.js.nft.json'),
distDir,
nft: {
version: 1,
files: Array.from(fileList).map(fileListEntry =>
relative(dir, fileListEntry)
),
},
});
}
} else {
for (let f of nftFiles) {
const json = await fs.readJson(f);
await resolveNftToOutput({
client,
baseDir,
outputDir: OUTPUT_DIR,
nftFileName: f,
nft: json,
distDir,
const nftFileName = f.replace(ext, '.js.nft.json');
client.output.debug(`Creating ${nftFileName}`);
await fs.writeJSON(nftFileName, {
version: 2,
files: Array.from(fileList).map(fileListEntry =>
relative(dir, fileListEntry)
),
});
}
}
@@ -683,15 +703,7 @@ export default async function main(client: Client) {
const originalPath = join(requiredServerFilesJson.appDir, i);
const relPath = join(OUTPUT_DIR, relative(distDir, originalPath));
const absolutePath = join(cwd, relPath);
const output = relative(baseDir, absolutePath);
return relPath === output
? relPath
: {
input: relPath,
output,
};
return relPath;
}),
});
}
@@ -848,88 +860,6 @@ async function glob(pattern: string, options: GlobOptions): Promise<string[]> {
});
}
/**
* Computes a hash for the given buf.
*
* @param {Buffer} file data
* @return {String} hex digest
*/
function hash(buf: Buffer): string {
return createHash('sha1').update(buf).digest('hex');
}
interface NftFile {
version: number;
files: (string | { input: string; output: string })[];
}
// resolveNftToOutput takes nft file and moves all of its trace files
// into the specified directory + `inputs`, (renaming them to their hash + ext) and
// subsequently updating the original nft file accordingly. This is done
// to make the `.output` directory be self-contained, so that it works
// properly with `vc --prebuilt`.
async function resolveNftToOutput({
client,
baseDir,
outputDir,
nftFileName,
distDir,
nft,
}: {
client: Client;
baseDir: string;
outputDir: string;
nftFileName: string;
distDir: string;
nft: NftFile;
}) {
client.output.debug(`Processing and resolving ${nftFileName}`);
await fs.ensureDir(join(outputDir, 'inputs'));
const newFilesList: NftFile['files'] = [];
// If `distDir` is a subdirectory, then the input has to be resolved to where the `.output` directory will be.
const relNftFileName = relative(outputDir, nftFileName);
const origNftFilename = join(distDir, relNftFileName);
if (relNftFileName.startsWith('cache/')) {
// No need to process the `cache/` directory.
// Paths in it might also not be relative to `cache` itself.
return;
}
for (let fileEntity of nft.files) {
const relativeInput =
typeof fileEntity === 'string' ? fileEntity : fileEntity.input;
const fullInput = resolve(join(parse(origNftFilename).dir, relativeInput));
// if the resolved path is NOT in the .output directory, we move it there
if (!fullInput.includes(distDir)) {
const { ext } = parse(fullInput);
const raw = await fs.readFile(fullInput);
const newFilePath = join(outputDir, 'inputs', hash(raw) + ext);
smartCopy(client, fullInput, newFilePath);
// We have to use `baseDir` instead of `cwd`, because we want to
// mount everything from there (especially `node_modules`).
// This is important for NPM Workspaces where `node_modules` is not
// in the directory of the workspace.
const output = relative(baseDir, fullInput).replace('.output', '.next');
newFilesList.push({
input: relative(parse(nftFileName).dir, newFilePath),
output,
});
} else {
newFilesList.push(relativeInput);
}
}
// Update the .nft.json with new input and output mapping
await fs.writeJSON(nftFileName, {
...nft,
files: newFilesList,
});
}
/**
* Files will only exist when `next export` was used.
*/

View File

@@ -166,8 +166,8 @@ export default async (client: Client) => {
return pathValidation.exitCode;
}
const { isFile, path } = pathValidation;
const autoConfirm = argv['--confirm'] || isFile;
const { path } = pathValidation;
const autoConfirm = argv['--confirm'];
// deprecate --name
if (argv['--name']) {
@@ -192,7 +192,7 @@ export default async (client: Client) => {
let newProjectName = null;
let rootDirectory = project ? project.rootDirectory : null;
let sourceFilesOutsideRootDirectory = true;
let sourceFilesOutsideRootDirectory: boolean | undefined = true;
if (status === 'not_linked') {
const shouldStartSetup =
@@ -229,8 +229,7 @@ export default async (client: Client) => {
// user input.
const detectedProjectName = getProjectName({
argv,
nowConfig: localConfig || {},
isFile,
nowConfig: localConfig,
paths,
});
@@ -447,9 +446,9 @@ export default async (client: Client) => {
forceNew: argv['--force'],
withCache: argv['--with-cache'],
prebuilt: argv['--prebuilt'],
rootDirectory,
quiet,
wantsPublic: argv['--public'] || localConfig.public,
isFile,
type: null,
nowConfig: localConfig,
regions,
@@ -471,7 +470,7 @@ export default async (client: Client) => {
[sourcePath],
createArgs,
org,
!project && !isFile,
!project,
path
);
@@ -654,8 +653,7 @@ export default async (client: Client) => {
client,
deployment,
deployStamp,
!argv['--no-clipboard'],
isFile
!argv['--no-clipboard']
);
};
@@ -790,8 +788,7 @@ const printDeploymentStatus = async (
};
},
deployStamp: () => string,
isClipboardEnabled: boolean,
isFile: boolean
isClipboardEnabled: boolean
) => {
indications = indications || [];
const isProdDeployment = target === 'production';
@@ -813,7 +810,7 @@ const printDeploymentStatus = async (
// print preview/production url
let previewUrl: string;
let isWildcard: boolean;
if (!isFile && Array.isArray(aliasList) && aliasList.length > 0) {
if (Array.isArray(aliasList) && aliasList.length > 0) {
const previewUrlInfo = await getPreferredPreviewURL(client, aliasList);
if (previewUrlInfo) {
isWildcard = previewUrlInfo.isWildcard;

View File

@@ -1,3 +1,5 @@
export type ProjectSettings = import('@vercel/build-utils').ProjectSettings;
export type Primitive =
| bigint
| boolean
@@ -239,16 +241,6 @@ export interface ProjectEnvVariable {
gitBranch?: string;
}
export interface ProjectSettings {
framework?: string | null;
devCommand?: string | null;
buildCommand?: string | null;
outputDirectory?: string | null;
rootDirectory?: string | null;
autoExposeSystemEnvs?: boolean;
directoryListing?: boolean;
}
export interface Project extends ProjectSettings {
id: string;
name: string;
@@ -260,8 +252,6 @@ export interface Project extends ProjectSettings {
framework?: string | null;
rootDirectory?: string | null;
latestDeployments?: Partial<Deployment>[];
autoExposeSystemEnvs?: boolean;
sourceFilesOutsideRootDirectory: boolean;
}
export interface Org {

View File

@@ -6,14 +6,14 @@ import getLocalConfigPath from './local-path';
export default async function readConfig(dir: string) {
const pkgFilePath = getLocalConfigPath(join(process.cwd(), dir));
const result = await readJSONFile(pkgFilePath);
const result = await readJSONFile<VercelConfig>(pkgFilePath);
if (result instanceof CantParseJSONFile) {
return result;
}
if (result) {
return result as VercelConfig;
return result;
}
return null;

View File

@@ -52,6 +52,7 @@ export default async function processDeployment({
isSettingUpProject: boolean;
skipAutoDetectionConfirmation?: boolean;
cwd?: string;
rootDirectory?: string;
}) {
let {
now,
@@ -64,6 +65,7 @@ export default async function processDeployment({
nowConfig,
quiet,
prebuilt,
rootDirectory,
} = args;
const { debug } = output;
@@ -86,6 +88,7 @@ export default async function processDeployment({
force,
withCache,
prebuilt,
rootDirectory,
skipAutoDetectionConfirmation,
};

View File

@@ -40,6 +40,7 @@ import {
detectApiExtensions,
spawnCommand,
isOfficialRuntime,
detectFileSystemAPI,
} from '@vercel/build-utils';
import frameworkList from '@vercel/frameworks';
@@ -599,6 +600,32 @@ export default class DevServer {
);
}
const { reason, metadata } = await detectFileSystemAPI({
files,
builders: builders || [],
projectSettings: projectSettings || this.projectSettings || {},
vercelConfig,
pkg,
tag: '',
enableFlag: true,
});
if (reason) {
if (metadata.hasMiddleware) {
this.output.error(
`Detected middleware usage which requires the latest API. ${reason}`
);
await this.exit();
} else if (metadata.plugins.length > 0) {
this.output.error(
`Detected CLI plugins which require the latest API. ${reason}`
);
await this.exit();
} else {
this.output.warn(`Unable to use latest API. ${reason}`);
}
}
if (builders) {
if (this.devCommand) {
builders = builders.filter(filterFrontendBuilds);

View File

@@ -39,12 +39,12 @@ export default async function getConfig(
output.debug(
`Found config in provided --local-config path ${localFilePath}`
);
const localConfig = await readJSONFile(localFilePath);
const localConfig = await readJSONFile<VercelConfig>(localFilePath);
if (localConfig instanceof CantParseJSONFile) {
return localConfig;
}
if (localConfig !== null) {
config = localConfig as VercelConfig;
config = localConfig;
config[fileNameSymbol] = configFile;
return config;
}
@@ -54,8 +54,8 @@ export default async function getConfig(
const vercelFilePath = path.resolve(localPath, 'vercel.json');
const nowFilePath = path.resolve(localPath, 'now.json');
const [vercelConfig, nowConfig] = await Promise.all([
readJSONFile(vercelFilePath),
readJSONFile(nowFilePath),
readJSONFile<VercelConfig>(vercelFilePath),
readJSONFile<VercelConfig>(nowFilePath),
]);
if (vercelConfig instanceof CantParseJSONFile) {
return vercelConfig;
@@ -68,13 +68,13 @@ export default async function getConfig(
}
if (vercelConfig !== null) {
output.debug(`Found config in file "${vercelFilePath}"`);
config = vercelConfig as VercelConfig;
config = vercelConfig;
config[fileNameSymbol] = 'vercel.json';
return config;
}
if (nowConfig !== null) {
output.debug(`Found config in file "${nowFilePath}"`);
config = nowConfig as VercelConfig;
config = nowConfig;
config[fileNameSymbol] = 'now.json';
return config;
}

View File

@@ -4,14 +4,12 @@ import { VercelConfig } from '@vercel/client';
export interface GetProjectNameOptions {
argv: { '--name'?: string };
nowConfig?: VercelConfig;
isFile?: boolean;
paths?: string[];
}
export default function getProjectName({
argv,
nowConfig = {},
isFile = false,
paths = [],
}: GetProjectNameOptions) {
const nameCli = argv['--name'];
@@ -24,10 +22,6 @@ export default function getProjectName({
return nowConfig.name;
}
if (isFile || paths.length > 1) {
return 'files';
}
// Otherwise, use the name of the directory
return basename(paths[0] || '');
}

View File

@@ -30,13 +30,13 @@ export interface NowOptions {
export interface CreateOptions {
// Legacy
nowConfig?: VercelConfig;
isFile?: boolean;
// Latest
name: string;
project?: string;
wantsPublic: boolean;
prebuilt?: boolean;
rootDirectory?: string;
meta: Dictionary<string>;
regions?: string[];
quiet?: boolean;
@@ -113,6 +113,7 @@ export default class Now extends EventEmitter {
name,
project,
prebuilt = false,
rootDirectory,
wantsPublic,
meta,
regions,
@@ -168,6 +169,7 @@ export default class Now extends EventEmitter {
skipAutoDetectionConfirmation,
cwd,
prebuilt,
rootDirectory,
});
if (deployment && deployment.warnings) {

View File

@@ -11,7 +11,7 @@ export default async function inputProject(
client: Client,
org: Org,
detectedProjectName: string,
autoConfirm: boolean
autoConfirm = false
): Promise<Project | string> {
const { output } = client;
const slugifiedName = slugify(detectedProjectName);

View File

@@ -7,7 +7,7 @@ import { validateRootDirectory } from '../validate-paths';
export async function inputRootDirectory(
cwd: string,
output: Output,
autoConfirm: boolean
autoConfirm = false
) {
if (autoConfirm) {
return null;

View File

@@ -158,7 +158,6 @@ export default async function setupAndLink(
withCache: undefined,
quiet,
wantsPublic: localConfig?.public || false,
isFile,
nowConfig: localConfig,
regions: undefined,
meta: {},
@@ -179,7 +178,7 @@ export default async function setupAndLink(
[sourcePath],
createArgs,
org,
!isFile,
true,
path
);

View File

@@ -1,9 +1,9 @@
import fs from 'fs-extra';
import { CantParseJSONFile } from './errors-ts';
export default async function readJSONFile(
export default async function readJSONFile<T>(
file: string
): Promise<Object | null | CantParseJSONFile> {
): Promise<T | null | CantParseJSONFile> {
const content = await readFileSafe(file);
if (content === null) {
return content;

View File

@@ -4,7 +4,6 @@ import { Output } from './output';
import chalk from 'chalk';
import { homedir } from 'os';
import confirm from './input/confirm';
import { prependEmoji, emoji } from './emoji';
import toHumanPath from './humanize-path';
const stat = promisify(lstatRaw);
@@ -54,10 +53,7 @@ export async function validateRootDirectory(
export default async function validatePaths(
output: Output,
paths: string[]
): Promise<
| { valid: true; path: string; isFile: boolean }
| { valid: false; exitCode: number }
> {
): Promise<{ valid: true; path: string } | { valid: false; exitCode: number }> {
// can't deploy more than 1 path
if (paths.length > 1) {
output.print(`${chalk.red('Error!')} Can't deploy more than one path.\n`);
@@ -78,14 +74,12 @@ export default async function validatePaths(
return { valid: false, exitCode: 1 };
}
const isFile = pathStat && !pathStat.isDirectory();
if (isFile) {
output.print(
`${prependEmoji(
'Deploying files with Vercel is deprecated (https://vercel.link/faq-deploy-file)',
emoji('warning')
)}\n`
);
if (!pathStat.isDirectory()) {
output.prettyError({
message: 'Support for single file deployments has been removed.',
link: 'https://vercel.link/no-single-file-deployments',
});
return { valid: false, exitCode: 1 };
}
// ask confirmation if the directory is home
@@ -101,5 +95,5 @@ export default async function validatePaths(
}
}
return { valid: true, path, isFile };
return { valid: true, path };
}

View File

@@ -0,0 +1,60 @@
import { join } from 'path';
import { fileNameSymbol } from '@vercel/client';
import { client } from '../mocks/client';
import deploy from '../../src/commands/deploy';
describe('deploy', () => {
it('should reject deploying a single file', async () => {
client.setArgv('deploy', __filename);
const exitCode = await deploy(client);
expect(exitCode).toEqual(1);
expect(client.outputBuffer).toEqual(
`Error! Support for single file deployments has been removed.\nLearn More: https://vercel.link/no-single-file-deployments\n`
);
});
it('should reject deploying multiple files', async () => {
client.setArgv('deploy', __filename, join(__dirname, 'inspect.test.ts'));
const exitCode = await deploy(client);
expect(exitCode).toEqual(1);
expect(client.outputBuffer).toEqual(
`Error! Can't deploy more than one path.\n`
);
});
it('should reject deploying a directory that does not exist', async () => {
client.setArgv('deploy', 'does-not-exists');
const exitCode = await deploy(client);
expect(exitCode).toEqual(1);
expect(client.outputBuffer).toEqual(
`Error! The specified file or directory "does-not-exists" does not exist.\n`
);
});
it('should reject deploying "version: 1"', async () => {
client.setArgv('deploy');
client.localConfig = {
[fileNameSymbol]: 'vercel.json',
version: 1,
};
const exitCode = await deploy(client);
expect(exitCode).toEqual(1);
expect(client.outputBuffer).toEqual(
'Error! The value of the `version` property within vercel.json can only be `2`.\n'
);
});
it('should reject deploying "version: {}"', async () => {
client.setArgv('deploy');
client.localConfig = {
[fileNameSymbol]: 'vercel.json',
// @ts-ignore
version: {},
};
const exitCode = await deploy(client);
expect(exitCode).toEqual(1);
expect(client.outputBuffer).toEqual(
'Error! The `version` property inside your vercel.json file must be a number.\n'
);
});
});

View File

@@ -1913,59 +1913,6 @@ test('create a production deployment', async t => {
t.is(deployment.target, 'production', JSON.stringify(deployment, null, 2));
});
test('deploying a file should not show prompts and display deprecation', async t => {
const file = fixture('static-single-file/first.png');
const output = await execute([file], {
reject: false,
});
const { stdout, stderr, exitCode } = output;
// Ensure the exit code is right
t.is(exitCode, 0, formatOutput(output));
t.true(stderr.includes('Deploying files with Vercel is deprecated'));
// Ensure `.vercel` was not created
t.is(
await exists(path.join(path.dirname(file), '.vercel')),
false,
'.vercel should not exists'
);
// Test if the output is really a URL
const { href, host } = new URL(stdout);
t.is(host.split('-')[0], 'files');
// Send a test request to the deployment
const response = await fetch(href);
const contentType = response.headers.get('content-type');
t.is(contentType, 'image/png');
t.deepEqual(await readFile(file), await response.buffer());
});
test('deploying more than 1 path should fail', async t => {
const file1 = fixture('static-multiple-files/first.png');
const file2 = fixture('static-multiple-files/second.png');
const { stdout, stderr, exitCode } = await execa(
binaryPath,
[file1, file2, '--public', '--name', session, ...defaultArgs, '--confirm'],
{
reject: false,
}
);
console.log(stderr);
console.log(stdout);
console.log(exitCode);
// Ensure the exit code is right
t.is(exitCode, 1);
t.true(stderr.trim().endsWith(`Can't deploy more than one path.`));
});
test('use build-env', async t => {
const directory = fixture('build-env');

View File

@@ -10,7 +10,7 @@ describe('getProjectName', () => {
expect(project).toEqual('abc');
});
it('should work with now.json', () => {
it('should work with `vercel.json` config', () => {
const project = getProjectName({
argv: {},
nowConfig: { name: 'abc' },
@@ -18,24 +18,6 @@ describe('getProjectName', () => {
expect(project).toEqual('abc');
});
it('should work with a file', () => {
const project = getProjectName({
argv: {},
nowConfig: {},
isFile: true,
});
expect(project).toEqual('files');
});
it('should work with a multiple files', () => {
const project = getProjectName({
argv: {},
nowConfig: {},
paths: ['/tmp/aa/abc.png', '/tmp/aa/bbc.png'],
});
expect(project).toEqual('files');
});
it('should work with a directory', () => {
const project = getProjectName({
argv: {},

View File

@@ -6,3 +6,5 @@ node_modules
!tests/fixtures/nowignore/node_modules
!tests/fixtures/vercelignore-allow-nodemodules/node_modules
!tests/fixtures/vercelignore-allow-nodemodules/sub/node_modules
!tests/fixtures/file-system-api/.output
!tests/fixtures/file-system-api-root-directory/**/.output

View File

@@ -1,6 +1,6 @@
{
"name": "@vercel/client",
"version": "10.2.3-canary.38",
"version": "10.2.3-canary.46",
"main": "dist/index.js",
"typings": "dist/index.d.ts",
"homepage": "https://vercel.com",
@@ -40,7 +40,7 @@
]
},
"dependencies": {
"@vercel/build-utils": "2.12.3-canary.37",
"@vercel/build-utils": "2.12.3-canary.43",
"@zeit/fetch": "5.2.0",
"async-retry": "1.2.3",
"async-sema": "3.0.0",

View File

@@ -1,12 +1,12 @@
import { lstatSync } from 'fs-extra';
import { relative, isAbsolute } from 'path';
import hashes, { mapToObject } from './utils/hashes';
import { hashes, mapToObject, resolveNftJsonFiles } from './utils/hashes';
import { upload } from './upload';
import { buildFileTree, createDebug, parseVercelConfig } from './utils';
import { DeploymentError } from './errors';
import {
NowConfig,
VercelConfig,
VercelClientOptions,
DeploymentOptions,
DeploymentEventType,
@@ -16,7 +16,7 @@ export default function buildCreateDeployment() {
return async function* createDeployment(
clientOptions: VercelClientOptions,
deploymentOptions: DeploymentOptions = {},
nowConfig: NowConfig = {}
nowConfig: VercelConfig = {}
): AsyncIterableIterator<{ type: DeploymentEventType; payload: any }> {
const { path } = clientOptions;
@@ -74,12 +74,7 @@ export default function buildCreateDeployment() {
debug(`Provided 'path' is a single file`);
}
let { fileList } = await buildFileTree(
path,
clientOptions.isDirectory,
debug,
clientOptions.prebuilt
);
let { fileList } = await buildFileTree(path, clientOptions, debug);
let configPath: string | undefined;
if (!nowConfig) {
@@ -114,7 +109,11 @@ export default function buildCreateDeployment() {
};
}
const files = await hashes(fileList);
const hashedFileMap = await hashes(fileList);
const nftFileList = clientOptions.prebuilt
? await resolveNftJsonFiles(hashedFileMap)
: [];
const files = await hashes(nftFileList, hashedFileMap);
debug(`Yielding a 'hashes-calculated' event with ${files.size} hashes`);
yield { type: 'hashes-calculated', payload: mapToObject(files) };

View File

@@ -1,4 +1,8 @@
import { Builder, BuilderFunctions } from '@vercel/build-utils';
import {
Builder,
BuilderFunctions,
ProjectSettings,
} from '@vercel/build-utils';
import { Header, Route, Redirect, Rewrite } from '@vercel/routing-utils';
export { DeploymentEventType } from './utils';
@@ -15,6 +19,7 @@ export interface VercelClientOptions {
apiUrl?: string;
force?: boolean;
prebuilt?: boolean;
rootDirectory?: string;
withCache?: boolean;
userAgent?: string;
defaultName?: string;
@@ -123,12 +128,7 @@ export interface VercelConfig {
scope?: string;
alias?: string | string[];
regions?: string[];
projectSettings?: {
devCommand?: string | null;
buildCommand?: string | null;
outputDirectory?: string | null;
framework?: string | null;
};
projectSettings?: ProjectSettings;
}
/**
@@ -154,9 +154,5 @@ export interface DeploymentOptions {
name?: string;
public?: boolean;
meta?: Dictionary<string>;
projectSettings?: {
devCommand?: string | null;
buildCommand?: string | null;
outputDirectory?: string | null;
};
projectSettings?: ProjectSettings;
}

View File

@@ -1,6 +1,7 @@
import { createHash } from 'crypto';
import fs from 'fs-extra';
import { Sema } from 'async-sema';
import { join, dirname } from 'path';
export interface DeploymentFile {
names: string[];
@@ -15,9 +16,7 @@ export interface DeploymentFile {
* @return {String} hex digest
*/
function hash(buf: Buffer): string {
return createHash('sha1')
.update(buf)
.digest('hex');
return createHash('sha1').update(buf).digest('hex');
}
/**
@@ -39,34 +38,68 @@ export const mapToObject = (
/**
* Computes hashes for the contents of each file given.
*
* @param {Array} of {String} full paths
* @return {Map}
* @param files - absolute file paths
* @param map - optional map of files to append
* @return Map of hash digest to file object
*/
async function hashes(files: string[]): Promise<Map<string, DeploymentFile>> {
const map = new Map<string, DeploymentFile>();
export async function hashes(
files: string[],
map = new Map<string, DeploymentFile>()
): Promise<Map<string, DeploymentFile>> {
const semaphore = new Sema(100);
await Promise.all(
files.map(
async (name: string): Promise<void> => {
await semaphore.acquire();
const data = await fs.readFile(name);
const { mode } = await fs.stat(name);
files.map(async (name: string): Promise<void> => {
await semaphore.acquire();
const data = await fs.readFile(name);
const { mode } = await fs.stat(name);
const h = hash(data);
const entry = map.get(h);
const h = hash(data);
const entry = map.get(h);
if (entry) {
entry.names.push(name);
} else {
map.set(h, { names: [name], data, mode });
}
semaphore.release();
if (entry) {
const names = new Set(entry.names);
names.add(name);
entry.names = [...names];
} else {
map.set(h, { names: [name], data, mode });
}
)
semaphore.release();
})
);
return map;
}
export default hashes;
export async function resolveNftJsonFiles(
hashedFiles: Map<string, DeploymentFile>
): Promise<string[]> {
const semaphore = new Sema(100);
const existingFiles = Array.from(hashedFiles.values());
const resolvedFiles = new Set<string>();
await Promise.all(
existingFiles.map(async file => {
await semaphore.acquire();
const fsPath = file.names[0];
if (fsPath.endsWith('.nft.json')) {
const json = file.data.toString('utf8');
const { version, files } = JSON.parse(json) as {
version: number;
files: string[] | { input: string; output: string }[];
};
if (version === 1 || version === 2) {
for (let f of files) {
const relPath = typeof f === 'string' ? f : f.input;
resolvedFiles.add(join(dirname(fsPath), relPath));
}
} else {
console.error(`Invalid nft.json version: ${version}`);
}
}
semaphore.release();
})
);
return Array.from(resolvedFiles);
}
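A minimal usage sketch, from the perspective of a caller such as `createDeployment()` above, of how the two helpers compose (the file paths are hypothetical):
import { hashes, resolveNftJsonFiles } from './utils/hashes';
async function collectPrebuiltFiles(fileList: string[]) {
  // Hash the regular file list first, then append the files referenced by any
  // `.nft.json` manifests; duplicate names collapse onto the same SHA-1 entry.
  const hashedFileMap = await hashes(fileList);
  const nftFileList = await resolveNftJsonFiles(hashedFileMap);
  return hashes(nftFileList, hashedFileMap);
}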

View File

@@ -1,7 +1,7 @@
import { DeploymentFile } from './hashes';
import { FetchOptions } from '@zeit/fetch';
import { nodeFetch, zeitFetch } from './fetch';
import { join, sep, relative } from 'path';
import { join, sep, relative, posix } from 'path';
import { URL } from 'url';
import ignore from 'ignore';
type Ignore = ReturnType<typeof ignore>;
@@ -81,13 +81,16 @@ const maybeRead = async function <T>(path: string, default_: T) {
export async function buildFileTree(
path: string | string[],
isDirectory: boolean,
debug: Debug,
prebuilt?: boolean
{
isDirectory,
prebuilt,
rootDirectory,
}: Pick<VercelClientOptions, 'isDirectory' | 'prebuilt' | 'rootDirectory'>,
debug: Debug
): Promise<{ fileList: string[]; ignoreList: string[] }> {
const ignoreList: string[] = [];
let fileList: string[];
let { ig, ignores } = await getVercelIgnore(path, prebuilt);
let { ig, ignores } = await getVercelIgnore(path, prebuilt, rootDirectory);
debug(`Found ${ignores.length} rules in .vercelignore`);
debug('Building file tree...');
@@ -119,37 +122,50 @@ export async function buildFileTree(
export async function getVercelIgnore(
cwd: string | string[],
prebuilt?: boolean
prebuilt?: boolean,
rootDirectory?: string
): Promise<{ ig: Ignore; ignores: string[] }> {
const ignores: string[] = prebuilt
? ['*', '!.output', '!.output/**']
: [
'.hg',
'.git',
'.gitmodules',
'.svn',
'.cache',
'.next',
'.now',
'.vercel',
'.npmignore',
'.dockerignore',
'.gitignore',
'.*.swp',
'.DS_Store',
'.wafpicke-*',
'.lock-wscript',
'.env.local',
'.env.*.local',
'.venv',
'npm-debug.log',
'config.gypi',
'node_modules',
'__pycache__',
'venv',
'CVS',
'.output',
];
let ignores: string[] = [];
const outputDir = posix.join(rootDirectory || '', '.output');
if (prebuilt) {
ignores.push('*');
const parts = outputDir.split('/');
parts.forEach((_, i) => {
const level = parts.slice(0, i + 1).join('/');
ignores.push(`!${level}`);
});
ignores.push(`!${outputDir}/**`);
} else {
ignores = [
'.hg',
'.git',
'.gitmodules',
'.svn',
'.cache',
'.next',
'.now',
'.vercel',
'.npmignore',
'.dockerignore',
'.gitignore',
'.*.swp',
'.DS_Store',
'.wafpicke-*',
'.lock-wscript',
'.env.local',
'.env.*.local',
'.venv',
'npm-debug.log',
'config.gypi',
'node_modules',
'__pycache__',
'venv',
'CVS',
`.output`,
];
}
const cwds = Array.isArray(cwd) ? cwd : [cwd];
const files = await Promise.all(
@@ -250,39 +266,31 @@ export const prepareFiles = (
files: Map<string, DeploymentFile>,
clientOptions: VercelClientOptions
): PreparedFile[] => {
const preparedFiles = [...files.keys()].reduce(
(acc: PreparedFile[], sha: string): PreparedFile[] => {
const next = [...acc];
const preparedFiles: PreparedFile[] = [];
for (const [sha, file] of files) {
for (const name of file.names) {
let fileName: string;
const file = files.get(sha) as DeploymentFile;
for (const name of file.names) {
let fileName: string;
if (clientOptions.isDirectory) {
// Directory
fileName =
typeof clientOptions.path === 'string'
? relative(clientOptions.path, name)
: name;
} else {
// Array of files or single file
const segments = name.split(sep);
fileName = segments[segments.length - 1];
}
next.push({
file: isWin ? fileName.replace(/\\/g, '/') : fileName,
size: file.data.byteLength || file.data.length,
mode: file.mode,
sha,
});
if (clientOptions.isDirectory) {
// Directory
fileName =
typeof clientOptions.path === 'string'
? relative(clientOptions.path, name)
: name;
} else {
// Array of files or single file
const segments = name.split(sep);
fileName = segments[segments.length - 1];
}
return next;
},
[]
);
preparedFiles.push({
file: isWin ? fileName.replace(/\\/g, '/') : fileName,
size: file.data.byteLength || file.data.length,
mode: file.mode,
sha,
});
}
}
return preparedFiles;
};
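To make the `prebuilt` allowlist in `getVercelIgnore()` above concrete, a small sketch of the patterns it produces for a hypothetical Root Directory of `packages/app`:
import { posix } from 'path';
// Mirrors the pattern construction above for prebuilt deployments
// (the rootDirectory value is made up for illustration).
const rootDirectory = 'packages/app';
const outputDir = posix.join(rootDirectory || '', '.output');
const ignores: string[] = ['*'];
const parts = outputDir.split('/');
parts.forEach((_, i) => {
  ignores.push(`!${parts.slice(0, i + 1).join('/')}`);
});
ignores.push(`!${outputDir}/**`);
// => ['*', '!packages', '!packages/app', '!packages/app/.output', '!packages/app/.output/**']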

View File

@@ -0,0 +1 @@
foo

View File

@@ -0,0 +1 @@
bar

View File

@@ -0,0 +1 @@
bar

View File

@@ -0,0 +1 @@
baz

View File

@@ -0,0 +1 @@
qux

View File

@@ -0,0 +1 @@
foo

View File

@@ -0,0 +1 @@
bar

View File

@@ -0,0 +1,4 @@
{
"extends": "../tsconfig.json",
"include": ["*.test.ts"]
}

View File

@@ -17,7 +17,11 @@ const toAbsolutePaths = (cwd: string, files: string[]) =>
describe('buildFileTree()', () => {
it('should exclude files using `.nowignore` blocklist', async () => {
const cwd = fixture('nowignore');
const { fileList, ignoreList } = await buildFileTree(cwd, true, noop);
const { fileList, ignoreList } = await buildFileTree(
cwd,
{ isDirectory: true },
noop
);
const expectedFileList = toAbsolutePaths(cwd, ['.nowignore', 'index.txt']);
expect(normalizeWindowsPaths(expectedFileList).sort()).toEqual(
@@ -36,7 +40,11 @@ describe('buildFileTree()', () => {
it('should include the node_modules using `.vercelignore` allowlist', async () => {
const cwd = fixture('vercelignore-allow-nodemodules');
const { fileList, ignoreList } = await buildFileTree(cwd, true, noop);
const { fileList, ignoreList } = await buildFileTree(
cwd,
{ isDirectory: true },
noop
);
const expected = toAbsolutePaths(cwd, [
'node_modules/one.txt',
@@ -54,4 +62,90 @@ describe('buildFileTree()', () => {
normalizeWindowsPaths(ignoreList).sort()
);
});
it('should find root files but ignore .output files when prebuilt=false', async () => {
const cwd = fixture('file-system-api');
const { fileList, ignoreList } = await buildFileTree(
cwd,
{ isDirectory: true, prebuilt: false },
noop
);
const expectedFileList = toAbsolutePaths(cwd, ['foo.txt', 'sub/bar.txt']);
expect(normalizeWindowsPaths(expectedFileList).sort()).toEqual(
normalizeWindowsPaths(fileList).sort()
);
const expectedIgnoreList = ['.output'];
expect(normalizeWindowsPaths(expectedIgnoreList).sort()).toEqual(
normalizeWindowsPaths(ignoreList).sort()
);
});
it('should find .output files but ignore other files when prebuilt=true', async () => {
const cwd = fixture('file-system-api');
const { fileList, ignoreList } = await buildFileTree(
cwd,
{ isDirectory: true, prebuilt: true },
noop
);
const expectedFileList = toAbsolutePaths(cwd, [
'.output/baz.txt',
'.output/sub/qux.txt',
]);
expect(normalizeWindowsPaths(expectedFileList).sort()).toEqual(
normalizeWindowsPaths(fileList).sort()
);
const expectedIgnoreList = ['foo.txt', 'sub'];
expect(normalizeWindowsPaths(expectedIgnoreList).sort()).toEqual(
normalizeWindowsPaths(ignoreList).sort()
);
});
it('should find root files but ignore all .output files when prebuilt=false and rootDirectory=root', async () => {
const cwd = fixture('file-system-api-root-directory');
const { fileList, ignoreList } = await buildFileTree(
cwd,
{ isDirectory: true, prebuilt: false, rootDirectory: 'root' },
noop
);
const expectedFileList = toAbsolutePaths(cwd, [
'foo.txt',
'root/bar.txt',
'someother/bar.txt',
]);
expect(normalizeWindowsPaths(expectedFileList).sort()).toEqual(
normalizeWindowsPaths(fileList).sort()
);
const expectedIgnoreList = ['root/.output', 'someother/.output'];
expect(normalizeWindowsPaths(expectedIgnoreList).sort()).toEqual(
normalizeWindowsPaths(ignoreList).sort()
);
});
it('should find root/.output files but ignore other files when prebuilt=true and rootDirectory=root', async () => {
const cwd = fixture('file-system-api-root-directory');
const { fileList, ignoreList } = await buildFileTree(
cwd,
{ isDirectory: true, prebuilt: true, rootDirectory: 'root' },
noop
);
const expectedFileList = toAbsolutePaths(cwd, [
'root/.output/baz.txt',
'root/.output/sub/qux.txt',
]);
expect(normalizeWindowsPaths(expectedFileList).sort()).toEqual(
normalizeWindowsPaths(fileList).sort()
);
const expectedIgnoreList = ['foo.txt', 'root/bar.txt', 'someother'];
expect(normalizeWindowsPaths(expectedIgnoreList).sort()).toEqual(
normalizeWindowsPaths(ignoreList).sort()
);
});
});

View File

@@ -1,6 +1,6 @@
{
"name": "@vercel/frameworks",
"version": "0.5.1-canary.16",
"version": "0.5.1-canary.17",
"main": "./dist/frameworks.js",
"types": "./dist/frameworks.d.ts",
"files": [

View File

@@ -141,7 +141,6 @@ export const frameworks = [
},
dependency: 'gatsby',
getOutputDirName: async () => 'public',
getFsOutputDir: async () => 'public',
defaultRoutes: async (dirPrefix: string) => {
// This file could be generated by gatsby-plugin-now or gatsby-plugin-zeit-now
try {
@@ -226,7 +225,6 @@ export const frameworks = [
},
},
dependency: 'remix',
getFsOutputDir: async () => 'public',
getOutputDirName: async () => 'public',
defaultRoutes: [
{
@@ -254,10 +252,13 @@ export const frameworks = [
source: '/build/(.*)',
regex: '/build/(.*)',
headers: [
{ key: 'cache-control', value: 'public, max-age=31536000, immutable' },
{
key: 'cache-control',
value: 'public, max-age=31536000, immutable',
},
],
},
]
],
},
{
name: 'Hexo',
@@ -294,7 +295,6 @@ export const frameworks = [
},
},
dependency: 'hexo',
getFsOutputDir: async () => 'public',
getOutputDirName: async () => 'public',
},
{
@@ -332,7 +332,6 @@ export const frameworks = [
},
},
dependency: '@11ty/eleventy',
getFsOutputDir: async () => '_site',
getOutputDirName: async () => '_site',
cachePattern: '.cache/**',
},
@@ -372,22 +371,6 @@ export const frameworks = [
},
},
dependency: '@docusaurus/core',
getFsOutputDir: async (dirPrefix: string) => {
const base = 'build';
try {
const location = join(dirPrefix, base);
const content = await readdir(location, { withFileTypes: true });
// If there is only one file in it that is a dir we'll use it as dist dir
if (content.length === 1 && content[0].isDirectory()) {
return join(base, content[0].name);
}
} catch (error) {
console.error(`Error detecting output directory: `, error);
}
return base;
},
getOutputDirName: async (dirPrefix: string) => {
const base = 'build';
try {
@@ -527,21 +510,6 @@ export const frameworks = [
},
},
dependency: 'docusaurus',
getFsOutputDir: async (dirPrefix: string) => {
const base = 'build';
try {
const location = join(dirPrefix, base);
const content = await readdir(location, { withFileTypes: true });
// If there is only one file in it that is a dir we'll use it as dist dir
if (content.length === 1 && content[0].isDirectory()) {
return join(base, content[0].name);
}
} catch (error) {
console.error(`Error detecting output directory: `, error);
}
return base;
},
getOutputDirName: async (dirPrefix: string) => {
const base = 'build';
try {
@@ -593,7 +561,6 @@ export const frameworks = [
},
},
dependency: 'preact-cli',
getFsOutputDir: async () => 'build',
getOutputDirName: async () => 'build',
defaultRoutes: [
{
@@ -650,7 +617,6 @@ export const frameworks = [
},
},
dependency: '@dojo/cli',
getFsOutputDir: async () => 'output/dist',
getOutputDirName: async () => join('output', 'dist'),
defaultRoutes: [
{
@@ -717,7 +683,6 @@ export const frameworks = [
},
},
dependency: 'ember-cli',
getFsOutputDir: async () => 'dist',
getOutputDirName: async () => 'dist',
defaultRoutes: [
{
@@ -772,7 +737,6 @@ export const frameworks = [
},
},
dependency: '@vue/cli-service',
getFsOutputDir: async () => 'dist',
getOutputDirName: async () => 'dist',
defaultRoutes: [
{
@@ -849,7 +813,6 @@ export const frameworks = [
},
},
dependency: '@scullyio/init',
getFsOutputDir: async () => 'dist',
getOutputDirName: async () => 'dist/static',
},
{
@@ -886,7 +849,6 @@ export const frameworks = [
},
},
dependency: '@ionic/angular',
getFsOutputDir: async () => 'www',
getOutputDirName: async () => 'www',
defaultRoutes: [
{
@@ -940,7 +902,6 @@ export const frameworks = [
},
},
dependency: '@angular/cli',
getFsOutputDir: async () => 'dist',
getOutputDirName: async (dirPrefix: string) => {
const base = 'dist';
try {
@@ -1008,7 +969,6 @@ export const frameworks = [
},
},
dependency: 'polymer-cli',
getFsOutputDir: async () => 'build',
getOutputDirName: async (dirPrefix: string) => {
const base = 'build';
try {
@@ -1078,7 +1038,6 @@ export const frameworks = [
},
},
dependency: 'sirv-cli',
getFsOutputDir: async () => 'public',
getOutputDirName: async () => 'public',
defaultRoutes: [
{
@@ -1128,10 +1087,9 @@ export const frameworks = [
placeholder: 'svelte-kit dev',
},
outputDirectory: {
placeholder: 'public',
value: 'public',
},
},
getFsOutputDir: async () => '.output',
getOutputDirName: async () => 'public',
},
{
@@ -1168,7 +1126,6 @@ export const frameworks = [
},
},
dependency: '@ionic/react',
getFsOutputDir: async () => 'build',
getOutputDirName: async () => 'build',
defaultRoutes: [
{
@@ -1276,7 +1233,6 @@ export const frameworks = [
},
},
dependency: 'react-scripts',
getFsOutputDir: async () => 'build',
getOutputDirName: async () => 'build',
defaultRoutes: [
{
@@ -1378,7 +1334,6 @@ export const frameworks = [
},
},
dependency: 'gridsome',
getFsOutputDir: async () => 'dist',
getOutputDirName: async () => 'dist',
},
{
@@ -1416,7 +1371,6 @@ export const frameworks = [
},
},
dependency: 'umi',
getFsOutputDir: async () => 'dist',
getOutputDirName: async () => 'dist',
defaultRoutes: [
{
@@ -1470,7 +1424,6 @@ export const frameworks = [
},
},
dependency: 'sapper',
getFsOutputDir: async () => '__sapper__/export',
getOutputDirName: async () => '__sapper__/export',
},
{
@@ -1508,7 +1461,6 @@ export const frameworks = [
},
},
dependency: 'saber',
getFsOutputDir: async () => 'public',
getOutputDirName: async () => 'public',
defaultRoutes: [
{
@@ -1577,7 +1529,6 @@ export const frameworks = [
},
},
dependency: '@stencil/core',
getFsOutputDir: async () => 'www',
getOutputDirName: async () => 'www',
defaultRoutes: [
{
@@ -1666,7 +1617,6 @@ export const frameworks = [
},
},
dependency: 'nuxt',
getFsOutputDir: async () => '.output',
getOutputDirName: async () => 'dist',
cachePattern: '.nuxt/**',
defaultRoutes: [
@@ -1724,7 +1674,6 @@ export const frameworks = [
placeholder: 'RedwoodJS default',
},
},
getFsOutputDir: async () => 'public',
getOutputDirName: async () => 'public',
},
{
@@ -1768,16 +1717,6 @@ export const frameworks = [
placeholder: '`public` or `publishDir` from the `config` file',
},
},
getFsOutputDir: async (dirPrefix: string): Promise<string> => {
type HugoConfig = { publishDir?: string };
const config = await readConfigFile<HugoConfig>(
['config.json', 'config.yaml', 'config.toml'].map(fileName => {
return join(dirPrefix, fileName);
})
);
return (config && config.publishDir) || 'public';
},
getOutputDirName: async (dirPrefix: string): Promise<string> => {
type HugoConfig = { publishDir?: string };
const config = await readConfigFile<HugoConfig>(
@@ -1822,13 +1761,6 @@ export const frameworks = [
placeholder: '`_site` or `destination` from `_config.yml`',
},
},
getFsOutputDir: async (dirPrefix: string): Promise<string> => {
type JekyllConfig = { destination?: string };
const config = await readConfigFile<JekyllConfig>(
join(dirPrefix, '_config.yml')
);
return (config && config.destination) || '_site';
},
getOutputDirName: async (dirPrefix: string): Promise<string> => {
type JekyllConfig = { destination?: string };
const config = await readConfigFile<JekyllConfig>(
@@ -1870,7 +1802,6 @@ export const frameworks = [
value: 'public',
},
},
getFsOutputDir: async () => 'public',
getOutputDirName: async () => 'public',
},
{
@@ -1905,7 +1836,6 @@ export const frameworks = [
value: 'build',
},
},
getFsOutputDir: async () => 'build',
getOutputDirName: async () => 'build',
cachePattern: '{vendor/bin,vendor/cache,vendor/bundle}/**',
},
@@ -1940,7 +1870,6 @@ export const frameworks = [
value: 'public',
},
},
getFsOutputDir: async () => 'public',
getOutputDirName: async () => 'public',
defaultVersion: '0.13.0',
},
@@ -1980,7 +1909,6 @@ export const frameworks = [
},
},
dependency: 'vite',
getFsOutputDir: async () => 'dist',
getOutputDirName: async () => 'dist',
},
{
@@ -2018,7 +1946,6 @@ export const frameworks = [
},
},
dependency: 'parcel',
getFsOutputDir: async () => 'dist',
getOutputDirName: async () => 'dist',
defaultRoutes: [
{

View File

@@ -162,9 +162,9 @@ export interface Framework {
dependency?: string;
/**
* Function that returns the name of the directory that the framework outputs
* its build results to. In some cases this is read from a configuration file.
* its File System API build results to, usually called `.output`.
*/
getFsOutputDir: (dirPrefix: string) => Promise<string>;
getFsOutputDir?: (dirPrefix: string) => Promise<string>;
/**
* Function that returns the name of the directory that the framework outputs
* its STATIC build results to. In some cases this is read from a configuration file.

View File

@@ -1,6 +1,6 @@
{
"name": "vercel-plugin-middleware",
"version": "0.0.0-canary.13",
"version": "0.0.0-canary.20",
"license": "MIT",
"main": "./dist/index",
"homepage": "",
@@ -30,7 +30,7 @@
"@types/node-fetch": "^2",
"@types/ua-parser-js": "0.7.36",
"@types/uuid": "8.3.1",
"@vercel/build-utils": "2.12.3-canary.37",
"@vercel/build-utils": "2.12.3-canary.43",
"@vercel/ncc": "0.24.0",
"cookie": "0.4.1",
"formdata-node": "4.3.1",

View File

@@ -0,0 +1,52 @@
import path from 'path';
import * as esbuild from 'esbuild';
const processInjectFile = `
// envOverride is passed by esbuild plugin
const env = envOverride
function cwd() {
return '/'
}
function chdir(dir) {
throw new Error('process.chdir is not supported')
}
export const process = {
argv: [],
env,
chdir,
cwd,
};
`;
export function nodeProcessPolyfillPlugin({ env = {} } = {}): esbuild.Plugin {
return {
name: 'node-process-polyfill',
setup({ initialOptions, onResolve, onLoad }) {
onResolve({ filter: /_virtual-process-polyfill_\.js/ }, ({ path }) => {
return {
path,
sideEffects: false,
};
});
onLoad({ filter: /_virtual-process-polyfill_\.js/ }, () => {
const contents = `const envOverride = ${JSON.stringify(
env
)};\n${processInjectFile}`;
return {
loader: 'js',
contents,
};
});
const polyfills = [
path.resolve(__dirname, '_virtual-process-polyfill_.js'),
];
if (initialOptions.inject) {
initialOptions.inject.push(...polyfills);
} else {
initialOptions.inject = [...polyfills];
}
},
};
}
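A minimal sketch of wiring the plugin into an esbuild invocation (the entry point, output path, and env value are placeholders; the real wiring is shown in `build()` below):
import * as esbuild from 'esbuild';
import { nodeProcessPolyfillPlugin } from './esbuild-plugins';
async function bundleMiddleware() {
  // The plugin injects a virtual module that defines `process`, so
  // `process.env.*` reads resolve against the inlined env object.
  await esbuild.build({
    entryPoints: ['_middleware.ts'],
    bundle: true,
    format: 'cjs',
    banner: { js: '"use strict";' },
    outfile: '.output/server/pages/_middleware.js',
    plugins: [nodeProcessPolyfillPlugin({ env: { MY_FLAG: '1' } })],
  });
}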

View File

@@ -17,6 +17,7 @@ import {
UrlWithParsedQuery,
} from 'url';
import { toNodeHeaders } from './websandbox/utils';
import { nodeProcessPolyfillPlugin } from './esbuild-plugins';
const glob = util.promisify(libGlob);
const SUPPORTED_EXTENSIONS = ['.js', '.ts'];
@@ -80,6 +81,7 @@ export async function build({ workPath }: { workPath: string }) {
banner: {
js: '"use strict";',
},
plugins: [nodeProcessPolyfillPlugin({ env: process.env })],
format: 'cjs',
});
// Create `_ENTRIES` wrapper

View File

@@ -15,6 +15,6 @@ Object {
"sortingIndex": 1,
},
},
"version": 1,
"version": 2,
}
`;

View File

@@ -63,6 +63,32 @@ describe('build()', () => {
).toEqual('1');
});
it('should build simple middleware with env vars', async () => {
const expectedEnvVar = 'expected-env-var';
const fixture = join(__dirname, 'fixtures/env');
process.env.ENV_VAR_SHOULD_BE_DEFINED = expectedEnvVar;
await build({
workPath: fixture,
});
// env var should be inlined in the output
delete process.env.ENV_VAR_SHOULD_BE_DEFINED;
const outputFile = join(fixture, '.output/server/pages/_middleware.js');
expect(await fsp.stat(outputFile)).toBeTruthy();
require(outputFile);
//@ts-ignore
const middleware = global._ENTRIES['middleware_pages/_middleware'].default;
expect(typeof middleware).toStrictEqual('function');
const handledResponse = await middleware({
request: {},
});
expect(String(handledResponse.response.body)).toEqual(expectedEnvVar);
expect(
(handledResponse.response as Response).headers.get('x-middleware-next')
).toEqual(null);
});
it('should create a middleware that runs in strict mode', async () => {
const { middleware } = await setupFixture('use-strict');
const response = await middleware({

View File

@@ -0,0 +1,3 @@
export default req => {
return new Response(process.env.ENV_VAR_SHOULD_BE_DEFINED);
};

View File

@@ -1,7 +1,7 @@
{
"private": false,
"name": "vercel-plugin-go",
"version": "1.0.0-canary.25",
"version": "1.0.0-canary.31",
"main": "dist/index.js",
"license": "MIT",
"files": [
@@ -17,7 +17,7 @@
"prepublishOnly": "tsc"
},
"dependencies": {
"@vercel/build-utils": "2.12.3-canary.37",
"@vercel/build-utils": "2.12.3-canary.43",
"@vercel/go": "1.2.4-canary.4"
},
"devDependencies": {

View File

@@ -1,6 +1,6 @@
{
"name": "vercel-plugin-node",
"version": "1.12.2-canary.29",
"version": "1.12.2-canary.35",
"license": "MIT",
"main": "./dist/index",
"homepage": "https://vercel.com/docs/runtimes#official-runtimes/node-js",
@@ -34,7 +34,7 @@
"@types/node-fetch": "2",
"@types/test-listen": "1.1.0",
"@types/yazl": "2.4.2",
"@vercel/build-utils": "2.12.3-canary.37",
"@vercel/build-utils": "2.12.3-canary.43",
"@vercel/fun": "1.0.3",
"@vercel/ncc": "0.24.0",
"@vercel/nft": "0.14.0",

View File

@@ -1,7 +1,7 @@
{
"private": false,
"name": "vercel-plugin-python",
"version": "1.0.0-canary.26",
"version": "1.0.0-canary.32",
"main": "dist/index.js",
"license": "MIT",
"files": [
@@ -17,8 +17,8 @@
"prepublishOnly": "tsc"
},
"dependencies": {
"@vercel/build-utils": "2.12.3-canary.37",
"@vercel/python": "2.1.2-canary.1"
"@vercel/build-utils": "2.12.3-canary.43",
"@vercel/python": "2.1.2-canary.2"
},
"devDependencies": {
"@types/node": "*",

View File

@@ -1,7 +1,7 @@
{
"private": false,
"name": "vercel-plugin-ruby",
"version": "1.0.0-canary.25",
"version": "1.0.0-canary.31",
"main": "dist/index.js",
"license": "MIT",
"files": [
@@ -17,7 +17,7 @@
"prepublishOnly": "tsc"
},
"dependencies": {
"@vercel/build-utils": "2.12.3-canary.37",
"@vercel/build-utils": "2.12.3-canary.43",
"@vercel/ruby": "1.2.10-canary.0"
},
"devDependencies": {

View File

@@ -1,6 +1,6 @@
{
"name": "@vercel/python",
"version": "2.1.2-canary.1",
"version": "2.1.2-canary.2",
"main": "./dist/index.js",
"license": "MIT",
"homepage": "https://vercel.com/docs/runtimes#official-runtimes/python",

View File

@@ -1,4 +1,3 @@
import { relative, basename } from 'path';
import execa from 'execa';
import { Meta, debug } from '@vercel/build-utils';
@@ -136,17 +135,10 @@ export async function installRequirementsFile({
meta,
args = [],
}: InstallRequirementsFileArg) {
const fileAtRoot = relative(workPath, filePath) === basename(filePath);
// If the `requirements.txt` file is located in the Root Directory of the project and
// the new File System API is used (`avoidTopLevelInstall`), the Install Command
// will have already installed its dependencies, so we don't need to do it again.
if (meta.avoidTopLevelInstall && fileAtRoot) {
debug(
`Skipping requirements file installation, already installed by Install Command`
);
return;
}
// The Vercel platform already handles `requirements.txt` for frontend projects,
// but its installation logic is different: it appears to install all of the
// dependencies globally. For this Runtime we want the installation to happen
// locally, so we run a separate installation here.
if (
meta.isDev &&