Compare commits

...

103 Commits

Author SHA1 Message Date
chloetedder
ada9a48d57 Publish Stable
- @vercel/build-utils@6.0.1
 - vercel@28.14.1
 - @vercel/client@12.3.4
 - @vercel/fs-detectors@3.7.7
 - @vercel/gatsby-plugin-vercel-builder@1.0.3
 - @vercel/go@2.3.0
 - @vercel/hydrogen@0.0.46
 - @vercel/next@3.3.20
 - @vercel/node@2.8.17
 - @vercel/python@3.1.42
 - @vercel/redwood@1.0.53
 - @vercel/remix@1.2.9
 - @vercel/ruby@1.3.58
 - @vercel/static-build@1.3.1
2023-01-30 15:19:54 -07:00
chloetedder
31f3daa5b4 [fs-detectors] getMonorepoDefaultSettings: Fix settings (#9315)
1. `commandForIgnoringBuildStep` should be run at the project directory level, not the monorepo root level
2. Simplify the `installCommand`, since it doesn't need the relative root unless it is `npm`
2023-01-30 18:35:42 +00:00
Steven
fdf7fd6784 [go] Add support for Go 1.19 (#9343)
![image](https://user-images.githubusercontent.com/229881/215526900-fef3e4e0-d1a6-4327-b68f-d8bb9068e2a0.png)
2023-01-30 17:08:10 +00:00
Bryan Eaton
db216f903f [tests] Fix github action restore cache timeout (#9342)
The GitHub Action `actions/cache` has an issue that could cause it to take
an hour and then time out while trying to restore the cache.

Since `actions/setup-node` is no longer used to restore the pnpm cache,
the `timeout-minutes: 5` doesn't do anything for this action.

Instead, we need to put an env variable on the `actions/cache` GitHub
Action to actually set the restore cache timeout.

```
env:
  SEGMENT_DOWNLOAD_TIMEOUT_MINS: 5 # See https://github.com/actions/cache/issues/810
```

- See discussion here https://github.com/vercel/vercel/discussions/9340
- Related to actions/cache#810
- Follow up to #8639
2023-01-30 10:42:00 -05:00
Nathan Rajlich
40a55b11d5 [gatsby-plugin-vercel-builder] Downgrade esbuild to v0.14.47 (#9329)
So that there will only be one copy in the dependency tree between `@vercel/node` and `@vercel/gatsby-plugin-vercel-builder`
2023-01-28 17:03:27 +00:00
Nathan Rajlich
5da537124c [static-build] Add integration test for Gatsby Function (#9330) 2023-01-27 14:00:29 -08:00
JJ Kasper
54aaab83aa [docs] Replace references to @now/next in error docs (#9331)
It's been long enough I think we can change these to refer to
`@vercel/next` instead.

x-ref: [slack
thread](https://vercel.slack.com/archives/C03DQ3QFV7C/p1674849801196389)
2023-01-27 16:59:42 -05:00
Chris Barber
7110cb449b [go] Improved 'go build' error handling (#9320)
The `go` builder doesn't properly handle when `go build` fails. There are 2 failure scenarios:

1. The `go` binary is not found in the `PATH`. There is no message that the `go` executable was not found. Worse, the builder continues and fails trying to run the compiled executable:

![image](https://user-images.githubusercontent.com/97262/214765157-f5244467-7bc8-48e8-8fcf-385efc32b283.png)

The solution is to switch from `spawnSync()` to `execFileSync()` so that if `go` is not found, it will throw like this:

<img width="507" alt="image" src="https://user-images.githubusercontent.com/97262/214765388-792ce59a-79f5-4bc8-8088-472969729558.png">
Note: I temporarily changed the code to spawn `gooooo` to simulate `go` not being found. Either way, `execFileSync()` will throw and we get a helpful message.

2. The `go build` fails due to some actual build error. This should never happen, but better safe than sorry.

The solution here is to allow `go build` to inherit stdout/stderr, so that if `go build` should fail, we can see the output:

<img width="574" alt="image" src="https://user-images.githubusercontent.com/97262/214765614-c414f160-ad8e-4bd2-a628-b5a91dcc88d8.png">
2023-01-27 20:07:03 +00:00
Chris Barber
89a15681d5 [build-utils] Rewrite rename() to be more efficient (#9322)
Around 6 months ago, @styfle brought to my attention how `rename()` in build-utils used `reduce()` and could be written better. So, I rewrote it.

Before, the code would create a new `Files` object and copy the contents of the previous `Files` object. This caused heavy garbage collection and memory thrashing.

Instead, I created a single `Files` object and then added the files to it.

Results:

| # Files | Before | After |
|---|---|---|
| 1,000 | 75 ms | 1 ms |
| 10,000 | 10.6 s | 7 ms |
| 20,000 | 44.6 s | 16 ms |
| 30,000 | 105.5 s | 22 ms |
| 100,000 | Too long | 73 ms |
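
A simplified sketch of the before/after shape (the real `rename()` in `@vercel/build-utils` operates on `Files` maps of file references; the types are reduced here for illustration):

```
type Files = Record<string, unknown>;
type Rename = (name: string) => string;

// Before (conceptually): reduce() spreads the accumulator into a brand-new
// object on every iteration, so an N-file map is copied ~N times, producing
// O(N^2) work and a lot of short-lived garbage.
function renameWithReduce(files: Files, delegate: Rename): Files {
  return Object.entries(files).reduce<Files>(
    (acc, [name, file]) => ({ ...acc, [delegate(name)]: file }),
    {}
  );
}

// After (conceptually): allocate one result object and add entries in a single pass.
function renameSinglePass(files: Files, delegate: Rename): Files {
  const result: Files = {};
  for (const [name, file] of Object.entries(files)) {
    result[delegate(name)] = file;
  }
  return result;
}
```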
2023-01-27 18:21:59 +00:00
Ethan Arrowood
9317543c48 Publish Stable
- vercel@28.14.0
 - @vercel/gatsby-plugin-vercel-builder@1.0.2
 - @vercel/static-build@1.3.0
2023-01-27 10:12:32 -07:00
Nathan Rajlich
e3f326f714 [gatsby-plugin-vercel-builder] Don't delete .vercel/output (#9327)
`.vercel/output` is already made fresh when running `vc build`, so the
plugin should not be doing this. In fact, it makes the `builds.json`
file be wiped away, which we don't want to happen.
2023-01-27 09:08:22 -08:00
Ethan Arrowood
eed6a377f1 [static-build] Cleanup gatsby file changes after static-build completes (#9312)
Adds a new function that restores the user's `gatsby-config.(j|t|mj)s` and `gatsby-node.(j|t|mj)s` files
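A rough sketch of what such a cleanup step could look like, assuming the patched originals were backed up beforehand (the backup suffix and file list are hypothetical, not the actual implementation):

```
import { copyFile, rm } from 'fs/promises';
import { existsSync } from 'fs';
import { join } from 'path';

const GATSBY_FILES = [
  'gatsby-config.js', 'gatsby-config.ts', 'gatsby-config.mjs',
  'gatsby-node.js', 'gatsby-node.ts', 'gatsby-node.mjs',
];

export async function restoreUserGatsbyFiles(projectDir: string): Promise<void> {
  for (const name of GATSBY_FILES) {
    const backup = join(projectDir, `${name}.__vercel_backup`);
    if (existsSync(backup)) {
      // Put the user's original file back and drop the backup copy.
      await copyFile(backup, join(projectDir, name));
      await rm(backup);
    }
  }
}
```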
2023-01-27 16:23:02 +00:00
Nathan Rajlich
c16d9e6784 [static-build] Include Gatsby plugins as regular dependencies (#9313)
Right now, static-build will add the necessary Gatsby plugins to the project's `package.json` at build-time, which has been bothersome for package managers when using a frozen lockfile.

Another issue with it is that we install the `latest` version of the plugin, so the version used becomes disjointed from the CLI version itself, which leads to unpredictability when trying to debug issues or help users roll back to a previous behavior if something breaks.

So instead of patching `package.json` directly, include the plugins as deps of static-build itself, and create symlinks to those paths into the project's `node_modules` directory.
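
A hedged sketch of the symlink idea: the plugin ships as a regular dependency of `@vercel/static-build`, and a link is created inside the user's `node_modules` so Gatsby can resolve it (names and error handling are illustrative):

```
import { mkdir, symlink } from 'fs/promises';
import { dirname, join } from 'path';

async function linkBundledPlugin(projectDir: string, pluginName: string): Promise<void> {
  // Resolve the copy that ships with the builder itself, so the version is
  // pinned to the CLI release rather than whatever `latest` happens to be.
  const source = dirname(require.resolve(`${pluginName}/package.json`));
  const target = join(projectDir, 'node_modules', pluginName);
  await mkdir(dirname(target), { recursive: true });
  // 'junction' keeps the link usable on Windows as well.
  await symlink(source, target, 'junction');
}
```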
2023-01-26 20:15:58 +00:00
Steven
25f6595d36 Publish Stable
- @vercel/build-utils@6.0.0
 - vercel@28.13.2
 - @vercel/client@12.3.3
 - @vercel/edge@0.2.7
 - @vercel/frameworks@1.3.0
 - @vercel/fs-detectors@3.7.6
 - @vercel/gatsby-plugin-vercel-builder@1.0.1
 - @vercel/go@2.2.31
 - @vercel/hydrogen@0.0.45
 - @vercel/next@3.3.19
 - @vercel/node@2.8.16
 - @vercel/python@3.1.41
 - @vercel/redwood@1.0.52
 - @vercel/remix@1.2.8
 - @vercel/ruby@1.3.57
 - @vercel/static-build@1.2.1
2023-01-26 11:19:03 -05:00
JJ Kasper
e8385566fa [next] Ensure we warn when leveraging next export (#9319)
We currently don't make it obvious when `next export` is being leveraged, which de-opts features like `middleware`, `rewrites`, `redirects`, etc., so this adds a notice to let users know when we are using `next export` output.
2023-01-26 04:54:24 +00:00
Ethan Arrowood
52ca35252a [tests] add gatsby related unit tests to static-build (#9318)
Adds some unit tests for the gatsby injection logic. 

These are net new since we've heavily changed the injection logic. The tests and fixtures in `build-fixtures` and `builds.test.js` are seemingly not executed. We may delete those here too.
2023-01-26 03:33:18 +00:00
Steven
2004e3d734 [tests] Fix jest-haste-map: Haste module naming collision (#9317)
```
jest-haste-map: Haste module naming collision: app-three
The following files share their name; please adjust your hasteImpl:
  * <rootDir>/test/fixtures/33-hybrid-monorepo/backend/app-three/package.json
  * <rootDir>/test/fixtures/34-monorepo-no-workspaces/backend/app-three/package.json
```
2023-01-26 00:59:39 +00:00
Steven
49b4394c44 [next] Remove hardcoded NODE_OPTIONS (#9314)
The `NODE_OPTIONS` env var value is wrong here and we shouldn't be overriding the user's setting to begin with.

Typically, it will be assigned in a build container or project settings.
2023-01-25 22:53:06 +00:00
Steven
08cdfa2a05 [tests] Fix turbo config and bump to latest version (#9307)
We had Turborepo misconfigured. `dependsOn: ["^build"]` means run all dependencies' build scripts, but it doesn't mean the package's own build script will run first. For tests, it should always be `dependsOn: ["build"]` (without the caret) instead, to ensure the package's own build is complete before testing. I also learned that `outputs: []` can be dropped now since that's the default behavior.
2023-01-25 20:45:56 +00:00
Nathan Rajlich
f18fa8546f [static-build] Patch the gatsby-node config file to invoke @vercel/gatsby-plugin-vercel-builder (#9308)
We need to ensure that `@vercel/gatsby-plugin-vercel-builder` is executed as the very last plugin, which Gatsby itself doesn't really support in the plugins listing. So instead we'll patch the `gatsby-node` file to invoke the plugin in the project's own `onPostBuild()` hook, which does get executed last. The patching is similar to how it is done with the `gatsby-config` file.
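
A very rough sketch of the patching idea (the appended snippet and module path are hypothetical; the real implementation also handles TypeScript/ESM variants and preserves a user-defined `onPostBuild`):

```
import { appendFile } from 'fs/promises';

async function patchGatsbyNode(gatsbyNodePath: string): Promise<void> {
  // Gatsby runs the site's own onPostBuild after all plugin hooks, so defining
  // it here is what makes the builder plugin execute last.
  const snippet = `
exports.onPostBuild = async (args, options) => {
  await require('@vercel/gatsby-plugin-vercel-builder/gatsby-node').onPostBuild(args, options);
};
`;
  await appendFile(gatsbyNodePath, snippet);
}
```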
2023-01-25 19:07:04 +00:00
Swarnava Sengupta
025344c4a7 [examples] Update Parcel version to support Node 18 (#9310)
Looks like our Parcel examples are currently failing in Node 18.x.
Updating the version to one that also supports Node 18.x.

<img width="814" alt="image"
src="https://user-images.githubusercontent.com/1162991/214491854-92dc39a4-d7b7-442a-a16c-88621b52621a.png">
2023-01-25 10:08:45 -05:00
Chris Swithinbank
8b036e97ea [frameworks] Add default immutable cache path for Astro v2 (#9305)
Astro v2 was released today. It includes [improved support for caching
all hashed build
assets](https://docs.astro.build/en/guides/upgrade-to/v2/#changed-_astro-folder-for-build-assets)
by gathering these all in a single `_astro` directory in build output
(previously these ended up in a number of different places).

This PR updates the Vercel frameworks config to provide out-of-the-box
immutable caching for these assets.

Co-authored-by: Steven <steven@ceriously.com>
2023-01-24 19:05:13 -05:00
Ethan Arrowood
a4240e89e1 [gatsby-plugin-vercel-builder] add support for pathPrefix (#9294)
This PR adds support for the Gatsby `pathPrefix` property.
2023-01-24 23:37:03 +00:00
Nathan Rajlich
0863ae0c6f [gatsby-plugin-vercel-builder] Get rid of the _ssr function (#9304)
Right now we create the SSR serverless function at path `_ssr`, and then symlink all the other pages to that function.

Instead just make the first page encountered be the "real" function, and symlink all the other pages to that endpoint.
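
In Build Output API terms, the idea sketched loosely (paths and names are illustrative, not the plugin's actual code): keep one real `.func` directory for the first SSR page and symlink every other SSR page to it.

```
import { symlink } from 'fs/promises';
import { join } from 'path';

async function linkSsrPages(functionsDir: string, pagePaths: string[]): Promise<void> {
  const [first, ...rest] = pagePaths;
  // The first page keeps the real Serverless Function directory.
  const target = join(functionsDir, `${first}.func`);
  for (const page of rest) {
    // Every other page becomes a symlink to that same function.
    await symlink(target, join(functionsDir, `${page}.func`), 'dir');
  }
}
```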
2023-01-24 22:10:43 +00:00
github-actions[bot]
e09d3d5928 [examples] Upgrade Next.js to version 13.1.5 (#9288)
This auto-generated PR updates Next.js to version 13.1.5
2023-01-24 21:52:38 +00:00
Nathan Rajlich
f5f544ffd2 [gatsby-plugin-vercel-builder] Merge SSR and page-data into single Serverless Function (#9292)
Previously, the page-data Serverless Function and SSR Serverless Function were two distinct functions. They had almost identical file contents and just slightly different handler logic. So here we merge the handler logic into a single function and re-use the same Serverless Function for both page-data and SSR.

This simplifies the output quite a bit and deletes a good amount of code, and helps with build output size, cold boot times, etc.
2023-01-24 19:36:47 +00:00
Lee Robinson
4eb1ff8730 chore: Update Gatsby example README (#9300) 2023-01-24 18:48:04 +00:00
Chris Barber
d4b604f05c [tests] Added cron workflow to update turbo (#9301)
Linear: https://linear.app/vercel/issue/VCCLI-450/add-cron-update-turbo-workflow

Based on the Next.js update script, this PR adds a cron job that checks for the latest Turbo canary release, then updates the `package.json` and `pnpm-lock.yaml` files.
2023-01-24 18:17:39 +00:00
Sean Massa
a3cf05af06 [tests] Update CODEOWNERS (#9277) 2023-01-24 09:20:03 -06:00
Steven
df2bcec830 [edge] Add missing metadata to package.json (#9293)
I noticed this was missing when visiting https://www.npmjs.com/package/@vercel/edge
2023-01-24 09:11:14 +00:00
Nathan Rajlich
f5e81273af [static-build] Add Gatsby plugins as dev dependencies (#9298)
So that the test runs get invalidated by Turbo when code changes within one of the plugin files. This is to avoid a PR breaking something in a plugin that we don't notice because the static-build tests still "pass" due to a cache hit.
2023-01-24 05:32:03 +00:00
Nathan Rajlich
e75d900eaf [static-build] Use tarball URLs for Gatsby plugin E2E tests (#9297)
When running the static-build integration tests, use the tarball URL for the Gatsby plugins, so that any changes made to the plugin(s) in the PR will be reflected in the test deployments without having to publish to npm first.

Example screenshots of a deployment's build logs before I removed the debugging:

<img width="1045" alt="Screenshot 2023-01-23 at 7 35 32 PM" src="https://user-images.githubusercontent.com/71256/214207250-24695a11-051d-4abb-a132-729ab0ab167a.png">
<img width="517" alt="Screenshot 2023-01-23 at 7 35 41 PM" src="https://user-images.githubusercontent.com/71256/214207254-74265e05-ed17-4fbb-b54d-9edc2234a3e4.png">
2023-01-24 04:43:07 +00:00
Chris Barber
1a4f185045 [build-utils] Removed unused execAsync() function (#9200)
This function can sometimes give a TypeScript error because @types/node says `code` should be a `number | null` and yet the code treats it as a `number`:

```
@vercel/build-utils:build: src/fs/run-user-scripts.ts(140,13): error TS2322: Type 'number \| null' is not assignable to type 'number'.
@vercel/build-utils:build:   Type 'null' is not assignable to type 'number'.
```

I'm not sure if there are any other projects that depend on `@vercel/build-utils`, but it doesn't appear that either the Vercel CLI or the build container uses this `execAsync()` function. If I'm mistaken, feel free to close this PR.

As this removal is a breaking change, setting the semver to major and not auto-merging unless others are in agreement.
2023-01-24 01:05:44 +00:00
Steven
35cc7db1a7 [cli] add missing deprecation warnings for builders/runtimes (#9289)
We used to print deprecations on the old build pipeline but this was lost when switching to `vercel build`.

This PR ensures that `vercel build` prints deprecation warnings if available.

## Example

https://npmjs.com/package/@now/node

<img width="751" alt="image" src="https://user-images.githubusercontent.com/229881/214143779-f71c88c8-6ee3-47ce-b459-a86154474991.png">
2023-01-23 23:51:58 +00:00
Nathan Rajlich
f535a20aad [gatsby-plugin-vercel-builder] Remove vercel.config.js file handling (#9290) 2023-01-23 21:31:49 +00:00
Ethan Arrowood
fcea36bf04 Publish Stable
- vercel@28.13.1
 - @vercel/gatsby-plugin-vercel-analytics@1.0.7
 - @vercel/gatsby-plugin-vercel-builder@1.0.0
2023-01-23 12:22:45 -07:00
Steven
93f5a4438b [cli] Improve error message when using legacy @now builders (#8677)
This PR adds a helpful error message when using a legacy `@now` builder.

This mimics the behavior of the build-container.
2023-01-23 17:49:24 +00:00
github-actions[bot]
72265aa9a1 [examples] Upgrade Next.js to version 13.1.4 (#9279)
This auto-generated PR updates Next.js to version 13.1.4

Co-authored-by: vercel-release-bot <infra+release@vercel.com>
2023-01-20 19:23:59 -05:00
Felix Haus
6ee5eb137b [examples] Restrict node version for gridsome example (#9275)
Gridsome has not been updated in [more than 2 years](https://www.npmjs.com/package/gridsome) and still relies on Webpack 4, which makes it incompatible with Node.js >= 17.
To make the example still deployable on Vercel, this adds an engine restriction to the `package.json` for that example.

Same as #9007.
2023-01-20 23:32:04 +00:00
Ethan Arrowood
c4f1c2f5ed [gatsby-plugin-vercel-analytics] add build script (#9260)
Adds the previously omitted build script. Includes necessary changes to turbo.json too
2023-01-20 22:31:41 +00:00
Ethan Arrowood
f35a77c292 [gatsby-plugin-vercel-builder] support trailingSlash configuration (#9278)
Connects the Gatsby `trailingSlash` configuration property to our BOA3
`trailingSlash` property
2023-01-20 14:15:58 -07:00
Andy McKay
4bf3c237ee [cli] Revert some tables back to stdout (#9227)
In https://github.com/vercel/vercel/pull/8735 some output was sent to `stderr`; this sends it to `stdout` instead.
2023-01-20 20:58:28 +00:00
github-actions[bot]
62c991f25e [examples] Upgrade Next.js to version 13.1.3 (#9271)
This auto-generated PR updates Next.js to version 13.1.3
2023-01-20 20:39:26 +00:00
Nathan Rajlich
6ea2db4ae9 [gatsby-plugin-vercel-builder] Various fixes and refactoring (#9268)
* Sets a valid number of seconds for DSG expiration (10 minutes - do we want to make that configurable somehow?)
* Sets the `group` of DSG pages, so that the page-data and SSR pages are associated
* Outputs SSR/DSG pages with `/index.html` suffix so that those paths are accessible
* Updates SSR and page-data functions URL path parsing logic to handle querystrings and the `index.html` scenario
* Remove the unnecessary `rewrite` related to page-data URL paths
* Remove the page-data function static file fallback handling (they are accessible as just regular static files)
* Correct the path for the page-data endpoint when the root-level index page is SSR/DSG
2023-01-20 18:49:44 +00:00
Ethan Arrowood
1943b1ecc0 [gatsby-plugin-vercel-builder] add documentation to README (#9264)
Adds some basic documentation to the plugin
2023-01-20 18:07:25 +00:00
Ethan Arrowood
92f5b6e0c9 Publish Stable
- vercel@28.13.0
 - @vercel/gatsby-plugin-vercel-builder@0.1.2
 - @vercel/next@3.3.18
 - @vercel/node@2.8.15
 - @vercel/static-build@1.2.0
2023-01-20 10:27:31 -07:00
JJ Kasper
ed6ce1149a [next] Ensure outputDirectory is normalized (#9269)
This ensures we normalize an absolute `outputDirectory` passed into the Next.js builder, as an absolute path can cause incorrect resolving. We should probably normalize this higher up as well, but this is a starting point.

Fixes: [slack thread](https://vercel.slack.com/archives/C0289CGVAR2/p1674176040821579?thread_ts=1674168748.257019&cid=C0289CGVAR2)
2023-01-20 17:08:32 +00:00
Ethan Arrowood
fc3611fb80 [static-build] add gatsby v5 fixture (#9267)
Adds a Gatsby v5 fixture using SSR and DSG to demonstrate that the plugin works

Co-authored-by: Nathan Rajlich <n@n8.io>
2023-01-20 03:02:26 -08:00
Nathan Rajlich
ed33c2b27c [gatsby-plugin-vercel-builder] Remove some leftover console.log calls (#9265) 2023-01-19 16:05:48 -08:00
Nathan Rajlich
a7a5bf1a12 [tests] Pass in builder to runBuildLambda() (#9262)
Removes the need for `next`/`static-build` to be present in the root `package.json` file.
2023-01-19 15:10:06 -08:00
JJ Kasper
cc687b3880 [next] Correct RSC route for app dir with trailingSlash (#9263) 2023-01-19 15:03:58 -08:00
Nathan Rajlich
053ec92d5f [node] Add build logs probe for TypeScript usage messages (#9261)
Follow-up to https://github.com/vercel/vercel/pull/9258, even though that issue seemed to only happen when linked to the monorepo locally. In any case, this test will ensure those log messages are intact for any other change around that part of the codebase in the future.
2023-01-19 21:39:27 +00:00
Ethan Arrowood
4838dc336a [static-build] run pnpm install --lockfile-only after injecting gatsby plugins (#9259)
Fixes a bug where, after injecting plugins and modifying the user's `package.json`, pnpm would fail a normal install
2023-01-19 20:07:53 +00:00
Nathan Rajlich
eae45f4019 [node] Fix TypeScript built-in compiler log message check (#9258)
This check probably started breaking after the pnpm switch, so this switches to a simpler solution that doesn't require comparing the paths.
2023-01-19 13:12:31 +00:00
Ethan Arrowood
02feb564a7 [static-build][fs-detectors] add gatsby builder plugin injection (#9252)
Modifies the `@vercel/static-build` injection logic in order to now inject both the analytics and builder gatsby plugins.
2023-01-19 03:34:47 +00:00
Nathan Rajlich
e174a06673 Publish Stable
- vercel@28.12.8
 - @vercel/gatsby-plugin-vercel-builder@0.1.1
 - @vercel/next@3.3.17
 - @vercel/remix@1.2.7
2023-01-18 15:19:12 -08:00
Nathan Rajlich
de034943af [gatsby-plugin-vercel-builder] Fix turbo cache outputs configuration (#9257)
The published version on npm was missing the compiled code because Turbo
was not configured to cache them properly.
2023-01-18 15:10:28 -08:00
JJ Kasper
b3862271a5 [next] Fix normalize route with middleware and basePath (#9255)
This ensures we properly match the index route when normalizing routes with `x-nextjs-data` for middleware. 

x-ref: [slack thread](https://vercel.slack.com/archives/C03S8ED1DKM/p1674034533265989)
2023-01-18 21:30:30 +00:00
Steven
aaceeef604 [tests] Add test for ./examples/remix (#9251)
- Related to #9249
2023-01-18 19:13:55 +00:00
Steven
ad107ecf79 [tests] Split more dev tests (#9230)
This PR attempts to balance the tests so they run concurrently and therefore faster.

I also sorted the tests so they are deterministic when splitting/chunking.
2023-01-18 17:35:45 +00:00
Steven
79ef5c3724 Publish Stable
- vercel@28.12.7
 - @vercel/client@12.3.2
 - @vercel/gatsby-plugin-vercel-builder@0.1.0
 - @vercel/next@3.3.16
 - @vercel/node-bridge@3.1.10
 - @vercel/node@2.8.14
 - @vercel/remix@1.2.6
2023-01-18 08:39:44 -05:00
Nathan Rajlich
02ff265074 [remix] Fix config file dynamic import (#9249)
Follow-up to https://github.com/vercel/vercel/pull/8793, which breaks dynamic import of the config file, causing some issues for remix users: https://github.com/orgs/vercel/discussions/1282

The underlying error is `MODULE_NOT_FOUND`.

`import()` should work with `file://` URIs; however, since it's using TypeScript, the `import` gets compiled to `require()`, which does not support the `file://` protocol scheme.

Supersedes https://github.com/vercel/vercel/pull/9248.
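
A small sketch of the pitfall (illustrative; the builder's actual fix may differ):

```
import { pathToFileURL } from 'url';

// Written as ESM, this is fine:
export async function loadRemixConfig(configPath: string) {
  const mod = await import(pathToFileURL(configPath).href);
  return mod.default ?? mod;
}

// But if TypeScript compiles the file above to CommonJS, `import(...)` is
// emitted as `require(...)`, and require('file:///app/remix.config.mjs')
// throws MODULE_NOT_FOUND because require() only accepts filesystem paths.
// The fix is to ensure a genuine dynamic import survives compilation.
```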
2023-01-18 11:24:42 +00:00
Ethan Arrowood
ae89b8b8be [gatsby-plugin-vercel-builder] Implement @vercel/gatsby-plugin-vercel-builder (#9218)

Implement the Build Output API v3 Gatsby plugin for use within `@vercel/static-build`.

Supersedes https://github.com/vercel/vercel/pull/8259.
2023-01-18 02:24:32 +00:00
Steven
4ccdcde463 Publish Stable
- @vercel/build-utils@5.9.0
 - vercel@28.12.6
 - @vercel/client@12.3.1
 - @vercel/fs-detectors@3.7.5
 - @vercel/go@2.2.30
 - @vercel/hydrogen@0.0.44
 - @vercel/next@3.3.15
 - @vercel/node@2.8.13
 - @vercel/python@3.1.40
 - @vercel/redwood@1.0.51
 - @vercel/remix@1.2.5
 - @vercel/ruby@1.3.56
 - @vercel/static-build@1.1.7
2023-01-17 19:25:26 -05:00
Nathan Rajlich
22d3ee160b [build-utils] Add includeDirectories option to glob() (#9245)
Instead of always including empty directories in `glob()`, make it an
opt-in behavior because technically it's a breaking change to include
them by default.
2023-01-17 19:21:40 -05:00
Sean Massa
6d97e1673e Publish Stable
- vercel@28.12.5
 - @vercel/client@12.3.0
 - @vercel/node-bridge@3.1.9
 - @vercel/node@2.8.12
2023-01-17 13:46:58 -06:00
Nathan Rajlich
522565f6e5 [node-bridge] Add missing lazy dependencies (#9231)
These were missing from the compiled ncc build of `helpers.ts`, and thus causing an error at runtime because the deps are not available within the Serverless Function.
2023-01-14 01:41:50 +00:00
Steven
07bf81ab10 [client] Add sliding window delay when polling deployment complete (#9222)
The previous delay of 1500ms was causing some users to hit the API rate limits. This doesn't normally happen with a single deployment, but it can happen with several concurrent deployments (for example a monorepo with many projects).

We don't need to be polling so often, so this PR changed the polling delay to the following:

- During 0s-15s: check every 1 second
- During 15s-60s: check every 5 seconds
- During 1m-5m: check every 15 seconds
- During 5m-10m: check every 30 seconds
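
A minimal sketch of that schedule (the real client code structures this differently):

```
const SECOND = 1000;
const MINUTE = 60 * SECOND;

// `elapsed` is how long we've been polling, in milliseconds.
function getPollDelay(elapsed: number): number {
  if (elapsed < 15 * SECOND) return 1 * SECOND; // 0s-15s: every 1 second
  if (elapsed < 1 * MINUTE) return 5 * SECOND;  // 15s-60s: every 5 seconds
  if (elapsed < 5 * MINUTE) return 15 * SECOND; // 1m-5m: every 15 seconds
  return 30 * SECOND;                           // 5m onward: every 30 seconds
}
```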
2023-01-14 00:27:14 +00:00
Steven
35024a4e3a [tests] Split up dev tests to increase concurrency (#9228)
This will speed up CI because we can run more tests concurrently.

Co-authored-by: kodiakhq[bot] <49736102+kodiakhq[bot]@users.noreply.github.com>
2023-01-13 18:53:57 -05:00
Sean Massa
c1df9bca19 [tests] skip private packages (#9229) 2023-01-13 17:24:50 -06:00
Nathan Rajlich
4c1cdd1f0f [client] Send empty directory entries to POST create deployment (#9118)
Update `@vercel/client` to send empty directory entries to the `POST` create deployment endpoint. This makes it so that CLI deployments will have empty directories re-populated in the build-container when doing `vc deploy`.

Follow-up to #9103.
2023-01-13 22:24:31 +00:00
Sean Massa
b5cdc82a1c Publish Stable
- @vercel/build-utils@5.8.3
 - vercel@28.12.4
 - @vercel/client@12.2.31
 - @vercel/edge@0.2.6
 - @vercel/error-utils@1.0.8
 - @vercel/frameworks@1.2.4
 - @vercel/fs-detectors@3.7.4
 - @vercel/gatsby-plugin-vercel-analytics@1.0.6
 - @vercel/go@2.2.29
 - @vercel/hydrogen@0.0.43
 - @vercel/next@3.3.14
 - @vercel/node-bridge@3.1.8
 - @vercel/node@2.8.11
 - @vercel/python@3.1.39
 - @vercel/redwood@1.0.50
 - @vercel/remix@1.2.4
 - @vercel/routing-utils@2.1.8
 - @vercel/ruby@1.3.55
 - @vercel/static-build@1.1.6
 - @vercel/static-config@2.0.11
2023-01-13 15:45:03 -06:00
Nathan Rajlich
c7851404b3 [*] Remove "workspace:" (#9225) 2023-01-13 15:42:29 -06:00
Sean Massa
e54da8a2e5 Publish Stable
- @vercel/build-utils@5.8.2
 - vercel@28.12.3
 - @vercel/client@12.2.30
 - @vercel/edge@0.2.5
 - @vercel/error-utils@1.0.7
 - @vercel/frameworks@1.2.3
 - @vercel/fs-detectors@3.7.3
 - @vercel/gatsby-plugin-vercel-analytics@1.0.5
 - @vercel/go@2.2.28
 - @vercel/hydrogen@0.0.42
 - @vercel/next@3.3.13
 - @vercel/node-bridge@3.1.7
 - @vercel/node@2.8.10
 - @vercel/python@3.1.38
 - @vercel/redwood@1.0.49
 - @vercel/remix@1.2.3
 - @vercel/routing-utils@2.1.7
 - @vercel/ruby@1.3.54
 - @vercel/static-build@1.1.5
 - @vercel/static-config@2.0.10
2023-01-13 15:06:45 -06:00
Sean Massa
a066bedf95 add the lockfile before commit 2023-01-13 15:06:16 -06:00
Sean Massa
09b23e53ba update lockfile because lerna did not do its job 2023-01-13 15:05:29 -06:00
Sean Massa
b793a67588 Publish Stable
- @vercel/build-utils@5.8.1
 - vercel@28.12.2
 - @vercel/client@12.2.29
 - @vercel/edge@0.2.4
 - @vercel/error-utils@1.0.6
 - @vercel/frameworks@1.2.2
 - @vercel/fs-detectors@3.7.2
 - @vercel/gatsby-plugin-vercel-analytics@1.0.4
 - @vercel/go@2.2.27
 - @vercel/hydrogen@0.0.41
 - @vercel/next@3.3.12
 - @vercel/node-bridge@3.1.6
 - @vercel/node@2.8.9
 - @vercel/python@3.1.37
 - @vercel/redwood@1.0.48
 - @vercel/remix@1.2.2
 - @vercel/routing-utils@2.1.6
 - @vercel/ruby@1.3.53
 - @vercel/static-build@1.1.4
 - @vercel/static-config@2.0.9
2023-01-13 15:01:55 -06:00
Sean Massa
31dd354b3a remove unnecessary checkout 2023-01-13 15:00:52 -06:00
Sean Massa
529ff3b2d7 add "version" lifecycle hook for lerna to update lockfile before commit 2023-01-13 14:59:49 -06:00
Sean Massa
e71d5638ee Publish Stable
- @vercel/build-utils@5.8.0
 - vercel@28.12.1
 - @vercel/client@12.2.28
 - @vercel/edge@0.2.3
 - @vercel/error-utils@1.0.5
 - @vercel/frameworks@1.2.1
 - @vercel/fs-detectors@3.7.1
 - @vercel/gatsby-plugin-vercel-analytics@1.0.3
 - @vercel/go@2.2.26
 - @vercel/hydrogen@0.0.40
 - @vercel/next@3.3.11
 - @vercel/node-bridge@3.1.5
 - @vercel/node@2.8.8
 - @vercel/python@3.1.36
 - @vercel/redwood@1.0.47
 - @vercel/remix@1.2.1
 - @vercel/routing-utils@2.1.5
 - @vercel/ruby@1.3.52
 - @vercel/static-build@1.1.3
 - @vercel/static-config@2.0.8
2023-01-13 14:47:25 -06:00
Ethan Arrowood
8c16e765ee [tests] adjust root and api deps to use worskpace:* (#9223) 2023-01-13 14:29:23 -06:00
Ethan Arrowood
a008c9c7fe [build-utils] Support empty directory entries for glob() and download() (#9164)
Reopening #9103 since it was reverted.
2023-01-13 12:01:37 -08:00
Sean Massa
62b28ad0b4 Publish Stable
- @vercel/build-utils@5.7.6
 - vercel@28.12.0
 - @vercel/client@12.2.27
 - @vercel/edge@0.2.2
 - @vercel/error-utils@1.0.4
 - @vercel/frameworks@1.2.0
 - @vercel/fs-detectors@3.7.0
 - @vercel/gatsby-plugin-vercel-analytics@1.0.2
 - @vercel/go@2.2.25
 - @vercel/hydrogen@0.0.39
 - @vercel/next@3.3.10
 - @vercel/node-bridge@3.1.4
 - @vercel/node@2.8.7
 - @vercel/python@3.1.35
 - @vercel/redwood@1.0.46
 - @vercel/remix@1.2.0
 - @vercel/routing-utils@2.1.4
 - @vercel/ruby@1.3.51
 - @vercel/static-build@1.1.2
 - @vercel/static-config@2.0.7
2023-01-13 14:00:46 -06:00
Sean Massa
7c50f2916e [frameworks] update framework detectors with packages (#9167)
In [a different PR](https://github.com/vercel/vercel/pull/9009), detecting frameworks by package name will also provide framework version metadata to the build. Should we update these framework detectors to look up their respective packages or were they not doing that already for a reason?

I left the old detectors in place as fallbacks, which looks like:

```
some: [
  {
    path: 'package.json',
    matchContent:
      '"(dev)?(d|D)ependencies":\\s*{[^}]*"remix":\\s*".+?"[^}]*}',
  },
  {
    path: 'remix.config.js',
  },
],
```

Please review carefully.
2023-01-13 19:04:59 +00:00
github-actions[bot]
a521dadafb [examples] Upgrade Next.js to version 13.1.2 (#9217)
This auto-generated PR updates Next.js to version 13.1.2

Co-authored-by: vercel-release-bot <infra+release@vercel.com>
2023-01-13 13:26:57 -05:00
Steven
1efb5d6c0d [fs-detectors] Add support for api/**/*.tsx zero config detection (#9216)
There are cases where you would want to use `.tsx` with Serverless Functions, such as OG Image Generation.

This PR adds zero config detection for `.tsx` file extensions and also consolidates the glob pattern into a single string matching all valid `@vercel/node` cases.

https://twitter.com/hasparus/status/1593136849404694528

https://linear.app/vercel/issue/VCCLI-322
2023-01-13 17:59:49 +00:00
Sean Massa
72df5ce8f6 [examples][frameworks] add tests for all examples being detected (#9197)
This PR adds tests (under `packages/fs-detectors`) that ensure all `./examples` get detected as the appropriate framework.
2023-01-13 17:24:00 +00:00
Chris Barber
e20b74687f [docs] Updated local dev instructions in CLI readme (#9215)
When I run `pnpm`, it just prints the help screen. I need to specify the install command `pnpm i`.
2023-01-13 16:25:53 +00:00
Sean Massa
8f1358bd15 [cli][frameworks][fs-detectors][next] detect framework versions (#9009)
This PR:

- updates `packages/frameworks` to have most supported frameworks specify which dependency version should reflect the overall framework version
- updates `packages/fs-detectors` to allow framework detection that returns the full `Framework` record instead of just the slug
- updates `packages/next` to return the detected Next.js version in the build result
- updates `packages/cli` to leverage these changes so that `vc build` can add `framework: { version: string; }` to `config.json` output

The result is that Build Output API and supported frameworks will return their framework version in the build result of `vc build` when possible, which is used by the build container  when creating the deployment. The dashboard later retrieves this value to display in richer deployment outputs.

Supports:

- https://github.com/vercel/api/pull/15601
- https://github.com/vercel/front/pull/18319

---

With the related build container updates, we get to see Next.js version in the build output. You'll see this with BOA+Prebuilt or a normal deploy:

<img width="1228" alt="Screen Shot 2022-12-09 at 2 48 12 PM" src="https://user-images.githubusercontent.com/41545/206793639-f9cd3bdf-b822-45dd-9564-95b94994271d.png">

---

### The Path to this PR

I went through all the supported frameworks and figured out how to best determine their versions. For most of them, we can check a known dependency's installed version number. 

We can get most of the way by only checking npm. For a handful, we'd have to support Go/Ruby/Rust/Whatever dependencies.

I started with a more complex method signature to allow for later expansion without changing the signature. It looked like this, in practice:

```
async getVersion(dependencies: DependencyMap) => dependencies['next']
```

However, after checking all currently supported frameworks, I don't think this will end up being necessary. It also has the constraint that all dependencies have to be gathered and presented to the function even though it only needs to check for one or two. That's not a huge deal if we have them already where we need them, but we don't. We could use a variant here where this function does its own lookups, but this seemed unnecessary and would beg for duplication and small variances that could cause bugs.

Further, if we only look at `package.json`, we're going to see either a specific version or a version range. To be precise, we have to look at the installed version of the package. That means checking one of the various types of lockfiles that can exist or poking into node_modules.

If we poke into node_modules to detect the installed version, we introduce another point where Yarn 3 (default mode) will not be supported. If we read lockfiles, we have to potentially parse `npm`, `pnpm`, and `yarn` lockfiles.

If we use `npm ls <package-name>`, that also fails in Yarn 3 (default mode). We could accept that and go forward anyway, which would look like:

```
const args = `ls ${packageName} --depth=0 --json`.split(' ');
const { stdout } = await execa('npm', args, { cwd });
const regex = new RegExp(String.raw`${packageName}@([\.\d]+)`);
const matches = stdout.match(regex);
if (matches) {
  return matches[1];
}
```

But it turns out there's a `--json` option! That's what I ended up using, for now.
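
A sketch of that `--json` route (illustrative, not the actual fs-detectors code): `npm ls <pkg> --json` prints a dependency tree whose `dependencies[<pkg>].version` is the installed version, which avoids regexing the human-readable output.

```
import { execFile } from 'child_process';
import { promisify } from 'util';

const execFileAsync = promisify(execFile);

async function getInstalledVersion(
  packageName: string,
  cwd: string
): Promise<string | undefined> {
  try {
    const { stdout } = await execFileAsync(
      'npm',
      ['ls', packageName, '--depth=0', '--json'],
      { cwd }
    );
    const tree = JSON.parse(stdout) as {
      dependencies?: Record<string, { version?: string }>;
    };
    return tree.dependencies?.[packageName]?.version;
  } catch {
    // Not installed, or not resolvable via `npm ls` (e.g. Yarn 3 PnP).
    return undefined;
  }
}
```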

We could explore the lockfile route more, but after some initial digging, it's non-trivial. There are 3 main lockfiles we'd want to check for (npm, pnpm, and yarn) and there are different lockfile versions that put necessary data in different places. I looked for existing tools that parse this, but I didn't find any. We could certainly go down this path, but the effort doesn't seem worth it when `npm ls` gets us really close.

---

### Follow-up Versioning

Now that we know how to determine version per framework, we can vary configuration by version. In a future PR, we could allow a given value to vary by version number:

```
name: (version) => {
  if (semver.gt(version, '9.8.7')) {
return 'some-framework-2';
  }

  return 'some-framework';
}
```

However, it may still be easier to differentiate significant versions by adding multiple entries in the list.
2023-01-13 07:50:00 +00:00
Logan McAnsh
74c0b3e1bb [remix] Add support for .cjs and .mjs extensions for "remix.config" (#8793)
Adds support for `remix.config.mjs` and `remix.config.cjs` and
also updates the example/fixtures to the latest version of Remix.

See: https://github.com/remix-run/remix/pull/3675
2023-01-12 19:26:52 -08:00
Chris Barber
eb0a031aeb [tests] Adding missing jest-matcher-utils dependency (#9214)
Fixes the following error when tests are run:

```
Cannot find module 'jest-matcher-utils' from 'test/mocks/matchers/to-output.ts'
```

Log: https://github.com/vercel/vercel/actions/runs/3898221824/jobs/6656730316#step:10:3049
2023-01-12 23:32:11 +00:00
Chris Barber
f327be2d1f [cli] Bumped get latest version test timeouts (#9213) 2023-01-12 22:45:21 +00:00
Ethan Arrowood
16a5867f6b [fs-detectors] use project path instead of name for turbo filter (#9210)
Fixes: https://github.com/orgs/vercel/discussions/1218
2023-01-12 18:01:38 +00:00
Steven
90cbd675fa [cli] Remove qs dependency (#9206)
There is no need for `qs` because the query string for `teamId` is already handled here:

5b88f673f8/packages/cli/src/util/client.ts (L99)
2023-01-12 02:01:16 +00:00
Ethan Arrowood
9c768b98b7 [tests] Migrate from yarn to pnpm (#9198)

yarn has become increasingly difficult to use, as the v1 release we rely on no longer receives updates. pnpm is faster and actively maintained.

This PR migrates us to pnpm.
2023-01-11 23:35:13 +00:00
JJ Kasper
4c3bc05322 Publish Stable
- @vercel/build-utils@5.7.5
 - vercel@28.11.1
 - @vercel/client@12.2.26
 - @vercel/fs-detectors@3.6.2
 - @vercel/gatsby-plugin-vercel-analytics@1.0.1
 - @vercel/go@2.2.24
 - @vercel/hydrogen@0.0.38
 - @vercel/next@3.3.9
 - @vercel/node@2.8.6
 - @vercel/python@3.1.34
 - @vercel/redwood@1.0.45
 - @vercel/remix@1.1.7
 - @vercel/ruby@1.3.50
 - @vercel/static-build@1.1.1
2023-01-11 12:01:03 -08:00
JJ Kasper
3f47587a8b [next] fix lambda creation when using edge runtime (#9204)
Co-authored-by: remorses <beats.by.morse@gmail.com>
2023-01-11 09:35:21 -08:00
Nathan Rajlich
84f93d8af4 [build-utils] Support directory entries in Lambda#createZip() (#9201)
Adds support for empty directory entries in the Lambda `createZip()` function.
2023-01-11 10:21:42 +00:00
Chris Barber
e1aaf8080b [cli] Replace update-notifier dependency with build in (#9098)
This PR replaces the `update-notifier` dependency with a custom implementation.

There are a few reasons: the dependency is quite large, it requires ESM in order to update, it can sometimes suggest an update to an older version, and it uses dependencies with known security issues.

The result looks like:

<img width="768" alt="image" src="https://user-images.githubusercontent.com/97262/208452226-b7508299-f830-4d42-a96a-7646ec8227aa.png">

Note: This PR is the successor to https://github.com/vercel/vercel/pull/8090.
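
A hedged sketch of the core of such a check, assuming the npm registry's `/latest` endpoint and the `semver` package (the actual CLI implementation adds caching, throttling, and nicer output):

```
import { get } from 'https';
import semver from 'semver';

function fetchLatestVersion(pkg: string): Promise<string> {
  return new Promise((resolve, reject) => {
    get(`https://registry.npmjs.org/${pkg}/latest`, res => {
      let body = '';
      res.on('data', chunk => (body += chunk));
      res.on('end', () => resolve(JSON.parse(body).version));
    }).on('error', reject);
  });
}

export async function printUpdateNotice(pkg: string, current: string): Promise<void> {
  const latest = await fetchLatestVersion(pkg);
  if (semver.gt(latest, current)) {
    console.log(`Update available! ${current} -> ${latest}`);
    console.log(`Run \`npm i -g ${pkg}\` to update.`);
  }
}
```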
2023-01-11 03:45:36 +00:00
JJ Kasper
0857352967 [next] Fix dynamic routes order for app dir (#9202)
This ensures the RSC dynamic routes come before the HTML route, since they are more specific and the HTML route can otherwise end up matching first unexpectedly.

Test deployment with fix: https://discord-gmki6w031-vtest314-ijjk-testing.vercel.app/

Fixes: https://github.com/vercel/next.js/issues/44728
2023-01-11 00:24:28 +00:00
Chris Barber
f92d229a63 [cli] Rollback team check (#9120)
https://linear.app/vercel/issue/VCCLI-377/rollback-failing-for-enterprise-teams

When running `vc rollback` for a deployment belonging to another team, the command will fail when requesting the rollback. Technically, the command should have failed earlier, when trying to get the deployment. This PR checks that the current team matches the deployment being rolled back.

![image](https://user-images.githubusercontent.com/97262/210431585-ffb73658-b15c-4adb-b110-a8c5e816db32.png)

This PR also cleans up a bunch of deployment-related things. There were 3 functions to get a deployment and now there's just one, which uses the latest v13 API. Get deployment error handling now throws instead of returning an error. The `Deployment` type definition has been updated to match the v13 response, and mock deployment data was also updated.
2023-01-10 21:05:08 +00:00
Steven
427a2a58cf [tests] Update turbo to 1.7.0-canary.9 (#9193)
Let's try out turbo canary

Co-authored-by: tknickman <tom.knickman@vercel.com>
2023-01-10 12:51:12 -05:00
Ethan Arrowood
ccb5f301ad [static-build] @vercel/static-build to use @vercel/gatsby-plugin-vercel-analytics (#9194)
Updates the `static-build` injector to use the new plugin.

Still need to verify somehow that the newly published plugin is working as expected. It should be fine since it was a copy-paste from the previous plugin repo, but always good to verify before we break everything! 

This PR also updates the README in `@vercel/gatsby-plugin-vercel-analytics`
2023-01-10 15:22:31 +00:00
507 changed files with 107668 additions and 23189 deletions


@@ -43,5 +43,8 @@ packages/static-build/test/cache-fixtures
 # redwood
 packages/redwood/test/fixtures
+# remix
+packages/remix/test/fixtures
 # gatsby-plugin-vercel-analytics
 packages/gatsby-plugin-vercel-analytics

.github/CODEOWNERS vendored

@@ -3,9 +3,6 @@
 * @TooTallNate @EndangeredMassa @styfle @cb1kenobi @Ethan-Arrowood
 /.github/workflows @TooTallNate @EndangeredMassa @styfle @cb1kenobi @Ethan-Arrowood @ijjk
-/packages/cli/src/commands/domains @mglagola @anatrajkovska
-/packages/cli/src/commands/certs @mglagola @anatrajkovska
-/packages/cli/src/commands/env @TooTallNate @EndangeredMassa @styfle @cb1kenobi @Ethan-Arrowood
 /packages/fs-detectors @TooTallNate @EndangeredMassa @styfle @cb1kenobi @Ethan-Arrowood @agadzik @chloetedder
 /packages/node-bridge @TooTallNate @EndangeredMassa @styfle @cb1kenobi @Ethan-Arrowood @ijjk
 /packages/next @TooTallNate @EndangeredMassa @styfle @cb1kenobi @Ethan-Arrowood @ijjk


@@ -6,7 +6,7 @@ Please read our [Code of Conduct](CODE_OF_CONDUCT.md) and follow it in all your
 ## Local development
-This project is configured in a monorepo, where one repository contains multiple npm packages. Dependencies are installed and managed with `yarn`, not `npm` CLI.
+This project is configured in a monorepo, where one repository contains multiple npm packages. Dependencies are installed and managed with `pnpm`, not `npm` CLI.
 To get started, execute the following:
@@ -14,22 +14,22 @@ To get started, execute the following:
 git clone https://github.com/vercel/vercel
 cd vercel
 corepack enable
-yarn install
-yarn bootstrap
-yarn build
-yarn lint
-yarn test-unit
+pnpm install
+pnpm bootstrap
+pnpm build
+pnpm lint
+pnpm test-unit
 ```
 Make sure all the tests pass before making changes.
 ### Running Vercel CLI Changes
-You can use `yarn dev` from the `cli` package to invoke Vercel CLI with local changes:
+You can use `pnpm dev` from the `cli` package to invoke Vercel CLI with local changes:
 ```
 cd ./packages/cli
-yarn dev <cli-commands...>
+pnpm dev <cli-commands...>
 ```
 See [CLI Local Development](../packages/cli#local-development) for more details.
@@ -39,7 +39,7 @@ See [CLI Local Development](../packages/cli#local-development) for more details.
 Once you are done with your changes (we even suggest doing it along the way), make sure all the tests still pass by running:
 ```
-yarn test-unit
+pnpm test-unit
 ```
 from the root of the project.
@@ -102,7 +102,7 @@ When you run this script, you'll see all the imported files. If anything file is
 Sometimes you want to test changes to a Builder against an existing project, maybe with `vercel dev` or actual deployment. You can avoid publishing every Builder change to npm by uploading the Builder as a tarball.
 1. Change directory to the desired Builder `cd ./packages/node`
-2. Run `yarn build` to compile typescript and other build steps
+2. Run `pnpm build` to compile typescript and other build steps
 3. Run `npm pack` to create a tarball file
 4. Run `vercel *.tgz` to upload the tarball file and get a URL
 5. Edit any existing `vercel.json` project and replace `use` with the URL

.github/workflows/cron-update-turbo.yml vendored Normal file

@@ -0,0 +1,27 @@
name: Cron Update Turbo
on:
  # Run every week https://crontab.guru/every-week
  schedule:
    - cron: '0 0 * * 0'
jobs:
  create-pull-request:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout
        uses: actions/checkout@v3
        # 0 means fetch all commits so we can commit and push in the script below
        with:
          fetch-depth: 0
      - name: install pnpm@7.26.0
        run: npm i -g pnpm@7.26.0
      - name: Create Pull Request
        uses: actions/github-script@v6
        env:
          NODE_AUTH_TOKEN: ${{ secrets.NPM_TOKEN }}
        # See https://github.com/actions/github-script#run-a-separate-file-with-an-async-function
        with:
          script: |
            const script = require('./utils/update-turbo.js')
            await script({ github, context })


@@ -39,7 +39,6 @@ jobs:
 - name: Setup Node
 if: ${{ steps.check-release.outputs.IS_RELEASE == 'true' }}
 uses: actions/setup-node@v3
-timeout-minutes: 5 # See https://github.com/actions/cache/issues/810
 with:
 node-version: 14
 - name: Cache
@@ -47,20 +46,24 @@ jobs:
 uses: actions/cache@v3
 with:
 path: '**/node_modules'
-key: yarn-${{ matrix.os }}-${{ matrix.node }}-${{ hashFiles('yarn.lock') }}
-restore-keys: yarn-${{ matrix.os }}-${{ matrix.node }}
+key: pnpm-${{ matrix.os }}-${{ matrix.node }}-${{ hashFiles('pnpm-lock.yaml') }}
+restore-keys: pnpm-${{ matrix.os }}-${{ matrix.node }}
+env:
+SEGMENT_DOWNLOAD_TIMEOUT_MINS: 5 # See https://github.com/actions/cache/issues/810
+- name: install pnpm@7.24.2
+run: npm i -g pnpm@7.24.2
 - name: Install
 if: ${{ steps.check-release.outputs.IS_RELEASE == 'true' }}
-run: yarn install --check-files --frozen-lockfile --network-timeout 1000000
+run: pnpm install
 - name: Build
 if: ${{ steps.check-release.outputs.IS_RELEASE == 'true' }}
-run: yarn build
+run: pnpm build
 env:
 GA_TRACKING_ID: ${{ secrets.GA_TRACKING_ID }}
 SENTRY_DSN: ${{ secrets.SENTRY_DSN }}
 - name: Publish
 if: ${{ steps.check-release.outputs.IS_RELEASE == 'true' }}
-run: yarn publish-from-github
+run: pnpm publish-from-github
 env:
 NPM_TOKEN: ${{ secrets.NPM_TOKEN_ELEVATED }}
 GA_TRACKING_ID: ${{ secrets.GA_TRACKING_ID }}


@@ -35,17 +35,20 @@ jobs:
 with:
 go-version: '1.18'
 - uses: actions/setup-node@v3
-timeout-minutes: 5 # See https://github.com/actions/cache/issues/810
 with:
 node-version: ${{ matrix.node }}
 - uses: actions/cache@v3
 with:
 path: '**/node_modules'
-key: yarn-${{ matrix.os }}-${{ matrix.node }}-${{ hashFiles('yarn.lock') }}
-restore-keys: yarn-${{ matrix.os }}-${{ matrix.node }}
-- run: yarn install --network-timeout 1000000 --frozen-lockfile
-- run: yarn run build
-- run: yarn test-integration-cli
+key: pnpm-${{ matrix.os }}-${{ matrix.node }}-${{ hashFiles('pnpm-lock.yaml') }}
+restore-keys: pnpm-${{ matrix.os }}-${{ matrix.node }}
+env:
+SEGMENT_DOWNLOAD_TIMEOUT_MINS: 5 # See https://github.com/actions/cache/issues/810
+- name: install pnpm@7.24.2
+run: npm i -g pnpm@7.24.2
+- run: pnpm install
+- run: pnpm run build
+- run: pnpm test-integration-cli
 env:
 VERCEL_TEST_TOKEN: ${{ secrets.VERCEL_TEST_TOKEN }}
 VERCEL_TEST_REGISTRATION_URL: ${{ secrets.VERCEL_TEST_REGISTRATION_URL }}


@@ -35,20 +35,23 @@ jobs:
 with:
 fetch-depth: 2
 - uses: actions/setup-node@v3
-timeout-minutes: 5 # See https://github.com/actions/cache/issues/810
 with:
 node-version: ${{ matrix.node }}
 - uses: actions/cache@v3
 with:
 path: '**/node_modules'
-key: yarn-${{ matrix.os }}-${{ matrix.node }}-${{ hashFiles('yarn.lock') }}
-restore-keys: yarn-${{ matrix.os }}-${{ matrix.node }}
-- run: yarn install --network-timeout 1000000 --frozen-lockfile
-- run: yarn run build
-- run: yarn run lint
+key: pnpm-${{ matrix.os }}-${{ matrix.node }}-${{ hashFiles('pnpm-lock.yaml') }}
+restore-keys: pnpm-${{ matrix.os }}-${{ matrix.node }}
+env:
+SEGMENT_DOWNLOAD_TIMEOUT_MINS: 5 # See https://github.com/actions/cache/issues/810
+- name: install pnpm@7.24.2
+run: npm i -g pnpm@7.24.2
+- run: pnpm install
+- run: pnpm run build
+- run: pnpm run lint
 if: matrix.os == 'ubuntu-latest' && matrix.node == 14 # only run lint once
-- run: yarn run test-unit
-- run: yarn workspace vercel run coverage
+- run: pnpm run test-unit
+- run: pnpm -C packages/cli run coverage
 if: matrix.os == 'ubuntu-latest' && matrix.node == 14 # only run coverage once
 env:
 CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}


@@ -33,15 +33,18 @@ jobs:
 with:
 go-version: '1.13.15'
 - uses: actions/setup-node@v3
-timeout-minutes: 5 # See https://github.com/actions/cache/issues/810
 with:
 node-version: ${{ env.NODE_VERSION }}
 - uses: actions/cache@v3
 with:
 path: '**/node_modules'
-key: yarn-${{ matrix.os }}-${{ matrix.node }}-${{ hashFiles('yarn.lock') }}
-restore-keys: yarn-${{ matrix.os }}-${{ matrix.node }}
-- run: yarn install --network-timeout 1000000 --frozen-lockfile
+key: pnpm-${{ matrix.os }}-${{ matrix.node }}-${{ hashFiles('pnpm-lock.yaml') }}
+restore-keys: pnpm-${{ matrix.os }}-${{ matrix.node }}
+env:
+SEGMENT_DOWNLOAD_TIMEOUT_MINS: 5 # See https://github.com/actions/cache/issues/810
+- name: install pnpm@7.24.2
+run: npm i -g pnpm@7.24.2
+- run: pnpm install
 - id: set-tests
 run: |
 TESTS_ARRAY=$(node utils/chunk-tests.js $SCRIPT_NAME)
@@ -74,20 +77,24 @@ jobs:
 with:
 go-version: '1.13.15'
 - uses: actions/setup-node@v3
-timeout-minutes: 5 # See https://github.com/actions/cache/issues/810
 with:
 node-version: ${{ env.NODE_VERSION }}
 - uses: actions/cache@v3
 with:
 path: '**/node_modules'
-key: yarn-${{ matrix.os }}-${{ matrix.node }}-${{ hashFiles('yarn.lock') }}
-restore-keys: yarn-${{ matrix.os }}-${{ matrix.node }}
+key: pnpm-${{ matrix.os }}-${{ matrix.node }}-${{ hashFiles('pnpm-lock.yaml') }}
+restore-keys: pnpm-${{ matrix.os }}-${{ matrix.node }}
+env:
+SEGMENT_DOWNLOAD_TIMEOUT_MINS: 5 # See https://github.com/actions/cache/issues/810
 - name: Install Hugo
 if: matrix.runner == 'macos-latest'
 run: curl -L -O https://github.com/gohugoio/hugo/releases/download/v0.56.0/hugo_0.56.0_macOS-64bit.tar.gz && tar -xzf hugo_0.56.0_macOS-64bit.tar.gz && mv ./hugo packages/cli/test/dev/fixtures/08-hugo/
-- run: yarn install --network-timeout 1000000
+- name: install pnpm@7.24.2
+run: npm i -g pnpm@7.24.2
+- run: pnpm install
 - name: Build ${{matrix.packageName}} and all its dependencies
 run: node utils/gen.js && node_modules/.bin/turbo run build --cache-dir=".turbo" --scope=${{matrix.packageName}} --include-dependencies --no-deps


@@ -1,4 +1,4 @@
 #!/bin/sh
 . "$(dirname "$0")/_/husky.sh"
-yarn pre-commit
+pnpm pre-commit

.npmrc Normal file

@@ -0,0 +1,5 @@
save-exact=true
hoist-pattern[]=!"**/@types/**"
hoist-pattern[]=!"**/typedoc"
hoist-pattern[]=!"**/typedoc-plugin-markdown"
hoist-pattern[]=!"**/typedoc-plugin-mdn-links"


@@ -1 +0,0 @@
save-prefix ""


@@ -33,9 +33,9 @@ For details on how to use Vercel, check out our [documentation](https://vercel.c
 ## Contributing
-This project uses [yarn](https://yarnpkg.com/) to install dependencies and run scripts.
+This project uses [pnpm](https://pnpm.io/) to install dependencies and run scripts.
-You can use the `dev` script to run local changes as if you were invoking Vercel CLI. For example, `vercel deploy --cwd=/path/to/project` could be run with local changes with `yarn dev deploy --cwd=/path/to/project`.
+You can use the `dev` script to run local changes as if you were invoking Vercel CLI. For example, `vercel deploy --cwd=/path/to/project` could be run with local changes with `pnpm dev deploy --cwd=/path/to/project`.
 See the [Contributing Guidelines](./.github/CONTRIBUTING.md) for more details.


@@ -13,6 +13,8 @@ function initSentry() {
 sentryInitDone = true;
 init({
+// Cannot figure out whats going wrong here. VSCode resolves this fine. But when we build it blows up.
+// @ts-ignore
 dsn: assertEnv('SENTRY_DSN'),
 environment: process.env.NODE_ENV || 'production',
 release: `${serviceName}`,


@@ -4,9 +4,7 @@
"version": "0.0.0", "version": "0.0.0",
"description": "API for the vercel/vercel repo", "description": "API for the vercel/vercel repo",
"main": "index.js", "main": "index.js",
"scripts": { "scripts": {},
"//TODO": "We should add this pkg to yarn workspaces"
},
"dependencies": { "dependencies": {
"@sentry/node": "5.11.1", "@sentry/node": "5.11.1",
"got": "10.2.1", "got": "10.2.1",
@@ -16,9 +14,9 @@
"unzip-stream": "0.3.0" "unzip-stream": "0.3.0"
}, },
"devDependencies": { "devDependencies": {
"@types/node": "14.18.33", "@types/node": "16.18.11",
"@types/node-fetch": "2.5.4", "@types/node-fetch": "2.5.4",
"@vercel/node": "1.9.0", "@vercel/node": "*",
"typescript": "3.9.6" "typescript": "4.3.4"
} }
} }


@@ -12,5 +12,5 @@
"resolveJsonModule": true, "resolveJsonModule": true,
"isolatedModules": true "isolatedModules": true
}, },
"include": ["examples", "frameworks.ts"] "include": ["examples", "frameworks.ts", "_lib"]
} }


@@ -1,524 +0,0 @@
# THIS IS AN AUTOGENERATED FILE. DO NOT EDIT THIS FILE DIRECTLY.
# yarn lockfile v1
"@sentry/apm@5.11.1":
version "5.11.1"
resolved "https://registry.yarnpkg.com/@sentry/apm/-/apm-5.11.1.tgz#cc89fa4150056fbf009f92eca94fccc3980db34e"
integrity sha512-4iZH11p/7w9IMLT9hqNY1+EqLESltiIoF6/YsbpK93sXWGEs8VQ83IuvGuKWxajvHgDmj4ND0TxIliTsYqTqFw==
dependencies:
"@sentry/browser" "5.11.1"
"@sentry/hub" "5.11.1"
"@sentry/minimal" "5.11.1"
"@sentry/types" "5.11.0"
"@sentry/utils" "5.11.1"
tslib "^1.9.3"
"@sentry/browser@5.11.1":
version "5.11.1"
resolved "https://registry.yarnpkg.com/@sentry/browser/-/browser-5.11.1.tgz#337ffcb52711b23064c847a07629e966f54a5ebb"
integrity sha512-oqOX/otmuP92DEGRyZeBuQokXdeT9HQRxH73oqIURXXNLMP3PWJALSb4HtT4AftEt/2ROGobZLuA4TaID6My/Q==
dependencies:
"@sentry/core" "5.11.1"
"@sentry/types" "5.11.0"
"@sentry/utils" "5.11.1"
tslib "^1.9.3"
"@sentry/core@5.11.1":
version "5.11.1"
resolved "https://registry.yarnpkg.com/@sentry/core/-/core-5.11.1.tgz#9e2da485e196ae32971545c1c49ee6fe719930e2"
integrity sha512-BpvPosVNT20Xso4gAV54Lu3KqDmD20vO63HYwbNdST5LUi8oYV4JhvOkoBraPEM2cbBwQvwVcFdeEYKk4tin9A==
dependencies:
"@sentry/hub" "5.11.1"
"@sentry/minimal" "5.11.1"
"@sentry/types" "5.11.0"
"@sentry/utils" "5.11.1"
tslib "^1.9.3"
"@sentry/hub@5.11.1":
version "5.11.1"
resolved "https://registry.yarnpkg.com/@sentry/hub/-/hub-5.11.1.tgz#ddcb865563fae53852d405885c46b4c6de68a91b"
integrity sha512-ucKprYCbGGLLjVz4hWUqHN9KH0WKUkGf5ZYfD8LUhksuobRkYVyig0ZGbshECZxW5jcDTzip4Q9Qimq/PkkXBg==
dependencies:
"@sentry/types" "5.11.0"
"@sentry/utils" "5.11.1"
tslib "^1.9.3"
"@sentry/minimal@5.11.1":
version "5.11.1"
resolved "https://registry.yarnpkg.com/@sentry/minimal/-/minimal-5.11.1.tgz#0e705d01a567282d8fbbda2aed848b4974cc3cec"
integrity sha512-HK8zs7Pgdq7DsbZQTThrhQPrJsVWzz7MaluAbQA0rTIAJ3TvHKQpsVRu17xDpjZXypqWcKCRsthDrC4LxDM1Bg==
dependencies:
"@sentry/hub" "5.11.1"
"@sentry/types" "5.11.0"
tslib "^1.9.3"
"@sentry/node@5.11.1":
version "5.11.1"
resolved "https://registry.yarnpkg.com/@sentry/node/-/node-5.11.1.tgz#2a9c18cd1209cfdf7a69b9d91303413149d2c910"
integrity sha512-FbJs0blJ36gEzE0rc2yBfA/KE+kXOLl8MUfFTcyJCBdCGF8XMETDCmgINnJ4TyBUJviwKoPw2TCk9TL2pa/A1w==
dependencies:
"@sentry/apm" "5.11.1"
"@sentry/core" "5.11.1"
"@sentry/hub" "5.11.1"
"@sentry/types" "5.11.0"
"@sentry/utils" "5.11.1"
cookie "^0.3.1"
https-proxy-agent "^4.0.0"
lru_map "^0.3.3"
tslib "^1.9.3"
"@sentry/types@5.11.0":
version "5.11.0"
resolved "https://registry.yarnpkg.com/@sentry/types/-/types-5.11.0.tgz#40f0f3174362928e033ddd9725d55e7c5cb7c5b6"
integrity sha512-1Uhycpmeo1ZK2GLvrtwZhTwIodJHcyIS6bn+t4IMkN9MFoo6ktbAfhvexBDW/IDtdLlCGJbfm8nIZerxy0QUpg==
"@sentry/utils@5.11.1":
version "5.11.1"
resolved "https://registry.yarnpkg.com/@sentry/utils/-/utils-5.11.1.tgz#aa19fcc234cf632257b2281261651d2fac967607"
integrity sha512-O0Zl4R2JJh8cTkQ8ZL2cDqGCmQdpA5VeXpuBbEl1v78LQPkBDISi35wH4mKmLwMsLBtTVpx2UeUHBj0KO5aLlA==
dependencies:
"@sentry/types" "5.11.0"
tslib "^1.9.3"
"@sindresorhus/is@^1.0.0":
version "1.2.0"
resolved "https://registry.yarnpkg.com/@sindresorhus/is/-/is-1.2.0.tgz#63ce3638cb85231f3704164c90a18ef816da3fb7"
integrity sha512-mwhXGkRV5dlvQc4EgPDxDxO6WuMBVymGFd1CA+2Y+z5dG9MNspoQ+AWjl/Ld1MnpCL8AKbosZlDVohqcIwuWsw==
"@szmarczak/http-timer@^4.0.0":
version "4.0.0"
resolved "https://registry.yarnpkg.com/@szmarczak/http-timer/-/http-timer-4.0.0.tgz#309789ccb7842ff1e41848cf43da587f78068836"
integrity sha512-3yoXv8OtGr/r3R5gaWWNQ3VUoQ5G3Gmo8DXX95V14ZVvE2b7Pj6Ide9uIDON8ym4D/ItyfL9ejohYUPqOyvRXw==
dependencies:
defer-to-connect "^1.1.1"
"@types/cacheable-request@^6.0.1":
version "6.0.1"
resolved "https://registry.yarnpkg.com/@types/cacheable-request/-/cacheable-request-6.0.1.tgz#5d22f3dded1fd3a84c0bbeb5039a7419c2c91976"
integrity sha512-ykFq2zmBGOCbpIXtoVbz4SKY5QriWPh3AjyU4G74RYbtt5yOc5OfaY75ftjg7mikMOla1CTGpX3lLbuJh8DTrQ==
dependencies:
"@types/http-cache-semantics" "*"
"@types/keyv" "*"
"@types/node" "*"
"@types/responselike" "*"
"@types/http-cache-semantics@*":
version "4.0.0"
resolved "https://registry.yarnpkg.com/@types/http-cache-semantics/-/http-cache-semantics-4.0.0.tgz#9140779736aa2655635ee756e2467d787cfe8a2a"
integrity sha512-c3Xy026kOF7QOTn00hbIllV1dLR9hG9NkSrLQgCVs8NF6sBU+VGWjD3wLPhmh1TYAc7ugCFsvHYMN4VcBN1U1A==
"@types/keyv@*":
version "3.1.1"
resolved "https://registry.yarnpkg.com/@types/keyv/-/keyv-3.1.1.tgz#e45a45324fca9dab716ab1230ee249c9fb52cfa7"
integrity sha512-MPtoySlAZQ37VoLaPcTHCu1RWJ4llDkULYZIzOYxlhxBqYPB0RsRlmMU0R6tahtFe27mIdkHV+551ZWV4PLmVw==
dependencies:
"@types/node" "*"
"@types/node-fetch@2.5.4":
version "2.5.4"
resolved "https://registry.yarnpkg.com/@types/node-fetch/-/node-fetch-2.5.4.tgz#5245b6d8841fc3a6208b82291119bc11c4e0ce44"
integrity sha512-Oz6id++2qAOFuOlE1j0ouk1dzl3mmI1+qINPNBhi9nt/gVOz0G+13Ao6qjhdF0Ys+eOkhu6JnFmt38bR3H0POQ==
dependencies:
"@types/node" "*"
"@types/node@*", "@types/node@13.1.4":
version "13.1.4"
resolved "https://registry.yarnpkg.com/@types/node/-/node-13.1.4.tgz#4cfd90175a200ee9b02bd6b1cd19bc349741607e"
integrity sha512-Lue/mlp2egZJoHXZr4LndxDAd7i/7SQYhV0EjWfb/a4/OZ6tuVwMCVPiwkU5nsEipxEf7hmkSU7Em5VQ8P5NGA==
"@types/responselike@*":
version "1.0.0"
resolved "https://registry.yarnpkg.com/@types/responselike/-/responselike-1.0.0.tgz#251f4fe7d154d2bad125abe1b429b23afd262e29"
integrity sha512-85Y2BjiufFzaMIlvJDvTTB8Fxl2xfLo4HgmHzVBz08w4wDePCTjYw66PdrolO0kzli3yam/YCgRufyo1DdQVTA==
dependencies:
"@types/node" "*"
"@vercel/node@1.9.0":
version "1.9.0"
resolved "https://registry.yarnpkg.com/@vercel/node/-/node-1.9.0.tgz#6b64f3b9a962ddb1089276fad00f441a1f4b9cf0"
integrity sha512-Vk/ZpuY4Cdc8oUwBi/kf8qETRaJb/KYdFddVkLuS10QwA0yJx+RQ11trhZ1KFUdc27aBr5S2k8/dDxK8sLr+IA==
dependencies:
"@types/node" "*"
ts-node "8.9.1"
typescript "3.9.3"
agent-base@5:
version "5.1.1"
resolved "https://registry.yarnpkg.com/agent-base/-/agent-base-5.1.1.tgz#e8fb3f242959db44d63be665db7a8e739537a32c"
integrity sha512-TMeqbNl2fMW0nMjTEPOwe3J/PRFP4vqeoNuQMG0HlMrtm5QxKqdvAkZ1pRBQ/ulIyDD5Yq0nJ7YbdD8ey0TO3g==
arg@^4.1.0:
version "4.1.3"
resolved "https://registry.yarnpkg.com/arg/-/arg-4.1.3.tgz#269fc7ad5b8e42cb63c896d5666017261c144089"
integrity sha512-58S9QDqG0Xx27YwPSt9fJxivjYl432YCwfDMfZ+71RAqUrZef7LrKQZ3LHLOwCS4FLNBplP533Zx895SeOCHvA==
binary@^0.3.0:
version "0.3.0"
resolved "https://registry.yarnpkg.com/binary/-/binary-0.3.0.tgz#9f60553bc5ce8c3386f3b553cff47462adecaa79"
integrity sha1-n2BVO8XOjDOG87VTz/R0Yq3sqnk=
dependencies:
buffers "~0.1.1"
chainsaw "~0.1.0"
bl@^3.0.0:
version "3.0.0"
resolved "https://registry.yarnpkg.com/bl/-/bl-3.0.0.tgz#3611ec00579fd18561754360b21e9f784500ff88"
integrity sha512-EUAyP5UHU5hxF8BPT0LKW8gjYLhq1DQIcneOX/pL/m2Alo+OYDQAJlHq+yseMP50Os2nHXOSic6Ss3vSQeyf4A==
dependencies:
readable-stream "^3.0.1"
buffer-from@^1.0.0:
version "1.1.1"
resolved "https://registry.yarnpkg.com/buffer-from/-/buffer-from-1.1.1.tgz#32713bc028f75c02fdb710d7c7bcec1f2c6070ef"
integrity sha512-MQcXEUbCKtEo7bhqEs6560Hyd4XaovZlO/k9V3hjVUF/zwW7KBVdSK4gIt/bzwS9MbR5qob+F5jusZsb0YQK2A==
buffers@~0.1.1:
version "0.1.1"
resolved "https://registry.yarnpkg.com/buffers/-/buffers-0.1.1.tgz#b24579c3bed4d6d396aeee6d9a8ae7f5482ab7bb"
integrity sha1-skV5w77U1tOWru5tmorn9Ugqt7s=
cacheable-lookup@^0.2.1:
version "0.2.1"
resolved "https://registry.yarnpkg.com/cacheable-lookup/-/cacheable-lookup-0.2.1.tgz#f474ae2c686667d7ea08c43409ad31b2b31b26c2"
integrity sha512-BQ8MRjxJASEq2q+w0SusPU3B054gS278K8sj58QCLMZIso5qG05+MdCdmXxuyVlfvI8h4bPsNOavVUauVCGxrg==
dependencies:
keyv "^3.1.0"
cacheable-request@^7.0.0:
version "7.0.0"
resolved "https://registry.yarnpkg.com/cacheable-request/-/cacheable-request-7.0.0.tgz#12421aa084e943ec81eac8c93e56af90c624788a"
integrity sha512-UVG4gMn3WjnAeFBBx7RFoprgOANIAkMwN5Dta6ONmfSwrCxfm0Ip7g0mIBxIRJZX9aDsoID0Ry3dU5Pr0csKKA==
dependencies:
clone-response "^1.0.2"
get-stream "^5.1.0"
http-cache-semantics "^4.0.0"
keyv "^3.0.0"
lowercase-keys "^2.0.0"
normalize-url "^4.1.0"
responselike "^2.0.0"
chainsaw@~0.1.0:
version "0.1.0"
resolved "https://registry.yarnpkg.com/chainsaw/-/chainsaw-0.1.0.tgz#5eab50b28afe58074d0d58291388828b5e5fbc98"
integrity sha1-XqtQsor+WAdNDVgpE4iCi15fvJg=
dependencies:
traverse ">=0.3.0 <0.4"
chownr@^1.1.1:
version "1.1.3"
resolved "https://registry.yarnpkg.com/chownr/-/chownr-1.1.3.tgz#42d837d5239688d55f303003a508230fa6727142"
integrity sha512-i70fVHhmV3DtTl6nqvZOnIjbY0Pe4kAUjwHj8z0zAdgBtYrJyYwLKCCuRBQ5ppkyL0AkN7HKRnETdmdp1zqNXw==
clone-response@^1.0.2:
version "1.0.2"
resolved "https://registry.yarnpkg.com/clone-response/-/clone-response-1.0.2.tgz#d1dc973920314df67fbeb94223b4ee350239e96b"
integrity sha1-0dyXOSAxTfZ/vrlCI7TuNQI56Ws=
dependencies:
mimic-response "^1.0.0"
cookie@^0.3.1:
version "0.3.1"
resolved "https://registry.yarnpkg.com/cookie/-/cookie-0.3.1.tgz#e7e0a1f9ef43b4c8ba925c5c5a96e806d16873bb"
integrity sha1-5+Ch+e9DtMi6klxcWpboBtFoc7s=
debug@4:
version "4.1.1"
resolved "https://registry.yarnpkg.com/debug/-/debug-4.1.1.tgz#3b72260255109c6b589cee050f1d516139664791"
integrity sha512-pYAIzeRo8J6KPEaJ0VWOh5Pzkbw/RetuzehGM7QRRX5he4fPHx2rdKMB256ehJCkX+XRQm16eZLqLNS8RSZXZw==
dependencies:
ms "^2.1.1"
decompress-response@^5.0.0:
version "5.0.0"
resolved "https://registry.yarnpkg.com/decompress-response/-/decompress-response-5.0.0.tgz#7849396e80e3d1eba8cb2f75ef4930f76461cb0f"
integrity sha512-TLZWWybuxWgoW7Lykv+gq9xvzOsUjQ9tF09Tj6NSTYGMTCHNXzrPnD6Hi+TgZq19PyTAGH4Ll/NIM/eTGglnMw==
dependencies:
mimic-response "^2.0.0"
defer-to-connect@^1.1.1:
version "1.1.1"
resolved "https://registry.yarnpkg.com/defer-to-connect/-/defer-to-connect-1.1.1.tgz#88ae694b93f67b81815a2c8c769aef6574ac8f2f"
integrity sha512-J7thop4u3mRTkYRQ+Vpfwy2G5Ehoy82I14+14W4YMDLKdWloI9gSzRbV30s/NckQGVJtPkWNcW4oMAUigTdqiQ==
diff@^4.0.1:
version "4.0.2"
resolved "https://registry.yarnpkg.com/diff/-/diff-4.0.2.tgz#60f3aecb89d5fae520c11aa19efc2bb982aade7d"
integrity sha512-58lmxKSA4BNyLz+HHMUzlOEpg09FV+ev6ZMe3vJihgdxzgcwZ8VoEEPmALCZG9LmqfVoNMMKpttIYTVG6uDY7A==
duplexer3@^0.1.4:
version "0.1.4"
resolved "https://registry.yarnpkg.com/duplexer3/-/duplexer3-0.1.4.tgz#ee01dd1cac0ed3cbc7fdbea37dc0a8f1ce002ce2"
integrity sha1-7gHdHKwO08vH/b6jfcCo8c4ALOI=
end-of-stream@^1.1.0, end-of-stream@^1.4.1:
version "1.4.4"
resolved "https://registry.yarnpkg.com/end-of-stream/-/end-of-stream-1.4.4.tgz#5ae64a5f45057baf3626ec14da0ca5e4b2431eb0"
integrity sha512-+uw1inIHVPQoaVuHzRyXd21icM+cnt4CzD5rW+NC1wjOUSTOs+Te7FOv7AhN7vS9x/oIyhLP5PR1H+phQAHu5Q==
dependencies:
once "^1.4.0"
fs-constants@^1.0.0:
version "1.0.0"
resolved "https://registry.yarnpkg.com/fs-constants/-/fs-constants-1.0.0.tgz#6be0de9be998ce16af8afc24497b9ee9b7ccd9ad"
integrity sha512-y6OAwoSIf7FyjMIv94u+b5rdheZEjzR63GTyZJm5qh4Bi+2YgwLCcI/fPFZkL5PSixOt6ZNKm+w+Hfp/Bciwow==
get-stream@^5.0.0, get-stream@^5.1.0:
version "5.1.0"
resolved "https://registry.yarnpkg.com/get-stream/-/get-stream-5.1.0.tgz#01203cdc92597f9b909067c3e656cc1f4d3c4dc9"
integrity sha512-EXr1FOzrzTfGeL0gQdeFEvOMm2mzMOglyiOXSTpPC+iAjAKftbr3jpCMWynogwYnM+eSj9sHGc6wjIcDvYiygw==
dependencies:
pump "^3.0.0"
got@10.2.1:
version "10.2.1"
resolved "https://registry.yarnpkg.com/got/-/got-10.2.1.tgz#7087485482fb31aa6e6399fd493dd04639da117b"
integrity sha512-IQX//hGm5oLjUj743GJG30U2RzjS58ZlhQQjwQXjsyR50TTD+etVMHlMEbNxYJGWVFa0ASgDVhRkAvQPe6M9iQ==
dependencies:
"@sindresorhus/is" "^1.0.0"
"@szmarczak/http-timer" "^4.0.0"
"@types/cacheable-request" "^6.0.1"
cacheable-lookup "^0.2.1"
cacheable-request "^7.0.0"
decompress-response "^5.0.0"
duplexer3 "^0.1.4"
get-stream "^5.0.0"
lowercase-keys "^2.0.0"
mimic-response "^2.0.0"
p-cancelable "^2.0.0"
responselike "^2.0.0"
to-readable-stream "^2.0.0"
type-fest "^0.8.0"
http-cache-semantics@^4.0.0:
version "4.0.3"
resolved "https://registry.yarnpkg.com/http-cache-semantics/-/http-cache-semantics-4.0.3.tgz#495704773277eeef6e43f9ab2c2c7d259dda25c5"
integrity sha512-TcIMG3qeVLgDr1TEd2XvHaTnMPwYQUQMIBLy+5pLSDKYFc7UIqj39w8EGzZkaxoLv/l2K8HaI0t5AVA+YYgUew==
https-proxy-agent@^4.0.0:
version "4.0.0"
resolved "https://registry.yarnpkg.com/https-proxy-agent/-/https-proxy-agent-4.0.0.tgz#702b71fb5520a132a66de1f67541d9e62154d82b"
integrity sha512-zoDhWrkR3of1l9QAL8/scJZyLu8j/gBkcwcaQOZh7Gyh/+uJQzGVETdgT30akuwkpL8HTRfssqI3BZuV18teDg==
dependencies:
agent-base "5"
debug "4"
inherits@^2.0.3:
version "2.0.4"
resolved "https://registry.yarnpkg.com/inherits/-/inherits-2.0.4.tgz#0fa2c64f932917c3433a0ded55363aae37416b7c"
integrity sha512-k/vGaX4/Yla3WzyMCvTQOXYeIHvqOKtnqBduzTHpzpQZzAskKMhZ2K+EnBiSM9zGSoIFeMpXKxa4dYeZIQqewQ==
json-buffer@3.0.0:
version "3.0.0"
resolved "https://registry.yarnpkg.com/json-buffer/-/json-buffer-3.0.0.tgz#5b1f397afc75d677bde8bcfc0e47e1f9a3d9a898"
integrity sha1-Wx85evx11ne96Lz8Dkfh+aPZqJg=
keyv@^3.0.0, keyv@^3.1.0:
version "3.1.0"
resolved "https://registry.yarnpkg.com/keyv/-/keyv-3.1.0.tgz#ecc228486f69991e49e9476485a5be1e8fc5c4d9"
integrity sha512-9ykJ/46SN/9KPM/sichzQ7OvXyGDYKGTaDlKMGCAlg2UK8KRy4jb0d8sFc+0Tt0YYnThq8X2RZgCg74RPxgcVA==
dependencies:
json-buffer "3.0.0"
lowercase-keys@^2.0.0:
version "2.0.0"
resolved "https://registry.yarnpkg.com/lowercase-keys/-/lowercase-keys-2.0.0.tgz#2603e78b7b4b0006cbca2fbcc8a3202558ac9479"
integrity sha512-tqNXrS78oMOE73NMxK4EMLQsQowWf8jKooH9g7xPavRT706R6bkQJ6DY2Te7QukaZsulxa30wQ7bk0pm4XiHmA==
lru_map@^0.3.3:
version "0.3.3"
resolved "https://registry.yarnpkg.com/lru_map/-/lru_map-0.3.3.tgz#b5c8351b9464cbd750335a79650a0ec0e56118dd"
integrity sha1-tcg1G5Rky9dQM1p5ZQoOwOVhGN0=
make-error@^1.1.1:
version "1.3.6"
resolved "https://registry.yarnpkg.com/make-error/-/make-error-1.3.6.tgz#2eb2e37ea9b67c4891f684a1394799af484cf7a2"
integrity sha512-s8UhlNe7vPKomQhC1qFelMokr/Sc3AgNbso3n74mVPA5LTZwkB9NlXf4XPamLxJE8h0gh73rM94xvwRT2CVInw==
mimic-response@^1.0.0:
version "1.0.1"
resolved "https://registry.yarnpkg.com/mimic-response/-/mimic-response-1.0.1.tgz#4923538878eef42063cb8a3e3b0798781487ab1b"
integrity sha512-j5EctnkH7amfV/q5Hgmoal1g2QHFJRraOtmx0JpIqkxhBhI/lJSl1nMpQ45hVarwNETOoWEimndZ4QK0RHxuxQ==
mimic-response@^2.0.0:
version "2.0.0"
resolved "https://registry.yarnpkg.com/mimic-response/-/mimic-response-2.0.0.tgz#996a51c60adf12cb8a87d7fb8ef24c2f3d5ebb46"
integrity sha512-8ilDoEapqA4uQ3TwS0jakGONKXVJqpy+RpM+3b7pLdOjghCrEiGp9SRkFbUHAmZW9vdnrENWHjaweIoTIJExSQ==
minimist@0.0.8:
version "0.0.8"
resolved "https://registry.yarnpkg.com/minimist/-/minimist-0.0.8.tgz#857fcabfc3397d2625b8228262e86aa7a011b05d"
integrity sha1-hX/Kv8M5fSYluCKCYuhqp6ARsF0=
mkdirp@^0.5.1:
version "0.5.1"
resolved "https://registry.yarnpkg.com/mkdirp/-/mkdirp-0.5.1.tgz#30057438eac6cf7f8c4767f38648d6697d75c903"
integrity sha1-MAV0OOrGz3+MR2fzhkjWaX11yQM=
dependencies:
minimist "0.0.8"
ms@^2.1.1:
version "2.1.2"
resolved "https://registry.yarnpkg.com/ms/-/ms-2.1.2.tgz#d09d1f357b443f493382a8eb3ccd183872ae6009"
integrity sha512-sGkPx+VjMtmA6MX27oA4FBFELFCZZ4S4XqeGOXCv68tT+jb3vk/RyaKWP0PTKyWtmLSM0b+adUTEvbs1PEaH2w==
node-fetch@2.6.1:
version "2.6.1"
resolved "https://registry.yarnpkg.com/node-fetch/-/node-fetch-2.6.1.tgz#045bd323631f76ed2e2b55573394416b639a0052"
integrity sha512-V4aYg89jEoVRxRb2fJdAg8FHvI7cEyYdVAh94HH0UIK8oJxUfkjlDQN9RbMx+bEjP7+ggMiFRprSti032Oipxw==
normalize-url@^4.1.0:
version "4.5.0"
resolved "https://registry.yarnpkg.com/normalize-url/-/normalize-url-4.5.0.tgz#453354087e6ca96957bd8f5baf753f5982142129"
integrity sha512-2s47yzUxdexf1OhyRi4Em83iQk0aPvwTddtFz4hnSSw9dCEsLEGf6SwIO8ss/19S9iBb5sJaOuTvTGDeZI00BQ==
once@^1.3.1, once@^1.4.0:
version "1.4.0"
resolved "https://registry.yarnpkg.com/once/-/once-1.4.0.tgz#583b1aa775961d4b113ac17d9c50baef9dd76bd1"
integrity sha1-WDsap3WWHUsROsF9nFC6753Xa9E=
dependencies:
wrappy "1"
p-cancelable@^2.0.0:
version "2.0.0"
resolved "https://registry.yarnpkg.com/p-cancelable/-/p-cancelable-2.0.0.tgz#4a3740f5bdaf5ed5d7c3e34882c6fb5d6b266a6e"
integrity sha512-wvPXDmbMmu2ksjkB4Z3nZWTSkJEb9lqVdMaCKpZUGJG9TMiNp9XcbG3fn9fPKjem04fJMJnXoyFPk2FmgiaiNg==
parse-github-url@1.0.2:
version "1.0.2"
resolved "https://registry.yarnpkg.com/parse-github-url/-/parse-github-url-1.0.2.tgz#242d3b65cbcdda14bb50439e3242acf6971db395"
integrity sha512-kgBf6avCbO3Cn6+RnzRGLkUsv4ZVqv/VfAYkRsyBcgkshNvVBkRn1FEZcW0Jb+npXQWm2vHPnnOqFteZxRRGNw==
pump@^3.0.0:
version "3.0.0"
resolved "https://registry.yarnpkg.com/pump/-/pump-3.0.0.tgz#b4a2116815bde2f4e1ea602354e8c75565107a64"
integrity sha512-LwZy+p3SFs1Pytd/jYct4wpv49HiYCqd9Rlc5ZVdk0V+8Yzv6jR5Blk3TRmPL1ft69TxP0IMZGJ+WPFU2BFhww==
dependencies:
end-of-stream "^1.1.0"
once "^1.3.1"
readable-stream@^3.0.1, readable-stream@^3.1.1:
version "3.4.0"
resolved "https://registry.yarnpkg.com/readable-stream/-/readable-stream-3.4.0.tgz#a51c26754658e0a3c21dbf59163bd45ba6f447fc"
integrity sha512-jItXPLmrSR8jmTRmRWJXCnGJsfy85mB3Wd/uINMXA65yrnFo0cPClFIUWzo2najVNSl+mx7/4W8ttlLWJe99pQ==
dependencies:
inherits "^2.0.3"
string_decoder "^1.1.1"
util-deprecate "^1.0.1"
responselike@^2.0.0:
version "2.0.0"
resolved "https://registry.yarnpkg.com/responselike/-/responselike-2.0.0.tgz#26391bcc3174f750f9a79eacc40a12a5c42d7723"
integrity sha512-xH48u3FTB9VsZw7R+vvgaKeLKzT6jOogbQhEe/jewwnZgzPcnyWui2Av6JpoYZF/91uueC+lqhWqeURw5/qhCw==
dependencies:
lowercase-keys "^2.0.0"
safe-buffer@~5.2.0:
version "5.2.0"
resolved "https://registry.yarnpkg.com/safe-buffer/-/safe-buffer-5.2.0.tgz#b74daec49b1148f88c64b68d49b1e815c1f2f519"
integrity sha512-fZEwUGbVl7kouZs1jCdMLdt95hdIv0ZeHg6L7qPeciMZhZ+/gdesW4wgTARkrFWEpspjEATAzUGPG8N2jJiwbg==
source-map-support@^0.5.17:
version "0.5.19"
resolved "https://registry.yarnpkg.com/source-map-support/-/source-map-support-0.5.19.tgz#a98b62f86dcaf4f67399648c085291ab9e8fed61"
integrity sha512-Wonm7zOCIJzBGQdB+thsPar0kYuCIzYvxZwlBa87yi/Mdjv7Tip2cyVbLj5o0cFPN4EVkuTwb3GDDyUx2DGnGw==
dependencies:
buffer-from "^1.0.0"
source-map "^0.6.0"
source-map@^0.6.0:
version "0.6.1"
resolved "https://registry.yarnpkg.com/source-map/-/source-map-0.6.1.tgz#74722af32e9614e9c287a8d0bbde48b5e2f1a263"
integrity sha512-UjgapumWlbMhkBgzT7Ykc5YXUT46F0iKu8SGXq0bcwP5dz/h0Plj6enJqjz1Zbq2l5WaqYnrVbwWOWMyF3F47g==
string_decoder@^1.1.1:
version "1.3.0"
resolved "https://registry.yarnpkg.com/string_decoder/-/string_decoder-1.3.0.tgz#42f114594a46cf1a8e30b0a84f56c78c3edac21e"
integrity sha512-hkRX8U1WjJFd8LsDJ2yQ/wWWxaopEsABU1XfkM8A+j0+85JAGppt16cr1Whg6KIbb4okU6Mql6BOj+uup/wKeA==
dependencies:
safe-buffer "~5.2.0"
tar-fs@2.0.0:
version "2.0.0"
resolved "https://registry.yarnpkg.com/tar-fs/-/tar-fs-2.0.0.tgz#677700fc0c8b337a78bee3623fdc235f21d7afad"
integrity sha512-vaY0obB6Om/fso8a8vakQBzwholQ7v5+uy+tF3Ozvxv1KNezmVQAiWtcNmMHFSFPqL3dJA8ha6gdtFbfX9mcxA==
dependencies:
chownr "^1.1.1"
mkdirp "^0.5.1"
pump "^3.0.0"
tar-stream "^2.0.0"
tar-stream@^2.0.0:
version "2.1.0"
resolved "https://registry.yarnpkg.com/tar-stream/-/tar-stream-2.1.0.tgz#d1aaa3661f05b38b5acc9b7020efdca5179a2cc3"
integrity sha512-+DAn4Nb4+gz6WZigRzKEZl1QuJVOLtAwwF+WUxy1fJ6X63CaGaUAxJRD2KEn1OMfcbCjySTYpNC6WmfQoIEOdw==
dependencies:
bl "^3.0.0"
end-of-stream "^1.4.1"
fs-constants "^1.0.0"
inherits "^2.0.3"
readable-stream "^3.1.1"
to-readable-stream@^2.0.0:
version "2.1.0"
resolved "https://registry.yarnpkg.com/to-readable-stream/-/to-readable-stream-2.1.0.tgz#82880316121bea662cdc226adb30addb50cb06e8"
integrity sha512-o3Qa6DGg1CEXshSdvWNX2sN4QHqg03SPq7U6jPXRahlQdl5dK8oXjkU/2/sGrnOZKeGV1zLSO8qPwyKklPPE7w==
"traverse@>=0.3.0 <0.4":
version "0.3.9"
resolved "https://registry.yarnpkg.com/traverse/-/traverse-0.3.9.tgz#717b8f220cc0bb7b44e40514c22b2e8bbc70d8b9"
integrity sha1-cXuPIgzAu3tE5AUUwisui7xw2Lk=
ts-node@8.9.1:
version "8.9.1"
resolved "https://registry.yarnpkg.com/ts-node/-/ts-node-8.9.1.tgz#2f857f46c47e91dcd28a14e052482eb14cfd65a5"
integrity sha512-yrq6ODsxEFTLz0R3BX2myf0WBCSQh9A+py8PBo1dCzWIOcvisbyH6akNKqDHMgXePF2kir5mm5JXJTH3OUJYOQ==
dependencies:
arg "^4.1.0"
diff "^4.0.1"
make-error "^1.1.1"
source-map-support "^0.5.17"
yn "3.1.1"
tslib@^1.9.3:
version "1.10.0"
resolved "https://registry.yarnpkg.com/tslib/-/tslib-1.10.0.tgz#c3c19f95973fb0a62973fb09d90d961ee43e5c8a"
integrity sha512-qOebF53frne81cf0S9B41ByenJ3/IuH8yJKngAX35CmiZySA0khhkovshKK+jGCaMnVomla7gVlIcc3EvKPbTQ==
type-fest@^0.8.0:
version "0.8.1"
resolved "https://registry.yarnpkg.com/type-fest/-/type-fest-0.8.1.tgz#09e249ebde851d3b1e48d27c105444667f17b83d"
integrity sha512-4dbzIzqvjtgiM5rw1k5rEHtBANKmdudhGyBEajN01fEyhaAIhsoKNy6y7+IN93IfpFtwY9iqi7kD+xwKhQsNJA==
typescript@3.9.3:
version "3.9.3"
resolved "https://registry.yarnpkg.com/typescript/-/typescript-3.9.3.tgz#d3ac8883a97c26139e42df5e93eeece33d610b8a"
integrity sha512-D/wqnB2xzNFIcoBG9FG8cXRDjiqSTbG2wd8DMZeQyJlP1vfTkIxH4GKveWaEBYySKIg+USu+E+EDIR47SqnaMQ==
typescript@3.9.6:
version "3.9.6"
resolved "https://registry.yarnpkg.com/typescript/-/typescript-3.9.6.tgz#8f3e0198a34c3ae17091b35571d3afd31999365a"
integrity sha512-Pspx3oKAPJtjNwE92YS05HQoY7z2SFyOpHo9MqJor3BXAGNaPUs83CuVp9VISFkSjyRfiTpmKuAYGJB7S7hOxw==
unzip-stream@0.3.0:
version "0.3.0"
resolved "https://registry.yarnpkg.com/unzip-stream/-/unzip-stream-0.3.0.tgz#c30c054cd6b0d64b13a23cd3ece911eb0b2b52d8"
integrity sha512-NG1h/MdGIX3HzyqMjyj1laBCmlPYhcO4xEy7gEqqzGiSLw7XqDQCnY4nYSn5XSaH8mQ6TFkaujrO8d/PIZN85A==
dependencies:
binary "^0.3.0"
mkdirp "^0.5.1"
util-deprecate@^1.0.1:
version "1.0.2"
resolved "https://registry.yarnpkg.com/util-deprecate/-/util-deprecate-1.0.2.tgz#450d4dc9fa70de732762fbd2d4a28981419a0ccf"
integrity sha1-RQ1Nyfpw3nMnYvvS1KKJgUGaDM8=
wrappy@1:
version "1.0.2"
resolved "https://registry.yarnpkg.com/wrappy/-/wrappy-1.0.2.tgz#b5243d8f3ec1aa35f1364605bc0d1036e30ab69f"
integrity sha1-tSQ9jz7BqjXxNkYFvA0QNuMKtp8=
yn@3.1.1:
version "3.1.1"
resolved "https://registry.yarnpkg.com/yn/-/yn-3.1.1.tgz#1e87401a09d767c1d5eab26a6e4c185182d2eb50"
integrity sha512-Ux4ygGWsu2c7isFWe8Yu1YluJmqVhxqK2cLXNQA5AcC3QfbGNpM7fu0Y8b/z16pXLnFxZYvWhd3fhBY9DLmC6Q==

View File

@@ -1,8 +1,8 @@
# `@now/next` Legacy Mode
# `@vercel/next` Legacy Mode
#### Why This Warning Occurred
`@now/next` has two modes: `legacy` and `serverless`. You will always want to use the `serverless` mode. `legacy` is to provide backwards compatibility with previous `@now/next` versions.
`@vercel/next` has two modes: `legacy` and `serverless`. You will always want to use the `serverless` mode. `legacy` is to provide backwards compatibility with previous `@vercel/next` versions.
The differences:
@@ -63,7 +63,7 @@ module.exports = {
```js
{
"version": 2,
"builds": [{ "src": "package.json", "use": "@now/next" }]
"builds": [{ "src": "package.json", "use": "@vercel/next" }]
}
```

View File

@@ -1,4 +1,4 @@
# `@now/next` No Serverless Pages Built
# `@vercel/next` No Serverless Pages Built
#### Why This Error Occurred
@@ -33,13 +33,13 @@ module.exports = {
};
```
4. Remove `distDir` from `next.config.js` as `@now/next` can't parse this file and expects your build output at `/.next`
4. Remove `distDir` from `next.config.js` as `@vercel/next` can't parse this file and expects your build output at `/.next`
5. Optionally make sure the `"src"` in `"builds"` points to your application `package.json`
```js
{
"version": 2,
"builds": [{ "src": "package.json", "use": "@now/next" }]
"builds": [{ "src": "package.json", "use": "@vercel/next" }]
}
```

View File

@@ -2,7 +2,7 @@
This directory is a brief example of a [Gatsby](https://www.gatsbyjs.org/) app that can be deployed to Vercel with zero configuration.
> **Note:** We do not currently support some Gatsby v5 features, including API Routes and DSG. We are actively working on adding support for these features.
> **Note:** SSR, DSG, and API Routes [are now supported](https://vercel.com/changelog/improved-support-for-gatsby-sites). We do not currently support some Gatsby v5 features, including Partial Hydration and the Slice API.
## Deploy Your Own

View File

@@ -9,5 +9,8 @@
}, },
"dependencies": { "dependencies": {
"gridsome": "0.7.23" "gridsome": "0.7.23"
},
"engines": {
"node": "<17"
} }
} }

View File

@@ -8,6 +8,8 @@ First, run the development server:
npm run dev
# or
yarn dev
# or
pnpm dev
```
Open [http://localhost:3000](http://localhost:3000) with your browser to see the result.

View File

@@ -0,0 +1,8 @@
{
"compilerOptions": {
"baseUrl": ".",
"paths": {
"@/*": ["./*"]
}
}
}

File diff suppressed because it is too large

View File

@@ -9,10 +9,10 @@
"lint": "next lint" "lint": "next lint"
}, },
"dependencies": { "dependencies": {
"@next/font": "13.1.1", "@next/font": "13.1.5",
"eslint": "8.30.0", "eslint": "8.32.0",
"eslint-config-next": "13.1.1", "eslint-config-next": "13.1.5",
"next": "13.1.1", "next": "13.1.5",
"react": "18.2.0", "react": "18.2.0",
"react-dom": "18.2.0" "react-dom": "18.2.0"
} }

View File

@@ -1,4 +1,4 @@
import '../styles/globals.css' import '@/styles/globals.css'
export default function App({ Component, pageProps }) { export default function App({ Component, pageProps }) {
return <Component {...pageProps} /> return <Component {...pageProps} />

View File

@@ -1,7 +1,7 @@
import Head from 'next/head' import Head from 'next/head'
import Image from 'next/image' import Image from 'next/image'
import { Inter } from '@next/font/google' import { Inter } from '@next/font/google'
import styles from '../styles/Home.module.css' import styles from '@/styles/Home.module.css'
const inter = Inter({ subsets: ['latin'] }) const inter = Inter({ subsets: ['latin'] })

View File

@@ -7,6 +7,6 @@
"build": "parcel build" "build": "parcel build"
}, },
"devDependencies": { "devDependencies": {
"parcel": "^2.0.0" "parcel": "^2.8.3"
} }
} }

File diff suppressed because it is too large

View File

@@ -10,7 +10,7 @@
"@remix-run/react": "^1.7.6", "@remix-run/react": "^1.7.6",
"@remix-run/vercel": "^1.7.6", "@remix-run/vercel": "^1.7.6",
"@vercel/analytics": "^0.1.5", "@vercel/analytics": "^0.1.5",
"@vercel/node": "^2.6.3", "@vercel/node": "^2.7.0",
"react": "^18.2.0", "react": "^18.2.0",
"react-dom": "^18.2.0" "react-dom": "^18.2.0"
}, },
@@ -20,7 +20,7 @@
"@remix-run/serve": "^1.7.6", "@remix-run/serve": "^1.7.6",
"@types/react": "^18.0.25", "@types/react": "^18.0.25",
"@types/react-dom": "^18.0.9", "@types/react-dom": "^18.0.9",
"eslint": "^8.27.0", "eslint": "^8.28.0",
"typescript": "^4.9.3" "typescript": "^4.9.3"
}, },
"engines": { "engines": {

File diff suppressed because it is too large

View File

@@ -12,7 +12,7 @@
}, },
"devDependencies": { "devDependencies": {
"@sveltejs/adapter-auto": "next", "@sveltejs/adapter-auto": "next",
"@sveltejs/kit": "next", "@sveltejs/kit": "1.0.0-next.589",
"@types/cookie": "^0.5.1", "@types/cookie": "^0.5.1",
"prettier": "^2.6.2", "prettier": "^2.6.2",
"prettier-plugin-svelte": "^2.7.0", "prettier-plugin-svelte": "^2.7.0",

View File

@@ -1,5 +1,5 @@
{ {
"npmClient": "yarn", "npmClient": "pnpm",
"useWorkspaces": true, "useWorkspaces": true,
"packages": ["packages/*"], "packages": ["packages/*"],
"command": { "command": {

View File

@@ -3,30 +3,25 @@
"version": "0.0.0", "version": "0.0.0",
"private": true, "private": true,
"license": "Apache-2.0", "license": "Apache-2.0",
"packageManager": "yarn@1.22.19", "packageManager": "pnpm@7.24.2",
"workspaces": {
"packages": [
"packages/*"
],
"nohoist": [
"**/@types/**",
"**/typedoc",
"**/typedoc-plugin-markdown",
"**/typedoc-plugin-mdn-links"
]
},
"dependencies": { "dependencies": {
"lerna": "3.16.4" "lerna": "5.6.2"
}, },
"devDependencies": { "devDependencies": {
"@types/node": "14.18.33",
"@typescript-eslint/eslint-plugin": "5.21.0", "@typescript-eslint/eslint-plugin": "5.21.0",
"@typescript-eslint/parser": "5.21.0", "@typescript-eslint/parser": "5.21.0",
"@vercel/build-utils": "*",
"@vercel/ncc": "0.24.0",
"async-retry": "1.2.3", "async-retry": "1.2.3",
"buffer-replace": "1.0.0", "buffer-replace": "1.0.0",
"create-svelte": "2.0.1", "create-svelte": "2.0.1",
"dot": "1.1.3",
"eslint": "8.14.0", "eslint": "8.14.0",
"eslint-config-prettier": "8.5.0", "eslint-config-prettier": "8.5.0",
"eslint-plugin-jest": "26.1.5", "eslint-plugin-jest": "26.1.5",
"execa": "3.2.0",
"fs-extra": "11.1.0",
"husky": "7.0.4", "husky": "7.0.4",
"jest": "28.0.2", "jest": "28.0.2",
"json5": "2.1.1", "json5": "2.1.1",
@@ -34,22 +29,24 @@
"node-fetch": "2.6.7", "node-fetch": "2.6.7",
"npm-package-arg": "6.1.0", "npm-package-arg": "6.1.0",
"prettier": "2.6.2", "prettier": "2.6.2",
"source-map-support": "0.5.12",
"ts-eager": "2.0.2", "ts-eager": "2.0.2",
"ts-jest": "28.0.5", "ts-jest": "28.0.5",
"turbo": "1.6.3" "turbo": "1.7.0"
}, },
"scripts": { "scripts": {
"lerna": "lerna", "lerna": "lerna",
"version": "pnpm install && git add pnpm-lock.yaml",
"bootstrap": "lerna bootstrap", "bootstrap": "lerna bootstrap",
"publish-stable": "echo 'Run `yarn changelog` for instructions'", "publish-stable": "echo 'Run `pnpm changelog` for instructions'",
"publish-canary": "git checkout main && git pull && lerna version prerelease --preid canary --message \"Publish Canary\" --exact", "publish-canary": "git checkout main && git pull && lerna version prerelease --preid canary --message \"Publish Canary\" --exact",
"publish-from-github": "./utils/publish.sh", "publish-from-github": "./utils/publish.sh",
"changelog": "node utils/changelog.js", "changelog": "node utils/changelog.js",
"build": "node utils/gen.js && turbo run build", "build": "node utils/gen.js && turbo run build",
"vercel-build": "yarn build && yarn run pack && cd api && node -r ts-eager/register ./_lib/script/build.ts", "vercel-build": "pnpm build && pnpm run pack && cd api && node -r ts-eager/register ./_lib/script/build.ts",
"pre-commit": "lint-staged", "pre-commit": "lint-staged",
"test": "jest --rootDir=\"test\" --testPathPattern=\"\\.test.js\"", "test": "jest --rootDir=\"test\" --testPathPattern=\"\\.test.js\"",
"test-unit": "yarn test && node utils/gen.js && turbo run test-unit", "test-unit": "pnpm test && node utils/gen.js && turbo run test-unit",
"test-integration-cli": "node utils/gen.js && turbo run test-integration-cli", "test-integration-cli": "node utils/gen.js && turbo run test-integration-cli",
"test-integration-once": "node utils/gen.js && turbo run test-integration-once", "test-integration-once": "node utils/gen.js && turbo run test-integration-once",
"test-integration-dev": "node utils/gen.js && turbo run test-integration-dev", "test-integration-dev": "node utils/gen.js && turbo run test-integration-dev",

View File

@@ -1,6 +1,6 @@
{ {
"name": "@vercel/build-utils", "name": "@vercel/build-utils",
"version": "5.7.4", "version": "6.0.1",
"license": "MIT", "license": "MIT",
"main": "./dist/index.js", "main": "./dist/index.js",
"types": "./dist/index.d.js", "types": "./dist/index.d.js",
@@ -13,8 +13,8 @@
"scripts": { "scripts": {
"build": "node build", "build": "node build",
"test": "jest --env node --verbose --runInBand --bail", "test": "jest --env node --verbose --runInBand --bail",
"test-unit": "yarn test test/unit.*test.*", "test-unit": "pnpm test test/unit.*test.*",
"test-integration-once": "yarn test test/integration.test.ts" "test-integration-once": "pnpm test test/integration.test.ts"
}, },
"devDependencies": { "devDependencies": {
"@iarna/toml": "2.2.3", "@iarna/toml": "2.2.3",
@@ -25,8 +25,10 @@
"@types/glob": "7.2.0", "@types/glob": "7.2.0",
"@types/jest": "27.4.1", "@types/jest": "27.4.1",
"@types/js-yaml": "3.12.1", "@types/js-yaml": "3.12.1",
"@types/minimatch": "^5.1.2",
"@types/ms": "0.7.31", "@types/ms": "0.7.31",
"@types/multistream": "2.1.1", "@types/multistream": "2.1.1",
"@types/node": "14.18.33",
"@types/node-fetch": "^2.1.6", "@types/node-fetch": "^2.1.6",
"@types/semver": "6.0.0", "@types/semver": "6.0.0",
"@types/yazl": "2.4.2", "@types/yazl": "2.4.2",
@@ -36,8 +38,10 @@
"async-sema": "2.1.4", "async-sema": "2.1.4",
"cross-spawn": "6.0.5", "cross-spawn": "6.0.5",
"end-of-stream": "1.4.1", "end-of-stream": "1.4.1",
"execa": "3.2.0",
"fs-extra": "10.0.0", "fs-extra": "10.0.0",
"glob": "8.0.3", "glob": "8.0.3",
"ignore": "4.0.6",
"into-stream": "5.0.0", "into-stream": "5.0.0",
"js-yaml": "3.13.1", "js-yaml": "3.13.1",
"minimatch": "3.0.4", "minimatch": "3.0.4",

View File

@@ -2,15 +2,20 @@ import path from 'path';
import debug from '../debug'; import debug from '../debug';
import FileFsRef from '../file-fs-ref'; import FileFsRef from '../file-fs-ref';
import { File, Files, Meta } from '../types'; import { File, Files, Meta } from '../types';
import { remove, mkdirp, readlink, symlink } from 'fs-extra'; import { remove, mkdirp, readlink, symlink, chmod } from 'fs-extra';
import streamToBuffer from './stream-to-buffer'; import streamToBuffer from './stream-to-buffer';
export interface DownloadedFiles { export interface DownloadedFiles {
[filePath: string]: FileFsRef; [filePath: string]: FileFsRef;
} }
const S_IFMT = 61440; /* 0170000 type of file */ const S_IFDIR = 16384; /* 0040000 directory */
const S_IFLNK = 40960; /* 0120000 symbolic link */ const S_IFLNK = 40960; /* 0120000 symbolic link */
const S_IFMT = 61440; /* 0170000 type of file */
export function isDirectory(mode: number): boolean {
return (mode & S_IFMT) === S_IFDIR;
}
export function isSymbolicLink(mode: number): boolean { export function isSymbolicLink(mode: number): boolean {
return (mode & S_IFMT) === S_IFLNK; return (mode & S_IFMT) === S_IFLNK;
@@ -46,6 +51,12 @@ export async function downloadFile(
): Promise<FileFsRef> { ): Promise<FileFsRef> {
const { mode } = file; const { mode } = file;
if (isDirectory(mode)) {
await mkdirp(fsPath);
await chmod(fsPath, mode);
return FileFsRef.fromFsPath({ mode, fsPath });
}
// If the source is a symlink, try to create it instead of copying the file.
// Note: creating symlinks on Windows requires admin priviliges or symlinks
// enabled in the group policy. We may want to improve the error message.
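The constants above are the standard POSIX `st_mode` file-type bit masks. A minimal sketch, not the package's actual exports, showing how the masking classifies entries; the sample mode values are the ones used later in this diff's tests:

```ts
// Sketch only: mirrors the S_IF* constants shown above.
const S_IFMT = 0o170000;  // 61440: mask selecting the file-type bits
const S_IFDIR = 0o040000; // 16384: directory
const S_IFLNK = 0o120000; // 40960: symbolic link

const isDirectory = (mode: number): boolean => (mode & S_IFMT) === S_IFDIR;
const isSymbolicLink = (mode: number): boolean => (mode & S_IFMT) === S_IFLNK;

console.log(isDirectory(16877));    // true,  0o040755 (drwxr-xr-x)
console.log(isSymbolicLink(41453)); // true,  0o120755 (a symlink)
console.log(isDirectory(33188));    // false, 0o100644 (a regular file)
```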

View File

@@ -6,7 +6,9 @@ import { lstat, Stats } from 'fs-extra';
import { normalizePath } from './normalize-path'; import { normalizePath } from './normalize-path';
import FileFsRef from '../file-fs-ref'; import FileFsRef from '../file-fs-ref';
export type GlobOptions = vanillaGlob_.IOptions; export interface GlobOptions extends vanillaGlob_.IOptions {
includeDirectories?: boolean;
}
const vanillaGlob = promisify(vanillaGlob_); const vanillaGlob = promisify(vanillaGlob_);
@@ -15,12 +17,7 @@ export default async function glob(
opts: GlobOptions | string, opts: GlobOptions | string,
mountpoint?: string mountpoint?: string
): Promise<Record<string, FileFsRef>> { ): Promise<Record<string, FileFsRef>> {
let options: GlobOptions; const options = typeof opts === 'string' ? { cwd: opts } : opts;
if (typeof opts === 'string') {
options = { cwd: opts };
} else {
options = opts;
}
if (!options.cwd) { if (!options.cwd) {
throw new Error( throw new Error(
@@ -34,13 +31,18 @@ export default async function glob(
const results: Record<string, FileFsRef> = {}; const results: Record<string, FileFsRef> = {};
const statCache: Record<string, Stats> = {}; const statCache: Record<string, Stats> = {};
const symlinks: Record<string, boolean | undefined> = {};
options.symlinks = {}; const files = await vanillaGlob(pattern, {
options.statCache = statCache; ...options,
options.stat = true; symlinks,
options.dot = true; statCache,
stat: true,
dot: true,
});
const files = await vanillaGlob(pattern, options); const dirs = new Set<string>();
const dirsWithEntries = new Set<string>();
for (const relativePath of files) { for (const relativePath of files) {
const fsPath = normalizePath(path.join(options.cwd, relativePath)); const fsPath = normalizePath(path.join(options.cwd, relativePath));
@@ -49,12 +51,20 @@ export default async function glob(
stat, stat,
`statCache does not contain value for ${relativePath} (resolved to ${fsPath})` `statCache does not contain value for ${relativePath} (resolved to ${fsPath})`
); );
const isSymlink = options.symlinks![fsPath]; const isSymlink = symlinks[fsPath];
if (isSymlink || stat.isFile()) { if (isSymlink || stat.isFile() || stat.isDirectory()) {
if (isSymlink) { if (isSymlink) {
stat = await lstat(fsPath); stat = await lstat(fsPath);
} }
// Some bookkeeping to track which directories already have entries within
const dirname = path.dirname(relativePath);
dirsWithEntries.add(dirname);
if (stat.isDirectory()) {
dirs.add(relativePath);
continue;
}
let finalPath = relativePath; let finalPath = relativePath;
if (mountpoint) { if (mountpoint) {
finalPath = path.join(mountpoint, finalPath); finalPath = path.join(mountpoint, finalPath);
@@ -64,5 +74,22 @@ export default async function glob(
} }
} }
// Add empty directory entries
if (options.includeDirectories) {
for (const relativePath of dirs) {
if (dirsWithEntries.has(relativePath)) continue;
let finalPath = relativePath;
if (mountpoint) {
finalPath = path.join(mountpoint, finalPath);
}
const fsPath = normalizePath(path.join(options.cwd, relativePath));
const stat = statCache[fsPath];
results[finalPath] = new FileFsRef({ mode: stat.mode, fsPath });
}
}
return results; return results;
} }
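A short usage sketch of the new `includeDirectories` option, assuming the `glob` and `isDirectory` exports added elsewhere in this diff; the mount directory and logging are illustrative only:

```ts
import { glob, isDirectory } from '@vercel/build-utils';

async function listWithEmptyDirs(cwd: string) {
  // Passing an options object (rather than a string cwd) lets callers opt in
  // to directory entries; empty directories then appear in the result map.
  const files = await glob('**', { cwd, includeDirectories: true });
  for (const [name, ref] of Object.entries(files)) {
    console.log(name, isDirectory(ref.mode) ? '(dir)' : '(file)');
  }
  return files;
}
```

Per the bookkeeping above, directories that already contain a matched file are skipped, so only otherwise-empty directories gain an entry of their own.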

View File

@@ -1,12 +1,17 @@
import { Files } from '../types'; import { Files } from '../types';
type Delegate = (name: string) => string; type Delegate = (name: string) => string;
/**
* Renames the keys of a `Files` map.
*
* @param files A map of filenames to `File` instances
* @param delegate A function that returns the new filename
* @returns A new file map with the renamed filenames
*/
export default function rename(files: Files, delegate: Delegate): Files { export default function rename(files: Files, delegate: Delegate): Files {
return Object.keys(files).reduce( const result: Files = {};
(newFiles, name) => ({ for (const [name, file] of Object.entries(files)) {
...newFiles, result[delegate(name)] = file;
[delegate(name)]: files[name], }
}), return result;
{}
);
} }
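For reference, a hedged sketch of how the reworked `rename()` is typically called, prefixing every key in a `Files` map; the `public/` mount point is illustrative:

```ts
import { glob, rename } from '@vercel/build-utils';

async function mountUnderPublic(cwd: string) {
  const files = await glob('**', cwd);
  // The delegate receives each filename and returns the new key;
  // the File values are carried over untouched.
  return rename(files, name => `public/${name}`);
}
```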

View File

@@ -111,53 +111,6 @@ export function spawnAsync(
}); });
} }
export function execAsync(
command: string,
args: string[],
opts: SpawnOptionsExtended = {}
) {
return new Promise<{ stdout: string; stderr: string; code: number }>(
(resolve, reject) => {
opts.stdio = 'pipe';
const stdoutList: Buffer[] = [];
const stderrList: Buffer[] = [];
const child = spawn(command, args, opts);
child.stderr!.on('data', data => {
stderrList.push(data);
});
child.stdout!.on('data', data => {
stdoutList.push(data);
});
child.on('error', reject);
child.on('close', (code, signal) => {
if (code === 0 || opts.ignoreNon0Exit) {
return resolve({
code,
stdout: Buffer.concat(stdoutList).toString(),
stderr: Buffer.concat(stderrList).toString(),
});
}
const cmd = opts.prettyCommand
? `Command "${opts.prettyCommand}"`
: 'Command';
return reject(
new NowBuildError({
code: `BUILD_UTILS_EXEC_${code || signal}`,
message: `${cmd} exited with ${code || signal}`,
})
);
});
}
);
}
export function spawnCommand(command: string, options: SpawnOptions = {}) { export function spawnCommand(command: string, options: SpawnOptions = {}) {
const opts = { ...options, prettyCommand: command }; const opts = { ...options, prettyCommand: command };
if (process.platform === 'win32') { if (process.platform === 'win32') {

View File

@@ -8,12 +8,12 @@ import download, {
downloadFile, downloadFile,
DownloadedFiles, DownloadedFiles,
isSymbolicLink, isSymbolicLink,
isDirectory,
} from './fs/download'; } from './fs/download';
import getWriteableDirectory from './fs/get-writable-directory'; import getWriteableDirectory from './fs/get-writable-directory';
import glob, { GlobOptions } from './fs/glob'; import glob, { GlobOptions } from './fs/glob';
import rename from './fs/rename'; import rename from './fs/rename';
import { import {
execAsync,
spawnAsync, spawnAsync,
execCommand, execCommand,
spawnCommand, spawnCommand,
@@ -58,7 +58,6 @@ export {
glob, glob,
GlobOptions, GlobOptions,
rename, rename,
execAsync,
spawnAsync, spawnAsync,
getScriptName, getScriptName,
installDependencies, installDependencies,
@@ -82,6 +81,7 @@ export {
streamToBuffer, streamToBuffer,
debug, debug,
isSymbolicLink, isSymbolicLink,
isDirectory,
getLambdaOptionsFromFunction, getLambdaOptionsFromFunction,
scanParentDirs, scanParentDirs,
getIgnoreFilter, getIgnoreFilter,

View File

@@ -3,7 +3,7 @@ import Sema from 'async-sema';
import { ZipFile } from 'yazl'; import { ZipFile } from 'yazl';
import minimatch from 'minimatch'; import minimatch from 'minimatch';
import { readlink } from 'fs-extra'; import { readlink } from 'fs-extra';
import { isSymbolicLink } from './fs/download'; import { isSymbolicLink, isDirectory } from './fs/download';
import streamToBuffer from './fs/stream-to-buffer'; import streamToBuffer from './fs/stream-to-buffer';
import type { Files, Config } from './types'; import type { Files, Config } from './types';
@@ -200,6 +200,8 @@ export async function createZip(files: Files): Promise<Buffer> {
const symlinkTarget = symlinkTargets.get(name); const symlinkTarget = symlinkTargets.get(name);
if (typeof symlinkTarget === 'string') { if (typeof symlinkTarget === 'string') {
zipFile.addBuffer(Buffer.from(symlinkTarget, 'utf8'), name, opts); zipFile.addBuffer(Buffer.from(symlinkTarget, 'utf8'), name, opts);
} else if (file.mode && isDirectory(file.mode)) {
zipFile.addEmptyDirectory(name, opts);
} else { } else {
const stream = file.toStream(); const stream = file.toStream();
stream.on('error', reject); stream.on('error', reject);
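A minimal sketch of the new branch in `createZip()`: entries whose mode marks them as directories are written with yazl's `addEmptyDirectory()` instead of being streamed as file data. Import paths and file names here are assumptions, not the repo's own example:

```ts
import fs from 'fs-extra';
import { FileBlob } from '@vercel/build-utils';
import { createZip } from '@vercel/build-utils/dist/lambda';

async function writeZipWithEmptyDir(outFile: string) {
  const files = {
    'empty-dir': new FileBlob({ mode: 16877, data: '' }),      // drwxr-xr-x
    'readme.txt': new FileBlob({ mode: 33188, data: 'hello' }), // -rw-r--r--
  };
  await fs.writeFile(outFile, await createZip(files));
  // Unzipping outFile yields readme.txt plus an empty "empty-dir" directory.
}
```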

View File

@@ -7,6 +7,9 @@
"dependencies": { "dependencies": {
"react": "16.8.0", "react": "16.8.0",
"swr": "1.3.0" "swr": "1.3.0"
},
"engines": {
"node": "16.x"
} }
}, },
"node_modules/js-tokens": { "node_modules/js-tokens": {

View File

@@ -1,5 +1,5 @@
{ {
"name": "a", "name": "a21",
"version": "1.0.0", "version": "1.0.0",
"description": "", "description": "",
"main": "index.js", "main": "index.js",

View File

@@ -1,5 +1,5 @@
{ {
"name": "b", "name": "b21",
"version": "1.0.0", "version": "1.0.0",
"description": "", "description": "",
"main": "index.js", "main": "index.js",

View File

@@ -1,5 +1,5 @@
{ {
"name": "21-npm-workspaces", "private": true,
"version": "1.0.0", "version": "1.0.0",
"workspaces": [ "workspaces": [
"a", "a",

View File

@@ -1,14 +1,8 @@
{ {
"name": "22-pnpm", "private": true,
"version": "1.0.0",
"description": "",
"main": "index.js",
"scripts": { "scripts": {
"build": "mkdir -p public && (printf \"pnpm version: \" && pnpm -v) > public/index.txt" "build": "mkdir -p public && (printf \"pnpm version: \" && pnpm -v) > public/index.txt"
}, },
"keywords": [],
"author": "",
"license": "ISC",
"dependencies": { "dependencies": {
"once": "^1.4.0" "once": "^1.4.0"
} }

View File

@@ -1,5 +1,5 @@
{ {
"name": "c", "name": "build-c23",
"license": "MIT", "license": "MIT",
"version": "0.1.0" "version": "0.1.0"
} }

View File

@@ -1,5 +1,5 @@
{ {
"name": "d", "name": "build-d23",
"license": "MIT", "license": "MIT",
"version": "0.1.0", "version": "0.1.0",
"devDependencies": { "devDependencies": {

View File

@@ -1,6 +1,5 @@
{ {
"private": "true", "private": true,
"name": "24-pnpm-hoisted",
"scripts": { "scripts": {
"build": "ls -Al node_modules && node index.js" "build": "ls -Al node_modules && node index.js"
}, },

View File

@@ -1,5 +1,5 @@
{ {
"name": "a", "name": "build-a25",
"version": "1.0.0", "version": "1.0.0",
"description": "", "description": "",
"main": "index.js", "main": "index.js",

View File

@@ -1,5 +1,5 @@
{ {
"name": "b", "name": "build-b25",
"version": "1.0.0", "version": "1.0.0",
"description": "", "description": "",
"main": "index.js", "main": "index.js",

View File

@@ -1,6 +1,5 @@
{ {
"private": "true", "private": true,
"name": "25-multiple-lock-files-yarn",
"workspaces": [ "workspaces": [
"a", "a",
"b" "b"

View File

@@ -1,5 +1,5 @@
{ {
"name": "a", "name": "build-a26",
"version": "1.0.0", "version": "1.0.0",
"description": "", "description": "",
"main": "index.js", "main": "index.js",

View File

@@ -1,5 +1,5 @@
{ {
"name": "b", "name": "build-b26",
"version": "1.0.0", "version": "1.0.0",
"description": "", "description": "",
"main": "index.js", "main": "index.js",

View File

@@ -1,6 +1,5 @@
{ {
"private": "true", "private": true,
"name": "26-multiple-lock-files-pnpm",
"workspaces": [ "workspaces": [
"a", "a",
"b" "b"

View File

@@ -0,0 +1,195 @@
import path from 'path';
import fs, { readlink } from 'fs-extra';
import { strict as assert, strictEqual } from 'assert';
import { download, glob, FileBlob } from '../src';
describe('download()', () => {
let warningMessages: string[];
const originalConsoleWarn = console.warn;
beforeEach(() => {
warningMessages = [];
console.warn = m => {
warningMessages.push(m);
};
});
afterEach(() => {
console.warn = originalConsoleWarn;
});
it('should re-create FileFsRef symlinks properly', async () => {
if (process.platform === 'win32') {
console.log('Skipping test on windows');
return;
}
const files = await glob('**', path.join(__dirname, 'symlinks'));
assert.equal(Object.keys(files).length, 4);
const outDir = path.join(__dirname, 'symlinks-out');
await fs.remove(outDir);
const files2 = await download(files, outDir);
assert.equal(Object.keys(files2).length, 4);
const [linkStat, linkDirStat, aStat] = await Promise.all([
fs.lstat(path.join(outDir, 'link.txt')),
fs.lstat(path.join(outDir, 'link-dir')),
fs.lstat(path.join(outDir, 'a.txt')),
]);
assert(linkStat.isSymbolicLink());
assert(linkDirStat.isSymbolicLink());
assert(aStat.isFile());
const [linkDirContents, linkTextContents] = await Promise.all([
readlink(path.join(outDir, 'link-dir')),
readlink(path.join(outDir, 'link.txt')),
]);
strictEqual(linkDirContents, 'dir');
strictEqual(linkTextContents, './a.txt');
});
it('should re-create FileBlob symlinks properly', async () => {
if (process.platform === 'win32') {
console.log('Skipping test on windows');
return;
}
const files = {
'a.txt': new FileBlob({
mode: 33188,
contentType: undefined,
data: 'a text',
}),
'dir/b.txt': new FileBlob({
mode: 33188,
contentType: undefined,
data: 'b text',
}),
'link-dir': new FileBlob({
mode: 41453,
contentType: undefined,
data: 'dir',
}),
'link.txt': new FileBlob({
mode: 41453,
contentType: undefined,
data: 'a.txt',
}),
};
strictEqual(Object.keys(files).length, 4);
const outDir = path.join(__dirname, 'symlinks-out');
await fs.remove(outDir);
const files2 = await download(files, outDir);
strictEqual(Object.keys(files2).length, 4);
const [linkStat, linkDirStat, aStat, dirStat] = await Promise.all([
fs.lstat(path.join(outDir, 'link.txt')),
fs.lstat(path.join(outDir, 'link-dir')),
fs.lstat(path.join(outDir, 'a.txt')),
fs.lstat(path.join(outDir, 'dir')),
]);
assert(linkStat.isSymbolicLink());
assert(linkDirStat.isSymbolicLink());
assert(aStat.isFile());
assert(dirStat.isDirectory());
const [linkDirContents, linkTextContents] = await Promise.all([
readlink(path.join(outDir, 'link-dir')),
readlink(path.join(outDir, 'link.txt')),
]);
strictEqual(linkDirContents, 'dir');
strictEqual(linkTextContents, 'a.txt');
});
it('should download symlinks even with incorrect file', async () => {
if (process.platform === 'win32') {
console.log('Skipping test on windows');
return;
}
const files = {
'dir/file.txt': new FileBlob({
mode: 33188,
contentType: undefined,
data: 'file text',
}),
linkdir: new FileBlob({
mode: 41453,
contentType: undefined,
data: 'dir',
}),
'linkdir/file.txt': new FileBlob({
mode: 33188,
contentType: undefined,
data: 'this file should be discarded',
}),
};
const outDir = path.join(__dirname, 'symlinks-out');
await fs.remove(outDir);
await fs.mkdirp(outDir);
await download(files, outDir);
const [dir, file, linkdir] = await Promise.all([
fs.lstat(path.join(outDir, 'dir')),
fs.lstat(path.join(outDir, 'dir/file.txt')),
fs.lstat(path.join(outDir, 'linkdir')),
]);
expect(dir.isFile()).toBe(false);
expect(dir.isSymbolicLink()).toBe(false);
expect(file.isFile()).toBe(true);
expect(file.isSymbolicLink()).toBe(false);
expect(linkdir.isSymbolicLink()).toBe(true);
expect(warningMessages).toEqual([
'Warning: file "linkdir/file.txt" is within a symlinked directory "linkdir" and will be ignored',
]);
});
it('should create empty directory entries', async () => {
const outDir = path.join(__dirname, 'symlinks-out');
await fs.remove(outDir);
const files = {
'empty-dir': new FileBlob({
mode: 16877, // drwxr-xr-x
contentType: undefined,
data: '',
}),
dir: new FileBlob({
mode: 16877,
contentType: undefined,
data: '',
}),
'dir/subdir': new FileBlob({
mode: 16877,
contentType: undefined,
data: '',
}),
'another/subdir': new FileBlob({
mode: 16895, // drwxrwxrwx
contentType: undefined,
data: '',
}),
};
await download(files, outDir);
for (const [p, f] of Object.entries(files)) {
const stat = await fs.lstat(path.join(outDir, p));
expect(stat.isDirectory()).toEqual(true);
if (process.platform !== 'win32') {
// Don't test Windows since it doesn't support the same permissions
expect(stat.mode).toEqual(f.mode);
}
}
});
});

View File

@@ -1,29 +0,0 @@
import { execAsync, NowBuildError } from '../src';
it('should execute a command', async () => {
const { code, stdout, stderr } = await execAsync('echo', ['hello']);
expect(code).toBe(0);
expect(stdout).toContain('hello');
expect(stderr).toBe('');
});
it('should throw if the command exits with non-0 code', async () => {
await expect(execAsync('find', ['unknown-file'])).rejects.toBeInstanceOf(
NowBuildError
);
});
it('should return if the command exits with non-0 code and ignoreNon0Exit=true', async () => {
const { code, stdout, stderr } = await execAsync('find', ['unknown-file'], {
ignoreNon0Exit: true,
});
expect(code).toBe(process.platform === 'win32' ? 2 : 1);
expect(stdout).toBe('');
expect(stderr).toContain(
process.platform === 'win32'
? 'Parameter format not correct'
: 'No such file or directory'
);
});

View File

@@ -0,0 +1,65 @@
import fs from 'fs-extra';
import { join } from 'path';
import { tmpdir } from 'os';
import { glob, isDirectory } from '../src';
describe('glob()', () => {
it('should not return entries for empty directories by default', async () => {
const dir = await fs.mkdtemp(join(tmpdir(), 'build-utils-test'));
try {
await Promise.all([
fs.writeFile(join(dir, 'root.txt'), 'file at the root'),
fs.mkdirp(join(dir, 'empty-dir')),
fs
.mkdirp(join(dir, 'dir-with-file'))
.then(() =>
fs.writeFile(join(dir, 'dir-with-file/data.json'), '{"a":"b"}')
),
fs.mkdirp(join(dir, 'another/subdir')),
]);
const files = await glob('**', dir);
const fileNames = Object.keys(files).sort();
expect(fileNames).toHaveLength(2);
expect(fileNames).toEqual(['dir-with-file/data.json', 'root.txt']);
expect(isDirectory(files['dir-with-file/data.json'].mode)).toEqual(false);
expect(isDirectory(files['root.txt'].mode)).toEqual(false);
expect(files['dir-with-file']).toBeUndefined();
expect(files['another/subdir']).toBeUndefined();
expect(files['empty-dir']).toBeUndefined();
} finally {
await fs.remove(dir);
}
});
it('should return entries for empty directories with `includeDirectories: true`', async () => {
const dir = await fs.mkdtemp(join(tmpdir(), 'build-utils-test'));
try {
await Promise.all([
fs.writeFile(join(dir, 'root.txt'), 'file at the root'),
fs.mkdirp(join(dir, 'empty-dir')),
fs
.mkdirp(join(dir, 'dir-with-file'))
.then(() =>
fs.writeFile(join(dir, 'dir-with-file/data.json'), '{"a":"b"}')
),
fs.mkdirp(join(dir, 'another/subdir')),
]);
const files = await glob('**', { cwd: dir, includeDirectories: true });
const fileNames = Object.keys(files).sort();
expect(fileNames).toHaveLength(4);
expect(fileNames).toEqual([
'another/subdir',
'dir-with-file/data.json',
'empty-dir',
'root.txt',
]);
expect(isDirectory(files['another/subdir'].mode)).toEqual(true);
expect(isDirectory(files['empty-dir'].mode)).toEqual(true);
expect(isDirectory(files['dir-with-file/data.json'].mode)).toEqual(false);
expect(isDirectory(files['root.txt'].mode)).toEqual(false);
expect(files['dir-with-file']).toBeUndefined();
} finally {
await fs.remove(dir);
}
});
});

View File

@@ -0,0 +1,97 @@
import path from 'path';
import { tmpdir } from 'os';
import fs from 'fs-extra';
import { createZip } from '../src/lambda';
import { FileBlob, glob, spawnAsync } from '../src';
const MODE_DIRECTORY = 16877; /* drwxr-xr-x */
const MODE_FILE = 33188; /* -rw-r--r-- */
describe('Lambda', () => {
it('should create zip file with symlinks', async () => {
if (process.platform === 'win32') {
console.log('Skipping test on windows');
return;
}
const files = await glob('**', path.join(__dirname, 'symlinks'));
expect(Object.keys(files)).toHaveLength(4);
const outFile = path.join(__dirname, 'symlinks.zip');
await fs.remove(outFile);
const outDir = path.join(__dirname, 'symlinks-out');
await fs.remove(outDir);
await fs.mkdirp(outDir);
await fs.writeFile(outFile, await createZip(files));
await spawnAsync('unzip', [outFile], { cwd: outDir });
const [linkStat, linkDirStat, aStat] = await Promise.all([
fs.lstat(path.join(outDir, 'link.txt')),
fs.lstat(path.join(outDir, 'link-dir')),
fs.lstat(path.join(outDir, 'a.txt')),
]);
expect(linkStat.isSymbolicLink()).toEqual(true);
expect(linkDirStat.isSymbolicLink()).toEqual(true);
expect(aStat.isFile()).toEqual(true);
});
it('should create zip file with empty directory', async () => {
if (process.platform === 'win32') {
console.log('Skipping test on windows');
return;
}
const dir = await fs.mkdtemp(path.join(tmpdir(), 'create-zip-empty-dir'));
try {
const files = {
a: new FileBlob({
data: 'contents',
mode: MODE_FILE,
}),
empty: new FileBlob({
data: '',
mode: MODE_DIRECTORY,
}),
'b/a': new FileBlob({
data: 'inside dir b',
mode: MODE_FILE,
}),
c: new FileBlob({
data: '',
mode: MODE_DIRECTORY,
}),
'c/a': new FileBlob({
data: 'inside dir c',
mode: MODE_FILE,
}),
};
const outFile = path.join(dir, 'lambda.zip');
const outDir = path.join(dir, 'out');
await fs.mkdirp(outDir);
await fs.writeFile(outFile, await createZip(files));
await spawnAsync('unzip', [outFile], { cwd: outDir });
expect(fs.statSync(path.join(outDir, 'empty')).isDirectory()).toEqual(
true
);
expect(fs.statSync(path.join(outDir, 'b')).isDirectory()).toEqual(true);
expect(fs.statSync(path.join(outDir, 'c')).isDirectory()).toEqual(true);
expect(fs.readFileSync(path.join(outDir, 'a'), 'utf8')).toEqual(
'contents'
);
expect(fs.readFileSync(path.join(outDir, 'b/a'), 'utf8')).toEqual(
'inside dir b'
);
expect(fs.readFileSync(path.join(outDir, 'c/a'), 'utf8')).toEqual(
'inside dir c'
);
expect(fs.readdirSync(path.join(outDir, 'empty'))).toHaveLength(0);
} finally {
await fs.remove(dir);
}
});
});

View File

@@ -1,22 +1,20 @@
import ms from 'ms'; import ms from 'ms';
import path from 'path'; import path from 'path';
import fs, { readlink } from 'fs-extra'; import fs from 'fs-extra';
import { strict as assert, strictEqual } from 'assert'; import { strict as assert } from 'assert';
import { createZip } from '../src/lambda';
import { getSupportedNodeVersion } from '../src/fs/node-version'; import { getSupportedNodeVersion } from '../src/fs/node-version';
import download from '../src/fs/download';
import { import {
glob, FileBlob,
spawnAsync,
getNodeVersion, getNodeVersion,
getLatestNodeVersion, getLatestNodeVersion,
getDiscontinuedNodeVersions, getDiscontinuedNodeVersions,
rename,
runNpmInstall, runNpmInstall,
runPackageJsonScript, runPackageJsonScript,
scanParentDirs, scanParentDirs,
FileBlob,
Prerender, Prerender,
} from '../src'; } from '../src';
import type { Files } from '../src';
jest.setTimeout(10 * 1000); jest.setTimeout(10 * 1000);
@@ -51,171 +49,6 @@ afterEach(() => {
console.warn = originalConsoleWarn; console.warn = originalConsoleWarn;
}); });
it('should re-create FileFsRef symlinks properly', async () => {
if (process.platform === 'win32') {
console.log('Skipping test on windows');
return;
}
const files = await glob('**', path.join(__dirname, 'symlinks'));
assert.equal(Object.keys(files).length, 4);
const outDir = path.join(__dirname, 'symlinks-out');
await fs.remove(outDir);
const files2 = await download(files, outDir);
assert.equal(Object.keys(files2).length, 4);
const [linkStat, linkDirStat, aStat] = await Promise.all([
fs.lstat(path.join(outDir, 'link.txt')),
fs.lstat(path.join(outDir, 'link-dir')),
fs.lstat(path.join(outDir, 'a.txt')),
]);
assert(linkStat.isSymbolicLink());
assert(linkDirStat.isSymbolicLink());
assert(aStat.isFile());
const [linkDirContents, linkTextContents] = await Promise.all([
readlink(path.join(outDir, 'link-dir')),
readlink(path.join(outDir, 'link.txt')),
]);
strictEqual(linkDirContents, 'dir');
strictEqual(linkTextContents, './a.txt');
});
it('should re-create FileBlob symlinks properly', async () => {
if (process.platform === 'win32') {
console.log('Skipping test on windows');
return;
}
const files = {
'a.txt': new FileBlob({
mode: 33188,
contentType: undefined,
data: 'a text',
}),
'dir/b.txt': new FileBlob({
mode: 33188,
contentType: undefined,
data: 'b text',
}),
'link-dir': new FileBlob({
mode: 41453,
contentType: undefined,
data: 'dir',
}),
'link.txt': new FileBlob({
mode: 41453,
contentType: undefined,
data: 'a.txt',
}),
};
strictEqual(Object.keys(files).length, 4);
const outDir = path.join(__dirname, 'symlinks-out');
await fs.remove(outDir);
const files2 = await download(files, outDir);
strictEqual(Object.keys(files2).length, 4);
const [linkStat, linkDirStat, aStat, dirStat] = await Promise.all([
fs.lstat(path.join(outDir, 'link.txt')),
fs.lstat(path.join(outDir, 'link-dir')),
fs.lstat(path.join(outDir, 'a.txt')),
fs.lstat(path.join(outDir, 'dir')),
]);
assert(linkStat.isSymbolicLink());
assert(linkDirStat.isSymbolicLink());
assert(aStat.isFile());
assert(dirStat.isDirectory());
const [linkDirContents, linkTextContents] = await Promise.all([
readlink(path.join(outDir, 'link-dir')),
readlink(path.join(outDir, 'link.txt')),
]);
strictEqual(linkDirContents, 'dir');
strictEqual(linkTextContents, 'a.txt');
});
it('should create zip files with symlinks properly', async () => {
if (process.platform === 'win32') {
console.log('Skipping test on windows');
return;
}
const files = await glob('**', path.join(__dirname, 'symlinks'));
assert.equal(Object.keys(files).length, 4);
const outFile = path.join(__dirname, 'symlinks.zip');
await fs.remove(outFile);
const outDir = path.join(__dirname, 'symlinks-out');
await fs.remove(outDir);
await fs.mkdirp(outDir);
await fs.writeFile(outFile, await createZip(files));
await spawnAsync('unzip', [outFile], { cwd: outDir });
const [linkStat, linkDirStat, aStat] = await Promise.all([
fs.lstat(path.join(outDir, 'link.txt')),
fs.lstat(path.join(outDir, 'link-dir')),
fs.lstat(path.join(outDir, 'a.txt')),
]);
assert(linkStat.isSymbolicLink());
assert(linkDirStat.isSymbolicLink());
assert(aStat.isFile());
});
it('should download symlinks even with incorrect file', async () => {
if (process.platform === 'win32') {
console.log('Skipping test on windows');
return;
}
const files = {
'dir/file.txt': new FileBlob({
mode: 33188,
contentType: undefined,
data: 'file text',
}),
linkdir: new FileBlob({
mode: 41453,
contentType: undefined,
data: 'dir',
}),
'linkdir/file.txt': new FileBlob({
mode: 33188,
contentType: undefined,
data: 'this file should be discarded',
}),
};
const outDir = path.join(__dirname, 'symlinks-out');
await fs.remove(outDir);
await fs.mkdirp(outDir);
await download(files, outDir);
const [dir, file, linkdir] = await Promise.all([
fs.lstat(path.join(outDir, 'dir')),
fs.lstat(path.join(outDir, 'dir/file.txt')),
fs.lstat(path.join(outDir, 'linkdir')),
]);
expect(dir.isFile()).toBe(false);
expect(dir.isSymbolicLink()).toBe(false);
expect(file.isFile()).toBe(true);
expect(file.isSymbolicLink()).toBe(false);
expect(linkdir.isSymbolicLink()).toBe(true);
expect(warningMessages).toEqual([
'Warning: file "linkdir/file.txt" is within a symlinked directory "linkdir" and will be ignored',
]);
});
it('should only match supported node versions, otherwise throw an error', async () => {
  expect(await getSupportedNodeVersion('14.x', false)).toHaveProperty(
    'major',
@@ -474,21 +307,21 @@ it('should support initialHeaders and initialStatus correctly', async () => {
});
it('should support require by path for legacy builders', () => {
-  const index = require('@vercel/build-utils');
-  const download2 = require('@vercel/build-utils/fs/download.js');
-  const getWriteableDirectory2 = require('@vercel/build-utils/fs/get-writable-directory.js');
-  const glob2 = require('@vercel/build-utils/fs/glob.js');
-  const rename2 = require('@vercel/build-utils/fs/rename.js');
+  const index = require('../');
+  const download2 = require('../fs/download.js');
+  const getWriteableDirectory2 = require('../fs/get-writable-directory.js');
+  const glob2 = require('../fs/glob.js');
+  const rename2 = require('../fs/rename.js');
  const {
    runNpmInstall: runNpmInstall2,
-  } = require('@vercel/build-utils/fs/run-user-scripts.js');
+  } = require('../fs/run-user-scripts.js');
-  const streamToBuffer2 = require('@vercel/build-utils/fs/stream-to-buffer.js');
-  const FileBlob2 = require('@vercel/build-utils/file-blob.js');
-  const FileFsRef2 = require('@vercel/build-utils/file-fs-ref.js');
-  const FileRef2 = require('@vercel/build-utils/file-ref.js');
-  const { Lambda: Lambda2 } = require('@vercel/build-utils/lambda.js');
+  const streamToBuffer2 = require('../fs/stream-to-buffer.js');
+  const FileBlob2 = require('../file-blob.js');
+  const FileFsRef2 = require('../file-fs-ref.js');
+  const FileRef2 = require('../file-ref.js');
+  const { Lambda: Lambda2 } = require('../lambda.js');
  expect(download2).toBe(index.download);
  expect(getWriteableDirectory2).toBe(index.getWriteableDirectory);
@@ -590,9 +423,9 @@ it('should detect package.json in nested backend', async () => {
    '../../node/test/fixtures/18.1-nested-packagejson/backend'
  );
  const result = await scanParentDirs(fixture);
-  expect(result.cliType).toEqual('yarn');
-  expect(result.lockfileVersion).toEqual(undefined);
-  // There is no lockfile but this test will pick up vercel/vercel/yarn.lock
+  expect(result.cliType).toEqual('pnpm');
+  // There is no lockfile but this test will pick up vercel/vercel/pnpm-lock.yaml
+  expect(result.lockfileVersion).toEqual(5.4);
  expect(result.packageJsonPath).toEqual(path.join(fixture, 'package.json'));
});
@@ -602,9 +435,9 @@ it('should detect package.json in nested frontend', async () => {
    '../../node/test/fixtures/18.1-nested-packagejson/frontend'
  );
  const result = await scanParentDirs(fixture);
-  expect(result.cliType).toEqual('yarn');
-  expect(result.lockfileVersion).toEqual(undefined);
-  // There is no lockfile but this test will pick up vercel/vercel/yarn.lock
+  expect(result.cliType).toEqual('pnpm');
+  // There is no lockfile but this test will pick up vercel/vercel/pnpm-lock.yaml
+  expect(result.lockfileVersion).toEqual(5.4);
  expect(result.packageJsonPath).toEqual(path.join(fixture, 'package.json'));
});
@@ -621,3 +454,18 @@ it('should retry npm install when peer deps invalid and npm@8 on node@16', async
    'Warning: Retrying "Install Command" with `--legacy-peer-deps` which may accept a potentially broken dependency and slow install time.',
  ]);
});
describe('rename', () => {
it('should rename keys of files map', () => {
const before: Files = {};
const toUpper = (s: string) => s.toUpperCase();
for (let i = 97; i <= 122; i++) {
const key = String.fromCharCode(i);
before[key] = new FileBlob({ contentType: 'text/plain', data: key });
}
const after = rename(before, toUpper);
expect(Object.keys(after)).toEqual('ABCDEFGHIJKLMNOPQRSTUVWXYZ'.split(''));
});
});

View File

@@ -2,7 +2,7 @@ import { walkParentDirs } from '../src';
import { strict } from 'assert';
import { join } from 'path';
import { promises } from 'fs';
-const { deepEqual, notDeepEqual, fail } = strict;
+const { notDeepEqual, fail } = strict;
const { readFile } = promises;
const fixture = (name: string) => join(__dirname, 'walk', name);
const filename = 'file.txt';
@@ -10,7 +10,7 @@ const filename = 'file.txt';
async function assertContent(target: string | null, contents: string) {
  notDeepEqual(target, null);
  const actual = await readFile(target!, 'utf8');
-  deepEqual(actual.trim(), contents.trim());
+  strict.deepEqual(actual.trim(), contents.trim());
}
describe('Test `walkParentDirs`', () => {
@@ -21,7 +21,7 @@ describe('Test `walkParentDirs`', () => {
      await walkParentDirs({ base, start, filename });
      fail('Expected error');
    } catch (error) {
-      deepEqual(
+      strict.deepEqual(
        (error as Error).message,
        'Expected "base" to be absolute path'
      );
@@ -35,7 +35,7 @@ describe('Test `walkParentDirs`', () => {
      await walkParentDirs({ base, start, filename });
      fail('Expected error');
    } catch (error) {
-      deepEqual(
+      strict.deepEqual(
        (error as Error).message,
        'Expected "start" to be absolute path'
      );
@@ -67,21 +67,21 @@ describe('Test `walkParentDirs`', () => {
    const base = fixture('not-found');
    const start = base;
    const target = await walkParentDirs({ base, start, filename });
-    deepEqual(target, null);
+    strict.deepEqual(target, null);
  });
  it('should not find nested two', async () => {
    const base = fixture('not-found');
    const start = join(base, 'two');
    const target = await walkParentDirs({ base, start, filename });
-    deepEqual(target, null);
+    strict.deepEqual(target, null);
  });
  it('should not find nested three', async () => {
    const base = fixture('not-found');
    const start = join(base, 'two', 'three');
    const target = await walkParentDirs({ base, start, filename });
-    deepEqual(target, null);
+    strict.deepEqual(target, null);
  });
  it('should find only one', async () => {

View File

@@ -41,8 +41,8 @@ To develop Vercel CLI, first check out the source code, install dependencies, an
```bash
git clone https://github.com/vercel/vercel.git
cd vercel
-yarn
-yarn build
+pnpm install
+pnpm build
```

At this point you can make modifications to the CLI source code and test them out locally. The CLI source code is located in the `packages/cli` directory.

@@ -51,15 +51,15 @@ At this point you can make modifications to the CLI source code and test them ou
cd packages/cli
```

-### `yarn dev <cli-commands...>`
+### `pnpm dev <cli-commands...>`

From within the `packages/cli` directory, you can use the "dev" script to quickly execute Vercel CLI from its TypeScript source code directly (without having to manually compile first). For example:

```bash
-yarn dev deploy
-yarn dev whoami
-yarn dev login
-yarn dev switch --debug
+pnpm dev deploy
+pnpm dev whoami
+pnpm dev login
+pnpm dev switch --debug
```

When you are satisfied with your changes, make a commit and create a pull request!
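As a rough sketch of the full loop described above (assuming the pnpm-based scripts introduced in this changeset, including the CLI's `test-unit` script shown in its `package.json` below), a local iteration might look like:

```bash
# One-time setup: install dependencies and build all packages
pnpm install
pnpm build

# Iterate on the CLI directly from TypeScript source
cd packages/cli
pnpm dev whoami

# Run the CLI unit tests before opening a pull request
pnpm test-unit
```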

View File

@@ -1,6 +1,6 @@
{
  "name": "vercel",
-  "version": "28.11.0",
+  "version": "28.14.1",
  "preferGlobal": true,
  "license": "Apache-2.0",
  "description": "The command-line interface for Vercel",
@@ -13,9 +13,9 @@
  "scripts": {
    "preinstall": "node ./scripts/preinstall.js",
    "test": "jest --env node --verbose --bail",
-    "test-unit": "yarn test test/unit/",
+    "test-unit": "pnpm test test/unit/",
    "test-integration-cli": "rimraf test/fixtures/integration && ava test/integration.js --serial --fail-fast --verbose",
-    "test-integration-dev": "yarn test test/dev/",
+    "test-integration-dev": "pnpm test test/dev/",
    "coverage": "codecov",
    "build": "ts-node ./scripts/build.ts",
    "dev": "ts-node ./src/index.ts"
@@ -41,17 +41,16 @@
    "node": ">= 14"
  },
  "dependencies": {
-    "@vercel/build-utils": "5.7.4",
-    "@vercel/go": "2.2.23",
-    "@vercel/hydrogen": "0.0.37",
-    "@vercel/next": "3.3.8",
-    "@vercel/node": "2.8.5",
-    "@vercel/python": "3.1.33",
-    "@vercel/redwood": "1.0.44",
-    "@vercel/remix": "1.1.6",
-    "@vercel/ruby": "1.3.49",
-    "@vercel/static-build": "1.1.0",
-    "update-notifier": "5.1.0"
+    "@vercel/build-utils": "6.0.1",
+    "@vercel/go": "2.3.0",
+    "@vercel/hydrogen": "0.0.46",
+    "@vercel/next": "3.3.20",
+    "@vercel/node": "2.8.17",
+    "@vercel/python": "3.1.42",
+    "@vercel/redwood": "1.0.53",
+    "@vercel/remix": "1.2.9",
+    "@vercel/ruby": "1.3.58",
+    "@vercel/static-build": "1.3.1"
  },
  "devDependencies": {
    "@alex_neo/jest-expect-message": "1.0.5",
@@ -84,6 +83,7 @@
    "@types/npm-package-arg": "6.1.0",
    "@types/pluralize": "0.0.29",
    "@types/psl": "1.1.0",
+    "@types/qs": "6.9.7",
    "@types/semver": "6.0.1",
    "@types/tar-fs": "1.16.1",
    "@types/text-table": "0.2.0",
@@ -93,12 +93,13 @@
    "@types/which": "1.3.2",
    "@types/write-json-file": "2.2.1",
    "@types/yauzl-promise": "2.1.0",
-    "@vercel/client": "12.2.25",
-    "@vercel/error-utils": "1.0.3",
-    "@vercel/frameworks": "1.1.18",
-    "@vercel/fs-detectors": "3.6.1",
+    "@vercel/client": "12.3.4",
+    "@vercel/error-utils": "1.0.8",
+    "@vercel/frameworks": "1.3.0",
+    "@vercel/fs-detectors": "3.7.7",
    "@vercel/fun": "1.0.4",
    "@vercel/ncc": "0.24.0",
+    "@vercel/routing-utils": "2.1.8",
    "@zeit/source-map-support": "0.6.2",
    "ajv": "6.12.2",
    "alpha-sort": "2.0.1",
@@ -138,6 +139,7 @@
    "is-port-reachable": "3.1.0",
    "is-url": "1.2.2",
    "jaro-winkler": "0.2.8",
+    "jest-matcher-utils": "29.3.1",
    "json5": "2.2.1",
    "jsonlines": "0.1.1",
    "line-async-iterator": "3.0.0",
@@ -169,7 +171,7 @@
    "tmp-promise": "1.0.3",
    "tree-kill": "1.2.2",
    "ts-node": "10.9.1",
-    "typescript": "4.7.4",
+    "typescript": "4.9.4",
    "universal-analytics": "0.4.20",
    "utility-types": "2.1.0",
    "write-json-file": "2.2.0",

View File

@@ -1,7 +1,7 @@
import cpy from 'cpy';
import execa from 'execa';
import { join } from 'path';
-import { remove, writeFile } from 'fs-extra';
+import { remove, readJSON, writeFile } from 'fs-extra';

const dirRoot = join(__dirname, '..');
const distRoot = join(dirRoot, 'dist');
@@ -43,16 +43,16 @@ async function main() {
    stdio: 'inherit',
  });

+  const pkg = await readJSON(join(dirRoot, 'package.json'));
+  const dependencies = Object.keys(pkg?.dependencies ?? {});

  // Do the initial `ncc` build
-  console.log();
-  const args = [
-    'ncc',
-    'build',
-    '--external',
-    'update-notifier',
-    'src/index.ts',
-  ];
-  await execa('yarn', args, { stdio: 'inherit', cwd: dirRoot });
+  console.log('Dependencies:', dependencies);
+  const externs: Array<string> = [];
+  for (const dep of dependencies) {
+    externs.push('--external', dep);
+  }
+  const args = ['ncc', 'build', 'src/index.ts', ...externs];
+  await execa('pnpm', args, { stdio: 'inherit', cwd: dirRoot });

  // `ncc` has some issues with `@vercel/fun`'s runtime files:
  // - Executable bits on the `bootstrap` files appear to be lost:
@@ -66,10 +66,7 @@ async function main() {
  // get compiled into the final ncc bundle file, however, we want them to be
  // present in the npm package because the contents of those files are involved
  // with `fun`'s cache invalidation mechanism and they need to be shasum'd.
-  const runtimes = join(
-    dirRoot,
-    '../../node_modules/@vercel/fun/dist/src/runtimes'
-  );
+  const runtimes = join(dirRoot, 'node_modules/@vercel/fun/dist/src/runtimes');
  await cpy('**/*', join(distRoot, 'runtimes'), {
    parents: true,
    cwd: runtimes,
@@ -78,6 +75,10 @@ async function main() {
  // Band-aid to bundle stuff that `ncc` neglects to bundle
  await cpy(join(dirRoot, 'src/util/projects/VERCEL_DIR_README.txt'), distRoot);
  await cpy(join(dirRoot, 'src/util/dev/builder-worker.js'), distRoot);
+  await cpy(
+    join(dirRoot, 'src/util/get-latest-version/get-latest-worker.js'),
+    distRoot
+  );

  console.log('Finished building Vercel CLI');
}
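For reference, a sketch of what the new `ncc` invocation above expands to at run time; the package names here are illustrative, taken from the `dependencies` list elsewhere in this changeset, and the real argument list is built from whatever `package.json` declares:

```bash
# Hypothetical expansion of `execa('pnpm', args)` above, assuming package.json
# lists "@vercel/build-utils" and "@vercel/node" as dependencies:
pnpm ncc build src/index.ts \
  --external @vercel/build-utils \
  --external @vercel/node
```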

View File

@@ -51,7 +51,7 @@ export default async function ls(
    ...paginationOptions
  );
  output.log(`aliases found under ${chalk.bold(contextName)} ${lsStamp()}`);
-  output.log(printAliasTable(aliases));
+  client.stdout.write(printAliasTable(aliases));
  if (pagination && pagination.count === 20) {
    const flags = getCommandFlags(opts, ['_', '--next']);

View File

@@ -6,7 +6,7 @@ import { Output } from '../../util/output';
import * as ERRORS from '../../util/errors-ts';
import assignAlias from '../../util/alias/assign-alias';
import Client from '../../util/client';
-import getDeploymentByIdOrHost from '../../util/deploy/get-deployment-by-id-or-host';
+import getDeployment from '../../util/get-deployment';
import { getDeploymentForAlias } from '../../util/alias/get-deployment-by-alias';
import getScope from '../../util/get-scope';
import setupDomain from '../../util/domains/setup-domain';
@@ -136,36 +136,13 @@ export default async function set(
  const [deploymentIdOrHost, aliasTarget] = args;
  const deployment = handleCertError(
    output,
-    await getDeploymentByIdOrHost(client, contextName, deploymentIdOrHost)
+    await getDeployment(client, contextName, deploymentIdOrHost)
  );
  if (deployment === 1) {
    return deployment;
  }
-  if (deployment instanceof ERRORS.DeploymentNotFound) {
-    output.error(
-      `Failed to find deployment "${deployment.meta.id}" under ${chalk.bold(
-        contextName
-      )}`
-    );
-    return 1;
-  }
-  if (deployment instanceof ERRORS.DeploymentPermissionDenied) {
-    output.error(
-      `No permission to access deployment "${
-        deployment.meta.id
-      }" under ${chalk.bold(deployment.meta.context)}`
-    );
-    return 1;
-  }
-  if (deployment instanceof ERRORS.InvalidDeploymentId) {
-    output.error(deployment.message);
-    return 1;
-  }
  if (deployment === null) {
    output.error(
      `Couldn't find a deployment to alias. Please provide one as an argument.`

View File

@@ -15,17 +15,11 @@ import Client from '../../util/client';
import { getPkgName } from '../../util/pkg-name'; import { getPkgName } from '../../util/pkg-name';
import { Deployment, PaginationOptions } from '../../types'; import { Deployment, PaginationOptions } from '../../types';
import { normalizeURL } from '../../util/bisect/normalize-url'; import { normalizeURL } from '../../util/bisect/normalize-url';
import getScope from '../../util/get-scope';
interface DeploymentV6 import getDeployment from '../../util/get-deployment';
extends Pick<
Deployment,
'url' | 'target' | 'projectId' | 'ownerId' | 'meta' | 'inspectorUrl'
> {
createdAt: number;
}
interface Deployments { interface Deployments {
deployments: DeploymentV6[]; deployments: Deployment[];
pagination: PaginationOptions; pagination: PaginationOptions;
} }
@@ -63,6 +57,8 @@ const help = () => {
export default async function main(client: Client): Promise<number> { export default async function main(client: Client): Promise<number> {
const { output } = client; const { output } = client;
const scope = await getScope(client);
const { contextName } = scope;
const argv = getArgs(client.argv.slice(2), { const argv = getArgs(client.argv.slice(2), {
'--bad': String, '--bad': String,
@@ -145,7 +141,9 @@ export default async function main(client: Client): Promise<number> {
output.spinner('Retrieving deployments…'); output.spinner('Retrieving deployments…');
// `getDeployment` cannot be parallelized because it might prompt for login // `getDeployment` cannot be parallelized because it might prompt for login
const badDeployment = await getDeployment(client, bad).catch(err => err); const badDeployment = await getDeployment(client, contextName, bad).catch(
err => err
);
if (badDeployment) { if (badDeployment) {
if (badDeployment instanceof Error) { if (badDeployment instanceof Error) {
@@ -162,7 +160,9 @@ export default async function main(client: Client): Promise<number> {
} }
// `getDeployment` cannot be parallelized because it might prompt for login // `getDeployment` cannot be parallelized because it might prompt for login
const goodDeployment = await getDeployment(client, good).catch(err => err); const goodDeployment = await getDeployment(client, contextName, good).catch(
err => err
);
if (goodDeployment) { if (goodDeployment) {
if (goodDeployment instanceof Error) { if (goodDeployment instanceof Error) {
@@ -204,7 +204,7 @@ export default async function main(client: Client): Promise<number> {
} }
// Fetch all the project's "READY" deployments with the pagination API // Fetch all the project's "READY" deployments with the pagination API
let deployments: DeploymentV6[] = []; let deployments: Deployment[] = [];
let next: number | undefined = badDeployment.createdAt + 1; let next: number | undefined = badDeployment.createdAt + 1;
do { do {
const query = new URLSearchParams(); const query = new URLSearchParams();
@@ -279,7 +279,7 @@ export default async function main(client: Client): Promise<number> {
const commit = getCommit(deployment); const commit = getCommit(deployment);
if (commit) { if (commit) {
const shortSha = commit.sha.substring(0, 7); const shortSha = commit.sha.substring(0, 7);
const firstLine = commit.message.split('\n')[0]; const firstLine = commit.message?.split('\n')[0];
output.log(`${chalk.bold('Commit:')} [${shortSha}] ${firstLine}`); output.log(`${chalk.bold('Commit:')} [${shortSha}] ${firstLine}`);
} }
@@ -356,7 +356,7 @@ export default async function main(client: Client): Promise<number> {
const commit = getCommit(lastBad); const commit = getCommit(lastBad);
if (commit) { if (commit) {
const shortSha = commit.sha.substring(0, 7); const shortSha = commit.sha.substring(0, 7);
const firstLine = commit.message.split('\n')[0]; const firstLine = commit.message?.split('\n')[0];
result.push(` ${chalk.bold('Commit:')} [${shortSha}] ${firstLine}`); result.push(` ${chalk.bold('Commit:')} [${shortSha}] ${firstLine}`);
} }
@@ -368,18 +368,7 @@ export default async function main(client: Client): Promise<number> {
return 0; return 0;
} }
function getDeployment( function getCommit(deployment: Deployment) {
client: Client,
hostname: string
): Promise<DeploymentV6> {
const query = new URLSearchParams();
query.set('url', hostname);
query.set('resolve', '1');
query.set('noState', '1');
return client.fetch<DeploymentV6>(`/v10/deployments/get?${query}`);
}
function getCommit(deployment: DeploymentV6) {
const sha = const sha =
deployment.meta?.githubCommitSha || deployment.meta?.githubCommitSha ||
deployment.meta?.gitlabCommitSha || deployment.meta?.gitlabCommitSha ||

View File

@@ -17,7 +17,11 @@ import {
BuildResultV3, BuildResultV3,
NowBuildError, NowBuildError,
} from '@vercel/build-utils'; } from '@vercel/build-utils';
import { detectBuilders } from '@vercel/fs-detectors'; import {
detectBuilders,
detectFrameworkRecord,
LocalFileSystemDetector,
} from '@vercel/fs-detectors';
import minimatch from 'minimatch'; import minimatch from 'minimatch';
import { import {
appendRoutesToPhase, appendRoutesToPhase,
@@ -59,6 +63,9 @@ import { toEnumerableError } from '../util/error';
import { validateConfig } from '../util/validate-config'; import { validateConfig } from '../util/validate-config';
import { setMonorepoDefaultSettings } from '../util/build/monorepo'; import { setMonorepoDefaultSettings } from '../util/build/monorepo';
import frameworks from '@vercel/frameworks';
import { detectFrameworkVersion } from '@vercel/fs-detectors';
import semver from 'semver';
type BuildResult = BuildResultV2 | BuildResultV3; type BuildResult = BuildResultV2 | BuildResultV3;
@@ -69,6 +76,20 @@ interface SerializedBuilder extends Builder {
apiVersion: number; apiVersion: number;
} }
/**
* Build Output API `config.json` file interface.
*/
interface BuildOutputConfig {
version?: 3;
wildcard?: BuildResultV2Typical['wildcard'];
images?: BuildResultV2Typical['images'];
routes?: BuildResultV2Typical['routes'];
overrides?: Record<string, PathOverride>;
framework?: {
version: string;
};
}
/** /**
* Contents of the `builds.json` file. * Contents of the `builds.json` file.
*/ */
@@ -434,7 +455,7 @@ async function doBuild(
// Execute Builders for detected entrypoints // Execute Builders for detected entrypoints
// TODO: parallelize builds (except for frontend) // TODO: parallelize builds (except for frontend)
const sortedBuilders = sortBuilders(builds); const sortedBuilders = sortBuilders(builds);
const buildResults: Map<Builder, BuildResult> = new Map(); const buildResults: Map<Builder, BuildResult | BuildOutputConfig> = new Map();
const overrides: PathOverride[] = []; const overrides: PathOverride[] = [];
const repoRootPath = cwd; const repoRootPath = cwd;
const corepackShimDir = await initCorepack({ repoRootPath }); const corepackShimDir = await initCorepack({ repoRootPath });
@@ -538,8 +559,7 @@ async function doBuild(
// Merge existing `config.json` file into the one that will be produced // Merge existing `config.json` file into the one that will be produced
const configPath = join(outputDir, 'config.json'); const configPath = join(outputDir, 'config.json');
// TODO: properly type const existingConfig = await readJSONFile<BuildOutputConfig>(configPath);
const existingConfig = await readJSONFile<any>(configPath);
if (existingConfig instanceof CantParseJSONFile) { if (existingConfig instanceof CantParseJSONFile) {
throw existingConfig; throw existingConfig;
} }
@@ -585,15 +605,17 @@ async function doBuild(
const mergedOverrides: Record<string, PathOverride> = const mergedOverrides: Record<string, PathOverride> =
overrides.length > 0 ? Object.assign({}, ...overrides) : undefined; overrides.length > 0 ? Object.assign({}, ...overrides) : undefined;
const framework = await getFramework(cwd, buildResults);
// Write out the final `config.json` file based on the // Write out the final `config.json` file based on the
// user configuration and Builder build results // user configuration and Builder build results
// TODO: properly type const config: BuildOutputConfig = {
const config = {
version: 3, version: 3,
routes: mergedRoutes, routes: mergedRoutes,
images: mergedImages, images: mergedImages,
wildcard: mergedWildcard, wildcard: mergedWildcard,
overrides: mergedOverrides, overrides: mergedOverrides,
framework,
}; };
await fs.writeJSON(join(outputDir, 'config.json'), config, { spaces: 2 }); await fs.writeJSON(join(outputDir, 'config.json'), config, { spaces: 2 });
@@ -608,6 +630,50 @@ async function doBuild(
); );
} }
async function getFramework(
cwd: string,
buildResults: Map<Builder, BuildResult | BuildOutputConfig>
): Promise<{ version: string } | undefined> {
const detectedFramework = await detectFrameworkRecord({
fs: new LocalFileSystemDetector(cwd),
frameworkList: frameworks,
});
if (!detectedFramework) {
return;
}
// determine framework version from build result
if (detectedFramework.useRuntime) {
for (const [build, buildResult] of buildResults.entries()) {
if (
'framework' in buildResult &&
build.use === detectedFramework.useRuntime.use
) {
return buildResult.framework;
}
}
}
// determine framework version from listed package.json version
if (detectedFramework.detectedVersion) {
// check for a valid, explicit version, not a range
if (semver.valid(detectedFramework.detectedVersion)) {
return {
version: detectedFramework.detectedVersion,
};
}
}
// determine framework version with runtime lookup
const frameworkVersion = detectFrameworkVersion(detectedFramework);
if (frameworkVersion) {
return {
version: frameworkVersion,
};
}
}
function expandBuild(files: string[], build: Builder): Builder[] { function expandBuild(files: string[], build: Builder): Builder[] {
if (!build.use) { if (!build.use) {
throw new NowBuildError({ throw new NowBuildError({
@@ -648,7 +714,7 @@ function expandBuild(files: string[], build: Builder): Builder[] {
function mergeImages( function mergeImages(
images: BuildResultV2Typical['images'], images: BuildResultV2Typical['images'],
buildResults: Iterable<BuildResult> buildResults: Iterable<BuildResult | BuildOutputConfig>
): BuildResultV2Typical['images'] { ): BuildResultV2Typical['images'] {
for (const result of buildResults) { for (const result of buildResults) {
if ('images' in result && result.images) { if ('images' in result && result.images) {
@@ -659,7 +725,7 @@ function mergeImages(
} }
function mergeWildcard( function mergeWildcard(
buildResults: Iterable<BuildResult> buildResults: Iterable<BuildResult | BuildOutputConfig>
): BuildResultV2Typical['wildcard'] { ): BuildResultV2Typical['wildcard'] {
let wildcard: BuildResultV2Typical['wildcard'] = undefined; let wildcard: BuildResultV2Typical['wildcard'] = undefined;
for (const result of buildResults) { for (const result of buildResults) {

View File

@@ -55,7 +55,7 @@ async function ls(
  );
  if (certs.length > 0) {
-    output.log(formatCertsTable(certs));
+    client.stdout.write(formatCertsTable(certs));
  }
  if (pagination && pagination.count === 20) {

View File

@@ -19,15 +19,13 @@ import toHumanPath from '../../util/humanize-path';
import Now from '../../util';
import stamp from '../../util/output/stamp';
import createDeploy from '../../util/deploy/create-deploy';
-import getDeploymentByIdOrHost from '../../util/deploy/get-deployment-by-id-or-host';
+import getDeployment from '../../util/get-deployment';
import parseMeta from '../../util/parse-meta';
import linkStyle from '../../util/output/link';
import param from '../../util/output/param';
import {
  BuildsRateLimited,
  DeploymentNotFound,
-  DeploymentPermissionDenied,
-  InvalidDeploymentId,
  DomainNotFound,
  DomainNotVerified,
  DomainPermissionDenied,
@@ -629,21 +627,8 @@ export default async (client: Client): Promise<number> => {
    return 1;
  }
-  const deploymentResponse = await getDeploymentByIdOrHost(
-    client,
-    contextName,
-    deployment.id,
-    'v10'
-  );
-  if (
-    deploymentResponse instanceof DeploymentNotFound ||
-    deploymentResponse instanceof DeploymentPermissionDenied ||
-    deploymentResponse instanceof InvalidDeploymentId
-  ) {
-    output.error(deploymentResponse.message);
-    return 1;
-  }
+  // get the deployment just to double check that it actually deployed
+  await getDeployment(client, contextName, deployment.id);
  if (deployment === null) {
    error('Uploading failed. Please try again.');

View File

@@ -70,7 +70,7 @@ export default async function ls(
      records.length > 0 ? 'Records' : 'No records'
    } found under ${chalk.bold(contextName)} ${chalk.gray(lsStamp())}`
  );
-  output.log(getDNSRecordsTable([{ domainName, records }]));
+  client.stdout.write(getDNSRecordsTable([{ domainName, records }]));
  if (pagination && pagination.count === 20) {
    const flags = getCommandFlags(opts, ['_', '--next']);

View File

@@ -34,7 +34,7 @@ const help = () => {
  )} Connect your Vercel Project to your Git repository defined in your local .git config
      ${chalk.cyan(`$ ${getPkgName()} git connect`)}
  ${chalk.gray(
    ''
  )} Connect your Vercel Project to a Git repository using the remote URL
@@ -96,6 +96,7 @@ export default async function main(client: Client) {
  }
  const { org, project } = linkedProject;
+  client.config.currentTeam = org.type === 'team' ? org.id : undefined;
  switch (subcommand) {
    case 'connect':

View File

@@ -9,12 +9,10 @@ import { handleError } from '../util/error';
import getScope from '../util/get-scope'; import getScope from '../util/get-scope';
import { getPkgName, getCommandName } from '../util/pkg-name'; import { getPkgName, getCommandName } from '../util/pkg-name';
import Client from '../util/client'; import Client from '../util/client';
import { getDeployment } from '../util/get-deployment'; import getDeployment from '../util/get-deployment';
import { Deployment } from '@vercel/client'; import { Build, Deployment } from '../types';
import { Build } from '../types';
import title from 'title'; import title from 'title';
import { isErrnoException } from '@vercel/error-utils'; import { isErrnoException } from '@vercel/error-utils';
import { isAPIError } from '../util/errors-ts';
import { URL } from 'url'; import { URL } from 'url';
const help = () => { const help = () => {
@@ -49,7 +47,6 @@ const help = () => {
}; };
export default async function main(client: Client) { export default async function main(client: Client) {
let deployment;
let argv; let argv;
try { try {
@@ -101,30 +98,11 @@ export default async function main(client: Client) {
); );
// resolve the deployment, since we might have been given an alias // resolve the deployment, since we might have been given an alias
try { const deployment = await getDeployment(
deployment = await getDeployment(client, deploymentIdOrHost); client,
} catch (err: unknown) { contextName,
if (isAPIError(err)) { deploymentIdOrHost
if (err.status === 404) { );
error(
`Failed to find deployment "${deploymentIdOrHost}" in ${chalk.bold(
contextName
)}`
);
return 1;
}
if (err.status === 403) {
error(
`No permission to access deployment "${deploymentIdOrHost}" in ${chalk.bold(
contextName
)}`
);
return 1;
}
}
// unexpected
throw err;
}
const { const {
id, id,
@@ -138,11 +116,11 @@ export default async function main(client: Client) {
const { builds } = const { builds } =
deployment.version === 2 deployment.version === 2
? await client.fetch<{ builds: Build[] }>(`/v1/deployments/${id}/builds`) ? await client.fetch<{ builds: Build[] }>(`/v11/deployments/${id}/builds`)
: { builds: [] }; : { builds: [] };
log( log(
`Fetched deployment ${chalk.bold(url)} in ${chalk.bold( `Fetched deployment "${chalk.bold(url)}" in ${chalk.bold(
contextName contextName
)} ${elapsed(Date.now() - depFetchStart)}` )} ${elapsed(Date.now() - depFetchStart)}`
); );
@@ -163,7 +141,7 @@ export default async function main(client: Client) {
} }
print('\n\n'); print('\n\n');
if (aliases.length > 0) { if (aliases !== undefined && aliases.length > 0) {
print(chalk.bold(' Aliases\n\n')); print(chalk.bold(' Aliases\n\n'));
let aliasList = ''; let aliasList = '';
for (const alias of aliases) { for (const alias of aliases) {
@@ -202,8 +180,6 @@ function stateString(s: Deployment['readyState']) {
switch (s) { switch (s) {
case 'INITIALIZING': case 'INITIALIZING':
case 'BUILDING': case 'BUILDING':
case 'DEPLOYING':
case 'ANALYZING':
return chalk.yellow(CIRCLE) + sTitle; return chalk.yellow(CIRCLE) + sTitle;
case 'ERROR': case 'ERROR':
return chalk.red(CIRCLE) + sTitle; return chalk.red(CIRCLE) + sTitle;

View File

@@ -7,8 +7,7 @@ import getScope from '../util/get-scope';
import { getPkgName } from '../util/pkg-name';
import getArgs from '../util/get-args';
import Client from '../util/client';
-import { getDeployment } from '../util/get-deployment';
-import { isAPIError } from '../util/errors-ts';
+import getDeployment from '../util/get-deployment';

const help = () => {
  console.log(`
@@ -125,28 +124,9 @@ export default async function main(client: Client) {
  let deployment;
  try {
-    deployment = await getDeployment(client, id);
-  } catch (err: unknown) {
+    deployment = await getDeployment(client, contextName, id);
+  } finally {
    output.stopSpinner();
-    if (isAPIError(err)) {
-      if (err.status === 404) {
-        output.error(
-          `Failed to find deployment "${id}" in ${chalk.bold(contextName)}`
-        );
-        return 1;
-      }
-      if (err.status === 403) {
-        output.error(
-          `No permission to access deployment "${id}" in ${chalk.bold(
-            contextName
-          )}`
-        );
-        return 1;
-      }
-    }
-    // unexpected
-    throw err;
  }

  output.log(

View File

@@ -11,7 +11,7 @@ import getScope from '../util/get-scope';
import { isValidName } from '../util/is-valid-name'; import { isValidName } from '../util/is-valid-name';
import removeProject from '../util/projects/remove-project'; import removeProject from '../util/projects/remove-project';
import getProjectByIdOrName from '../util/projects/get-project-by-id-or-name'; import getProjectByIdOrName from '../util/projects/get-project-by-id-or-name';
import getDeploymentByIdOrHost from '../util/deploy/get-deployment-by-id-or-host'; import getDeployment from '../util/get-deployment';
import getDeploymentsByProjectId, { import getDeploymentsByProjectId, {
DeploymentPartial, DeploymentPartial,
} from '../util/deploy/get-deployments-by-project-id'; } from '../util/deploy/get-deployments-by-project-id';
@@ -133,7 +133,7 @@ export default async function main(client: Client) {
id => id =>
d && d &&
!(d instanceof NowError) && !(d instanceof NowError) &&
(d.uid === id || d.name === id || d.url === normalizeURL(id)) (d.id === id || d.name === id || d.url === normalizeURL(id))
); );
const [deploymentList, projectList] = await Promise.all<any>([ const [deploymentList, projectList] = await Promise.all<any>([
@@ -142,7 +142,7 @@ export default async function main(client: Client) {
if (!contextName) { if (!contextName) {
throw new Error('Context name is not defined'); throw new Error('Context name is not defined');
} }
return getDeploymentByIdOrHost(client, contextName, idOrHost); return getDeployment(client, contextName, idOrHost).catch(err => err);
}) })
), ),
Promise.all( Promise.all(
@@ -180,7 +180,7 @@ export default async function main(client: Client) {
aliases = await Promise.all( aliases = await Promise.all(
deployments.map(async depl => { deployments.map(async depl => {
const { aliases } = await getAliases(client, depl.uid); const { aliases } = await getAliases(client, depl.id);
return aliases; return aliases;
}) })
); );
@@ -238,7 +238,7 @@ export default async function main(client: Client) {
const start = Date.now(); const start = Date.now();
await Promise.all<any>([ await Promise.all<any>([
...deployments.map(depl => now.remove(depl.uid, { hard })), ...deployments.map(depl => now.remove(depl.id, { hard })),
...projects.map(project => removeProject(client, project.id)), ...projects.map(project => removeProject(client, project.id)),
]); ]);
@@ -275,9 +275,9 @@ function readConfirmation(
const deploymentTable = table( const deploymentTable = table(
deployments.map(depl => { deployments.map(depl => {
const time = chalk.gray(`${ms(Date.now() - depl.created)} ago`); const time = chalk.gray(`${ms(Date.now() - depl.createdAt)} ago`);
const url = depl.url ? chalk.underline(`https://${depl.url}`) : ''; const url = depl.url ? chalk.underline(`https://${depl.url}`) : '';
return [` ${depl.uid}`, url, time]; return [` ${depl.id}`, url, time];
}), }),
{ align: ['l', 'r', 'l'], hsep: ' '.repeat(6) } { align: ['l', 'r', 'l'], hsep: ' '.repeat(6) }
); );

View File

@@ -18,7 +18,7 @@ import sourceMap from '@zeit/source-map-support';
import { mkdirp } from 'fs-extra';
import chalk from 'chalk';
import epipebomb from 'epipebomb';
-import updateNotifier from 'update-notifier';
+import getLatestVersion from './util/get-latest-version';
import { URL } from 'url';
import * as Sentry from '@sentry/node';
import hp from './util/humanize-path';
@@ -55,13 +55,6 @@ import { VercelConfig } from '@vercel/client';
const isCanary = pkg.version.includes('canary');

-// Checks for available update and returns an instance
-const notifier = updateNotifier({
-  pkg,
-  distTag: isCanary ? 'canary' : 'latest',
-  updateCheckInterval: 1000 * 60 * 60 * 24 * 7, // 1 week
-});

const VERCEL_DIR = getGlobalPathConfig();
const VERCEL_CONFIG_PATH = configFiles.getConfigFilePath();
const VERCEL_AUTH_CONFIG_PATH = configFiles.getAuthConfigFilePath();
@@ -149,22 +142,26 @@ const main = async () => {
  }

  // Print update information, if available
-  if (notifier.update && notifier.update.latest !== pkg.version && isTTY) {
-    const { latest } = notifier.update;
-    console.log(
-      info(
+  if (isTTY && !process.env.NO_UPDATE_NOTIFIER) {
+    // Check if an update is available. If so, `latest` will contain a string
+    // of the latest version, otherwise `undefined`.
+    const latest = getLatestVersion({
+      distTag: isCanary ? 'canary' : 'latest',
+      output,
+      pkg,
+    });
+    if (latest) {
+      output.log(
        `${chalk.black.bgCyan('UPDATE AVAILABLE')} ` +
          `Run ${cmd(
            await getUpdateCommand()
          )} to install ${getTitleName()} CLI ${latest}`
-      )
-    );
-    console.log(
-      info(
-        `Changelog: https://github.com/vercel/vercel/releases/tag/vercel@${latest}`
-      )
-    );
+      );
+      output.log(
+        `Changelog: https://github.com/vercel/vercel/releases/tag/vercel@${latest}\n`
+      );
+    }
  }

  // The second argument to the command can be:
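Since the new guard above also respects the `NO_UPDATE_NOTIFIER` environment variable, the check can be skipped explicitly, for example in scripts (a sketch; any non-empty value works because the code only tests whether the variable is set):

```bash
# Skip the CLI update check for a single invocation
NO_UPDATE_NOTIFIER=1 vercel deploy
```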

View File

@@ -1,4 +1,6 @@
import type { BuilderFunctions } from '@vercel/build-utils';
import type { Readable, Writable } from 'stream'; import type { Readable, Writable } from 'stream';
import type { Route } from '@vercel/routing-utils';
export type ProjectSettings = import('@vercel/build-utils').ProjectSettings; export type ProjectSettings = import('@vercel/build-utils').ProjectSettings;
@@ -116,32 +118,105 @@ export type Cert = {
expiration: string; expiration: string;
}; };
type RouteOrMiddleware =
| Route
| {
src: string;
continue: boolean;
middleware: 0;
};
export type Deployment = { export type Deployment = {
uid: string; alias?: string[];
url: string; aliasAssigned?: boolean | null | number;
aliasError?: null | { code: string; message: string };
aliasFinal?: string | null;
aliasWarning?: null | {
code: string;
message: string;
link?: string;
action?: string;
};
bootedAt?: number;
build?: { env: string[] };
builds?: { use: string; src?: string; config?: { [key: string]: any } };
buildErrorAt?: number;
buildingAt: number;
canceledAt?: number;
checksState?: 'completed' | 'registered' | 'running';
checksConclusion?: 'canceled' | 'failed' | 'skipped' | 'succeeded';
createdAt: number;
createdIn?: string;
creator: { uid: string; username?: string };
env?: string[];
errorCode?: string;
errorLink?: string;
errorMessage?: string | null;
errorStep?: string;
functions?: BuilderFunctions | null;
gitSource?: {
org?: string;
owner?: string;
prId?: number | null;
projectId: number;
ref?: string | null;
repoId?: number;
repoUuid: string;
sha?: string;
slug?: string;
type: string;
workspaceUuid: string;
};
id: string;
initReadyAt?: number;
inspectorUrl?: string | null;
lambdas?: Build[];
meta?: {
[key: string]: string | undefined;
};
monorepoManager?: string | null;
name: string; name: string;
type: 'LAMBDAS'; ownerId?: string;
state: plan?: 'enterprise' | 'hobby' | 'oss' | 'pro';
previewCommentsEnabled?: boolean;
projectId?: string;
projectSettings?: {
buildCommand?: string | null;
devCommand?: string | null;
framework?: string;
installCommand?: string | null;
outputDirectory?: string | null;
};
public: boolean;
ready?: number;
readyState:
| 'BUILDING' | 'BUILDING'
| 'ERROR' | 'ERROR'
| 'INITIALIZING' | 'INITIALIZING'
| 'QUEUED' | 'QUEUED'
| 'READY' | 'READY'
| 'CANCELED'; | 'CANCELED';
version?: number; regions: string[];
created: number; routes?: RouteOrMiddleware[] | null;
createdAt: number; source?: 'cli' | 'git' | 'import' | 'import/repo' | 'clone/repo';
ready?: number; status:
buildingAt?: number; | 'BUILDING'
creator: { uid: string; username: string }; | 'ERROR'
target: string | null; | 'INITIALIZING'
ownerId: string; | 'QUEUED'
projectId: string; | 'READY'
inspectorUrl: string; | 'CANCELED';
meta: { target?: 'staging' | 'production' | null;
[key: string]: any; team?: {
id: string;
name: string;
slug: string;
}; };
alias?: string[]; ttyBuildLogs?: boolean;
type: 'LAMBDAS';
url: string;
userAliases?: string[];
version: 2;
}; };
export type Alias = { export type Alias = {

View File

@@ -1,4 +1,4 @@
-import { Deployment } from '../../types';
+import type { Deployment } from '../../types';
import { Output } from '../output';
import Client from '../client';
import createAlias from './create-alias';

View File

@@ -1,4 +1,4 @@
-import { Deployment } from '../../types';
+import type { Deployment } from '../../types';
import { Output } from '../output';
import * as ERRORS from '../errors-ts';
import Client from '../client';
@@ -62,7 +62,7 @@ async function performCreateAlias(
) {
  try {
    return await client.fetch<AliasRecord>(
-      `/now/deployments/${deployment.uid}/aliases`,
+      `/now/deployments/${deployment.id}/aliases`,
      {
        method: 'POST',
        body: { alias },
@@ -79,7 +79,7 @@ async function performCreateAlias(
    if (err.code === 'deployment_not_found') {
      return new ERRORS.DeploymentNotFound({
        context: contextName,
-        id: deployment.uid,
+        id: deployment.id,
      });
    }
    if (err.code === 'gone') {

View File

@@ -5,7 +5,7 @@ import { Output } from '../output';
import { User } from '../../types';
import { VercelConfig } from '../dev/types';
import getDeploymentsByAppName from '../deploy/get-deployments-by-appname';
-import getDeploymentByIdOrHost from '../deploy/get-deployment-by-id-or-host';
+import getDeployment from '../get-deployment';

async function getAppLastDeployment(
  output: Output,
@@ -22,7 +22,7 @@ async function getAppLastDeployment(
  // Try to fetch deployment details
  if (deploymentItem) {
-    return getDeploymentByIdOrHost(client, contextName, deploymentItem.uid);
+    return await getDeployment(client, contextName, deploymentItem.uid);
  }

  return null;
@@ -42,13 +42,11 @@ export async function getDeploymentForAlias(
  // When there are no args at all we try to get the targets from the config
  if (args.length === 2) {
    const [deploymentId] = args;
-    const deployment = await getDeploymentByIdOrHost(
-      client,
-      contextName,
-      deploymentId
-    );
-    output.stopSpinner();
-    return deployment;
+    try {
+      return await getDeployment(client, contextName, deploymentId);
+    } finally {
+      output.stopSpinner();
+    }
  }

  const appName =
@@ -59,13 +57,15 @@ export async function getDeploymentForAlias(
    return null;
  }

-  const deployment = await getAppLastDeployment(
-    output,
-    client,
-    appName,
-    user,
-    contextName
-  );
-  output.stopSpinner();
-  return deployment;
+  try {
+    return await getAppLastDeployment(
+      output,
+      client,
+      appName,
+      user,
+      contextName
+    );
+  } finally {
+    output.stopSpinner();
+  }
}

View File

@@ -5,18 +5,14 @@ import { satisfies } from 'semver';
import { dirname, join } from 'path'; import { dirname, join } from 'path';
import { mkdirp, outputJSON, readJSON, symlink } from 'fs-extra'; import { mkdirp, outputJSON, readJSON, symlink } from 'fs-extra';
import { isStaticRuntime } from '@vercel/fs-detectors'; import { isStaticRuntime } from '@vercel/fs-detectors';
import { import { BuilderV2, BuilderV3, PackageJson } from '@vercel/build-utils';
BuilderV2, import execa from 'execa';
BuilderV3,
PackageJson,
spawnAsync,
} from '@vercel/build-utils';
import * as staticBuilder from './static-builder'; import * as staticBuilder from './static-builder';
import { VERCEL_DIR } from '../projects/link'; import { VERCEL_DIR } from '../projects/link';
import { Output } from '../output'; import { Output } from '../output';
import readJSONFile from '../read-json-file'; import readJSONFile from '../read-json-file';
import { CantParseJSONFile } from '../errors-ts'; import { CantParseJSONFile } from '../errors-ts';
import { errorToString, isErrnoException, isError } from '@vercel/error-utils'; import { isErrnoException, isError } from '@vercel/error-utils';
import cmd from '../output/cmd'; import cmd from '../output/cmd';
import code from '../output/code'; import code from '../output/code';
@@ -213,32 +209,44 @@ async function installBuilders(
).join(', ')}` ).join(', ')}`
); );
try { try {
await spawnAsync( const { stderr } = await execa(
'npm', 'npm',
['install', '@vercel/build-utils', ...buildersToAdd], ['install', '@vercel/build-utils', ...buildersToAdd],
{ {
cwd: buildersDir, cwd: buildersDir,
stdio: 'pipe', stdio: 'pipe',
reject: true,
} }
); );
stderr
.split('/\r?\n/')
.filter(line => line.includes('npm WARN deprecated'))
.forEach(line => {
output.warn(line);
});
} catch (err: unknown) { } catch (err: unknown) {
if (isError(err)) { if (isError(err)) {
(err as any).link = const execaMessage = err.message;
'https://vercel.link/builder-dependencies-install-failed'; let message =
if (isErrnoException(err) && err.code === 'ENOENT') { err && 'stderr' in err && typeof err.stderr === 'string'
? err.stderr
: execaMessage;
if (execaMessage.startsWith('Command failed with ENOENT')) {
// `npm` is not installed // `npm` is not installed
err.message = `Please install ${cmd('npm')} before continuing`; message = `Please install ${cmd('npm')} before continuing`;
} else { } else {
const message = errorToString(err);
const notFound = /GET (.*) - Not found/.exec(message); const notFound = /GET (.*) - Not found/.exec(message);
if (notFound) { if (notFound) {
const url = new URL(notFound[1]); const url = new URL(notFound[1]);
const packageName = decodeURIComponent(url.pathname.slice(1)); const packageName = decodeURIComponent(url.pathname.slice(1));
err.message = `The package ${code( message = `The package ${code(
packageName packageName
)} is not published on the npm registry`; )} is not published on the npm registry`;
} }
} }
err.message = message;
(err as any).link =
'https://vercel.link/builder-dependencies-install-failed';
} }
throw err; throw err;
} }

View File

@@ -30,7 +30,7 @@ import {
import pipe from 'promisepipe'; import pipe from 'promisepipe';
import { unzip } from './unzip'; import { unzip } from './unzip';
import { VERCEL_DIR } from '../projects/link'; import { VERCEL_DIR } from '../projects/link';
import { VercelConfig } from '@vercel/client'; import { fileNameSymbol, VercelConfig } from '@vercel/client';
const { normalize } = posix; const { normalize } = posix;
export const OUTPUT_DIR = join(VERCEL_DIR, 'output'); export const OUTPUT_DIR = join(VERCEL_DIR, 'output');
@@ -56,6 +56,7 @@ export async function writeBuildResult(
return writeBuildResultV2( return writeBuildResultV2(
outputDir, outputDir,
buildResult as BuildResultV2, buildResult as BuildResultV2,
build,
vercelConfig vercelConfig
); );
} else if (version === 3) { } else if (version === 3) {
@@ -107,6 +108,7 @@ function stripDuplicateSlashes(path: string): string {
async function writeBuildResultV2( async function writeBuildResultV2(
outputDir: string, outputDir: string,
buildResult: BuildResultV2, buildResult: BuildResultV2,
build: Builder,
vercelConfig: VercelConfig | null vercelConfig: VercelConfig | null
) { ) {
if ('buildOutputPath' in buildResult) { if ('buildOutputPath' in buildResult) {
@@ -114,6 +116,18 @@ async function writeBuildResultV2(
return; return;
} }
// Some very old `@now` scoped Builders return `output` at the top-level.
// These Builders are no longer supported.
if (!buildResult.output) {
const configFile = vercelConfig?.[fileNameSymbol];
const updateMessage = build.use.startsWith('@now/')
? ` Please update from "@now" to "@vercel" in your \`${configFile}\` file.`
: '';
throw new Error(
`The build result from "${build.use}" is missing the "output" property.${updateMessage}`
);
}
const lambdas = new Map<Lambda, string>(); const lambdas = new Map<Lambda, string>();
const overrides: Record<string, PathOverride> = {}; const overrides: Record<string, PathOverride> = {};
for (const [path, output] of Object.entries(buildResult.output)) { for (const [path, output] of Object.entries(buildResult.output)) {

View File

@@ -1,78 +0,0 @@
import type Client from '../client';
import toHost from '../to-host';
import { Deployment } from '../../types';
import {
DeploymentNotFound,
DeploymentPermissionDenied,
InvalidDeploymentId,
isAPIError,
} from '../errors-ts';
import mapCertError from '../certs/map-cert-error';
type APIVersion = 'v5' | 'v10';
export default async function getDeploymentByIdOrHost(
client: Client,
contextName: string,
idOrHost: string,
apiVersion: APIVersion = 'v5'
) {
try {
const { deployment } =
idOrHost.indexOf('.') !== -1
? await getDeploymentByHost(
client,
toHost(idOrHost) as string,
apiVersion
)
: await getDeploymentById(client, idOrHost, apiVersion);
return deployment;
} catch (err: unknown) {
if (isAPIError(err)) {
if (err.status === 404) {
return new DeploymentNotFound({ id: idOrHost, context: contextName });
}
if (err.status === 403) {
return new DeploymentPermissionDenied(idOrHost, contextName);
}
if (err.status === 400 && err.message.includes('`id`')) {
return new InvalidDeploymentId(idOrHost);
}
const certError = mapCertError(err);
if (certError) {
return certError;
}
}
throw err;
}
}
async function getDeploymentById(
client: Client,
id: string,
apiVersion: APIVersion
) {
const deployment = await client.fetch<Deployment>(
`/${apiVersion}/now/deployments/${encodeURIComponent(id)}`
);
return { deployment };
}
type Response = {
id: string;
};
async function getDeploymentByHost(
client: Client,
host: string,
apiVersion: APIVersion
) {
const response = await client.fetch<Response>(
`/v10/now/deployments/get?url=${encodeURIComponent(
host
)}&resolve=1&noState=1`
);
return getDeploymentById(client, response.id, apiVersion);
}


@@ -1,19 +0,0 @@
import { NowError } from '../now-error';
import Client from '../client';
import getDeploymentByIdOrHost from './get-deployment-by-id-or-host';
export default async function getDeploymentByIdOrThrow(
client: Client,
contextName: string,
idOrHost: string
) {
const deployment = await getDeploymentByIdOrHost(
client,
contextName,
idOrHost
);
if (deployment instanceof NowError) {
throw deployment;
}
return deployment;
}


@@ -6,7 +6,7 @@ type Response = {
   deployments: DeploymentPartial[];
 };
 export interface DeploymentPartial {
-  uid: string;
+  id: string;
   name: string;
   url: string;
   created: number;


@@ -7,7 +7,8 @@ import jsonlines from 'jsonlines';
 import { eraseLines } from 'ansi-escapes';
 import Client from './client';
-import { getDeployment } from './get-deployment';
+import getDeployment from './get-deployment';
+import getScope from './get-scope';
 export interface FindOpts {
   direction: 'forward' | 'backward';
@@ -37,6 +38,7 @@ async function printEvents(
   { mode, onEvent, quiet, findOpts }: PrintEventsOptions
 ) {
   const { log, debug } = client.output;
+  const { contextName } = await getScope(client);
   // we keep track of how much we log in case we
   // drop the connection and have to start over
@@ -74,7 +76,11 @@ async function printEvents(
   poller = (function startPoller() {
     return setTimeout(async () => {
       try {
-        const json = await getDeployment(client, deploymentIdOrURL);
+        const json = await getDeployment(
+          client,
+          contextName,
+          deploymentIdOrURL
+        );
         if (json.readyState === 'READY') {
           stream.end();
           finish();


@@ -1,27 +1,53 @@
-import { stringify } from 'querystring';
-import { Deployment } from '@vercel/client';
-import Client from './client';
+import type Client from './client';
+import {
+  DeploymentNotFound,
+  DeploymentPermissionDenied,
+  InvalidDeploymentId,
+  isAPIError,
+} from './errors-ts';
+import type { Deployment } from '../types';
+import mapCertError from './certs/map-cert-error';
+import toHost from './to-host';
-export async function getDeployment(
+/**
+ * Retrieves a v13 deployment.
+ *
+ * @param client - The Vercel CLI client instance.
+ * @param contextName - The scope context/team name.
+ * @param hostOrId - A deployment host or id.
+ * @returns The deployment information.
+ */
+export default async function getDeployment(
   client: Client,
+  contextName: string,
   hostOrId: string
 ): Promise<Deployment> {
-  let url = `/v13/deployments`;
   if (hostOrId.includes('.')) {
-    let host = hostOrId.replace(/^https:\/\//i, '');
-    if (host.slice(-1) === '/') {
-      host = host.slice(0, -1);
-    }
-    url += `/get?${stringify({
-      url: host,
-    })}`;
-  } else {
-    url += `/${encodeURIComponent(hostOrId)}`;
+    hostOrId = toHost(hostOrId);
   }
-  const deployment = await client.fetch<Deployment>(url);
-  return deployment;
+  try {
+    return await client.fetch<Deployment>(
+      `/v13/deployments/${encodeURIComponent(hostOrId)}`
+    );
+  } catch (err: unknown) {
+    if (isAPIError(err)) {
+      if (err.status === 404) {
+        throw new DeploymentNotFound({ id: hostOrId, context: contextName });
+      }
+      if (err.status === 403) {
+        throw new DeploymentPermissionDenied(hostOrId, contextName);
+      }
+      if (err.status === 400 && err.message.includes('`id`')) {
+        throw new InvalidDeploymentId(hostOrId);
+      }
+      const certError = mapCertError(err);
+      if (certError) {
+        throw certError;
+      }
+    }
+    throw err;
+  }
 }
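
For reference, a minimal sketch of how a command could consume the reworked helper, assuming an already-initialized `client` and a resolved `contextName` (both provided elsewhere in the CLI); the wrapper function and import paths here are illustrative, not part of the change:

```
import getDeployment from './util/get-deployment';
import { DeploymentNotFound } from './util/errors-ts';
import type Client from './util/client';

// Accepts either a deployment id or a host such as "my-app.vercel.app".
async function printDeploymentState(
  client: Client,
  contextName: string,
  idOrHost: string
): Promise<void> {
  try {
    const deployment = await getDeployment(client, contextName, idOrHost);
    console.log(`${deployment.url}: ${deployment.readyState}`);
  } catch (err: unknown) {
    // 404/403/400 responses are now surfaced as typed errors instead of
    // being returned as values, so callers handle them with try/catch.
    if (err instanceof DeploymentNotFound) {
      console.error(err.message);
      return;
    }
    throw err;
  }
}
```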


@@ -0,0 +1,225 @@
/**
* This file is spawned in the background and checks npm for the latest version
* of the CLI, then writes the version to the cache file.
*
* NOTE: Since this file runs asynchronously in the background, it's possible
* for multiple instances of this file to be running at the same time leading
* to a race condition where the most recent instance will overwrite the
 * previous cache file, resetting the `notified` flag and causing the update
 * notification to appear for multiple consecutive commands. Not the end of
* the world, but something to be aware of.
*
* IMPORTANT! This file must NOT depend on any 3rd party dependencies. This
* file is NOT bundled by `ncc` and thus any 3rd party dependencies will never
* be available.
*/
const https = require('https');
const { mkdirSync, writeFileSync } = require('fs');
const { access, mkdir, readFile, unlink, writeFile } = require('fs/promises');
const path = require('path');
const { format, inspect } = require('util');
/**
 * A simple output helper which accumulates error and debug log messages in
 * memory for potential persistence to disk while immediately outputting errors
* and debug messages, when the `--debug` flag is set, to `stderr`.
*/
class WorkerOutput {
debugLog = [];
logFile = null;
constructor({ debug = true }) {
this.debugOutputEnabled = debug;
}
debug(...args) {
this.print('debug', args);
}
error(...args) {
this.print('error', args);
}
print(type, args) {
// note: `args` may contain an `Error` that will be toString()'d and thus
// no stack trace
const str = format(
...args.map(s => (typeof s === 'string' ? s : inspect(s)))
);
this.debugLog.push(`[${new Date().toISOString()}] [${type}] ${str}`);
if (type === 'debug' && this.debugOutputEnabled) {
console.error(`> '[debug] [${new Date().toISOString()}] ${str}`);
} else if (type === 'error') {
console.error(`Error: ${str}`);
}
}
setLogFile(file) {
// wire up the exit handler the first time the log file is set
if (this.logFile === null) {
process.on('exit', () => {
if (this.debugLog.length) {
mkdirSync(path.dirname(this.logFile), { recursive: true });
writeFileSync(this.logFile, this.debugLog.join('\n'));
}
});
}
this.logFile = file;
}
}
const output = new WorkerOutput({
// enable the debug logging if the `--debug` is set or if this worker script
// was directly executed
debug: process.argv.includes('--debug') || !process.connected,
});
process.on('unhandledRejection', err => {
output.error('Exiting worker due to unhandled rejection:', err);
process.exit(1);
});
// this timer will prevent this worker process from running longer than 10s
const timer = setTimeout(() => {
output.error('Worker timed out after 10 seconds');
process.exit(1);
}, 10000);
// wait for the parent to give us the work payload
process.once('message', async msg => {
output.debug('Received message from parent:', msg);
output.debug('Disconnecting from parent');
process.disconnect();
const { cacheFile, distTag, name, updateCheckInterval } = msg;
const cacheFileParsed = path.parse(cacheFile);
await mkdir(cacheFileParsed.dir, { recursive: true });
output.setLogFile(
path.join(cacheFileParsed.dir, `${cacheFileParsed.name}.log`)
);
const lockFile = path.join(
cacheFileParsed.dir,
`${cacheFileParsed.name}.lock`
);
try {
// check for a lock file and either bail if running or write our pid and continue
output.debug(`Checking lock file: ${lockFile}`);
if (await isRunning(lockFile)) {
output.debug('Worker already running, exiting');
process.exit(1);
}
output.debug(`Initializing lock file with pid ${process.pid}`);
await writeFile(lockFile, String(process.pid), 'utf-8');
// fetch the latest version from npm
const agent = new https.Agent({
keepAlive: true,
maxSockets: 15, // See: `npm config get maxsockets`
});
const headers = {
accept:
'application/vnd.npm.install-v1+json; q=1.0, application/json; q=0.8, */*',
};
const url = `https://registry.npmjs.org/-/package/${name}/dist-tags`;
output.debug(`Fetching ${url}`);
const tags = await new Promise((resolve, reject) => {
const req = https.get(
url,
{
agent,
headers,
},
res => {
let buf = '';
res.on('data', chunk => {
buf += chunk;
});
res.on('end', () => {
try {
resolve(JSON.parse(buf));
} catch (err) {
reject(err);
}
});
}
);
req.on('error', reject);
req.end();
});
const version = tags[distTag];
if (version) {
output.debug(`Found dist tag "${distTag}" with version "${version}"`);
} else {
output.error(`Dist tag "${distTag}" not found`);
output.debug('Available dist tags:', Object.keys(tags));
}
output.debug(`Writing cache file: ${cacheFile}`);
await writeFile(
cacheFile,
JSON.stringify({
expireAt: Date.now() + updateCheckInterval,
notified: false,
version,
})
);
} catch (err) {
output.error(`Failed to get package info:`, err);
} finally {
clearTimeout(timer);
if (await fileExists(lockFile)) {
output.debug(`Releasing lock file: ${lockFile}`);
await unlink(lockFile);
}
output.debug(`Worker finished successfully!`);
// force the worker to exit
process.exit(0);
}
});
// signal the parent process we're ready
if (process.connected) {
output.debug("Notifying parent we're ready");
process.send({ type: 'ready' });
} else {
console.error('No IPC bridge detected, exiting');
process.exit(1);
}
async function fileExists(file) {
return access(file)
.then(() => true)
.catch(() => false);
}
async function isRunning(lockFile) {
try {
const pid = parseInt(await readFile(lockFile, 'utf-8'));
output.debug(`Found lock file with pid: ${pid}`);
// checks for existence of a process; throws if not found
process.kill(pid, 0);
// process is still running
return true;
} catch (err) {
if (await fileExists(lockFile)) {
// lock file does not exist or process is not running and pid is stale
output.debug(`Resetting lock file: ${err.toString()}`);
await unlink(lockFile);
}
return false;
}
}
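
For context, the dist-tags endpoint returns a plain map of tag names to versions, and the worker persists a small JSON cache next to its `.log` and `.lock` files. A sketch with illustrative values (the version numbers and the Linux-style cache path are examples, not part of the change):

```
// GET https://registry.npmjs.org/-/package/vercel/dist-tags
// -> { "latest": "28.14.1", "canary": "28.15.0-canary.0" }

// ~/.cache/com.vercel.cli/package-updates/vercel-latest.json
{
  "expireAt": 1675751994000, // Date.now() + updateCheckInterval
  "notified": false,
  "version": "28.14.1"
}
```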


@@ -0,0 +1,151 @@
import semver from 'semver';
import XDGAppPaths from 'xdg-app-paths';
import { dirname, parse as parsePath, resolve as resolvePath } from 'path';
import type { Output } from '../output';
import { existsSync, outputJSONSync, readJSONSync } from 'fs-extra';
import type { PackageJson } from '@vercel/build-utils';
import { spawn } from 'child_process';
interface GetLatestVersionOptions {
cacheDir?: string;
distTag?: string;
output?: Output;
pkg: PackageJson;
updateCheckInterval?: number;
}
interface PackageInfoCache {
version: string;
expireAt: number;
notified: boolean;
}
interface GetLatestWorkerPayload {
cacheFile?: string;
distTag?: string;
updateCheckInterval?: number;
name?: string;
}
/**
* Determines if it needs to check for a newer CLI version and returns the last
* detected version. The version could be stale, but still newer than the
* current version.
*
 * @returns {String|undefined} If a newer version is found, then the latest
* version, otherwise `undefined`.
*/
export default function getLatestVersion({
cacheDir = XDGAppPaths('com.vercel.cli').cache(),
distTag = 'latest',
output,
pkg,
updateCheckInterval = 1000 * 60 * 60 * 24 * 7, // 1 week
}: GetLatestVersionOptions): string | undefined {
if (
!pkg ||
typeof pkg !== 'object' ||
!pkg.name ||
typeof pkg.name !== 'string'
) {
throw new TypeError('Expected package to be an object with a package name');
}
const cacheFile = resolvePath(
cacheDir,
'package-updates',
`${pkg.name}-${distTag}.json`
);
let cache: PackageInfoCache | undefined;
try {
cache = readJSONSync(cacheFile);
} catch (err: any) {
// cache does not exist or malformed
if (err.code !== 'ENOENT') {
output?.debug(`Error reading latest package cache file: ${err}`);
}
}
if (!cache || cache.expireAt < Date.now()) {
spawnWorker(
{
cacheFile,
distTag,
updateCheckInterval,
name: pkg.name,
},
output
);
}
if (
cache &&
!cache.notified &&
pkg.version &&
semver.lt(pkg.version, cache.version)
) {
cache.notified = true;
outputJSONSync(cacheFile, cache);
return cache.version;
}
}
/**
* Spawn the worker, wait for the worker to report it's ready, then signal the
* worker to fetch the latest version.
*/
function spawnWorker(
payload: GetLatestWorkerPayload,
output: Output | undefined
) {
// we need to find the update worker script since the location is
// different based on production vs tests
let dir = dirname(__filename);
let script = resolvePath(dir, 'dist', 'get-latest-worker.js');
const { root } = parsePath(dir);
while (!existsSync(script)) {
dir = dirname(dir);
if (dir === root) {
// didn't find it, bail
output?.debug('Failed to find the get latest worker script!');
return;
}
script = resolvePath(dir, 'dist', 'get-latest-worker.js');
}
// spawn the worker with an IPC channel
output?.debug(`Spawning ${script}`);
const args = [script];
if (output?.debugEnabled) {
args.push('--debug');
}
const worker = spawn(process.execPath, args, {
stdio: ['inherit', 'inherit', 'inherit', 'ipc'],
windowsHide: true,
});
// we allow the child 2 seconds to let us know it's ready before we give up
const workerReadyTimer = setTimeout(() => worker.kill(), 2000);
// listen for an early on close error, but then we remove it when unref
const onClose = (code: number) => {
output?.debug(`Get latest worker exited (code ${code})`);
};
worker.on('close', onClose);
// generally, the parent won't be around long enough to handle a non-zero
// worker process exit code
worker.on('error', err => {
output?.log(`Failed to spawn get latest worker: ${err.stack}`);
});
// wait for the worker to start and notify us it is ready
worker.once('message', () => {
clearTimeout(workerReadyTimer);
worker.removeListener('close', onClose);
worker.send(payload);
worker.unref();
});
}
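
A minimal sketch of how the CLI entrypoint might consume this helper on startup; the import paths, the source of `pkg`, and the message wording are illustrative assumptions:

```
import getLatestVersion from './util/get-latest-version';
import pkg from './util/pkg'; // assumed: the CLI's parsed package.json

// A version string is returned at most once per cached release, because
// `notified` is flipped to true as soon as the value is handed out.
const latest = getLatestVersion({ pkg });
if (latest) {
  console.log(`Update available: ${pkg.version} -> ${latest}`);
}
```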


@@ -1,5 +1,4 @@
 import Client from '../client';
-import { stringify } from 'qs';
 import { Org } from '../../types';
 import chalk from 'chalk';
 import link from '../output/link';
@@ -19,9 +18,7 @@ export async function disconnectGitProvider(
   org: Org,
   projectId: string
 ) {
-  const fetchUrl = `/v9/projects/${projectId}/link?${stringify({
-    teamId: org.type === 'team' ? org.id : undefined,
-  })}`;
+  const fetchUrl = `/v9/projects/${projectId}/link`;
   return client.fetch(fetchUrl, {
     method: 'DELETE',
     headers: {
@@ -37,9 +34,7 @@ export async function connectGitProvider(
   type: string,
   repo: string
 ) {
-  const fetchUrl = `/v9/projects/${projectId}/link?${stringify({
-    teamId: org.type === 'team' ? org.id : undefined,
-  })}`;
+  const fetchUrl = `/v9/projects/${projectId}/link`;
   try {
     return await client.fetch(fetchUrl, {
       method: 'POST',


@@ -1,38 +0,0 @@
import type Client from '../client';
import type { Deployment } from '../../types';
import getDeploymentByIdOrHost from '../deploy/get-deployment-by-id-or-host';
import handleCertError from '../certs/handle-cert-error';
/**
* Attempts to find the deployment by name or id.
* @param {Client} client - The Vercel client instance
* @param {string} contextName - The scope name
* @param {string} deployId - The deployment name or id to rollback
* @returns {Promise<Deployment>} Resolves an exit code or deployment info
*/
export default async function getDeploymentInfo(
client: Client,
contextName: string,
deployId: string
): Promise<Deployment> {
const deployment = handleCertError(
client.output,
await getDeploymentByIdOrHost(client, contextName, deployId)
);
if (deployment === 1) {
throw new Error(
`Failed to get deployment "${deployId}" in scope "${contextName}"`
);
}
if (deployment instanceof Error) {
throw deployment;
}
if (!deployment) {
throw new Error(`Couldn't find the deployment "${deployId}"`);
}
return deployment;
}


@@ -1,11 +1,12 @@
 import chalk from 'chalk';
 import type Client from '../client';
+import type { Deployment, Project, Team } from '../../types';
 import { getCommandName } from '../pkg-name';
-import getDeploymentInfo from './get-deployment-info';
+import getDeployment from '../get-deployment';
 import getScope from '../get-scope';
+import getTeamById from '../teams/get-team-by-id';
 import { isValidName } from '../is-valid-name';
 import ms from 'ms';
-import type { Project } from '../../types';
 import rollbackStatus from './status';
 /**
@@ -27,7 +28,7 @@ export default async function requestRollback({
   project: Project;
   timeout?: string;
 }): Promise<number> {
-  const { output } = client;
+  const { config, output } = client;
   const { contextName } = await getScope(client);
   if (!isValidName(deployId)) {
@@ -37,27 +38,65 @@
     return 1;
   }
-  output.spinner(
-    `Fetching deployment "${deployId}" in ${chalk.bold(contextName)}`
-  );
-  let deployment;
+  let deployment: Deployment;
+  let team: Team | undefined;
   try {
-    deployment = await getDeploymentInfo(client, contextName, deployId);
-  } catch (err: any) {
-    output.error(err?.toString() || err);
-    return 1;
-  } finally {
-    output.stopSpinner();
+    output.spinner(
+      `Fetching deployment "${deployId}" in ${chalk.bold(contextName)}`
+    );
+    const [teamResult, deploymentResult] = await Promise.allSettled([
+      config.currentTeam ? getTeamById(client, config.currentTeam) : undefined,
+      getDeployment(client, contextName, deployId),
+    ]);
+    if (teamResult.status === 'rejected') {
+      output.error(`Failed to retrieve team information: ${teamResult.reason}`);
+      return 1;
+    }
+    if (deploymentResult.status === 'rejected') {
+      output.error(deploymentResult.reason);
+      return 1;
+    }
+    team = teamResult.value;
+    deployment = deploymentResult.value;
     // re-render the spinner text because it goes so fast
     output.log(
       `Fetching deployment "${deployId}" in ${chalk.bold(contextName)}`
     );
+  } finally {
+    output.stopSpinner();
+  }
+  if (deployment.team?.id) {
+    if (!team || deployment.team.id !== team.id) {
+      output.error(
+        team
+          ? `Deployment doesn't belong to current team ${chalk.bold(
+              contextName
+            )}`
+          : `Deployment belongs to a different team`
+      );
+      output.error(
+        `Use ${chalk.bold('vc switch')} to change your current team`
+      );
+      return 1;
+    }
+  } else if (team) {
+    output.error(
+      `Deployment doesn't belong to current team ${chalk.bold(contextName)}`
+    );
+    output.error(`Use ${chalk.bold('vc switch')} to change your current team`);
+    return 1;
   }
   // create the rollback
   await client.fetch<any>(
-    `/v9/projects/${project.id}/rollback/${deployment.uid}`,
+    `/v9/projects/${project.id}/rollback/${deployment.id}`,
     {
       body: {}, // required
       method: 'POST',
@@ -68,7 +107,7 @@ export default async function requestRollback({
   output.log(
     `Successfully requested rollback of ${chalk.bold(project.name)} to ${
       deployment.url
-    } (${deployment.uid})`
+    } (${deployment.id})`
   );
   output.log(`To check rollback status, run ${getCommandName('rollback')}.`);
   return 0;


@@ -8,7 +8,7 @@ import type {
 } from '../../types';
 import elapsed from '../output/elapsed';
 import formatDate from '../format-date';
-import getDeploymentInfo from './get-deployment-info';
+import getDeployment from '../get-deployment';
 import getScope from '../get-scope';
 import ms from 'ms';
 import renderAliasStatus from './render-alias-status';
@@ -168,8 +168,7 @@ async function renderJobFailed({
   try {
     const name = (
-      deployment ||
-      (await getDeploymentInfo(client, contextName, toDeploymentId))
+      deployment || (await getDeployment(client, contextName, toDeploymentId))
     )?.url;
     output.error(
       `Failed to remap all aliases to the requested deployment ${name} (${toDeploymentId})`
@@ -228,13 +227,10 @@ async function renderJobSucceeded({
 }) {
   const { output } = client;
-  // attempt to get the new deployment url
   let deploymentInfo = '';
   try {
-    const deployment = await getDeploymentInfo(
-      client,
-      contextName,
-      toDeploymentId
-    );
+    const deployment = await getDeployment(client, contextName, toDeploymentId);
     deploymentInfo = `${chalk.bold(deployment.url)} (${toDeploymentId})`;
   } catch (err: any) {
     output.debug(


@@ -1,20 +1,8 @@
-// Native
-import { parse } from 'url';
 /**
  * Converts a valid deployment lookup parameter to a hostname.
  * `http://google.com` => google.com
  * google.com => google.com
  */
-function toHost(url: string): string {
-  if (/^https?:\/\//.test(url)) {
-    return parse(url).host!;
-  }
-  // Remove any path if present
-  // `a.b.c/` => `a.b.c`
-  return url.replace(/(\/\/)?([^/]+)(.*)/, '$2');
+export default function toHost(url: string): string {
+  return url.replace(/^(?:.*?\/\/)?([^/]+).*/, '$1');
 }
-export default toHost;
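
The single-regex replacement behaves the same for the documented cases (`http://google.com` => `google.com`, `google.com` => `google.com`); a few more illustrative calls, with example hostnames:

```
toHost('https://my-app.vercel.app');     // 'my-app.vercel.app'
toHost('my-app.vercel.app/some/path');   // 'my-app.vercel.app'
toHost('my-app.vercel.app');             // 'my-app.vercel.app'
```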


@@ -1,2 +0,0 @@
README.md
yarn.lock


@@ -1,16 +1,5 @@
-import { join } from 'path';
-import ms from 'ms';
-import fs, { mkdirp } from 'fs-extra';
-const {
-  exec,
-  fetch,
-  fixture,
-  sleep,
-  testFixture,
-  testFixtureStdio,
-  validateResponseHeaders,
-} = require('./utils.js');
+import { isIP } from 'net';
+const { exec, fixture, testFixture, testFixtureStdio } = require('./utils.js');
 test('[vercel dev] validate redirects', async () => {
   const directory = fixture('invalid-redirects');
@@ -124,260 +113,112 @@ test(
); );
test( test(
'[vercel dev] test cleanUrls serve correct content', '[vercel dev] Use `@vercel/python` with Flask requirements.txt',
testFixtureStdio('test-clean-urls', async (testPath: any) => { testFixtureStdio('python-flask', async (testPath: any) => {
await testPath(200, '/', 'Index Page'); const name = 'Alice';
await testPath(200, '/about', 'About Page'); const year = new Date().getFullYear();
await testPath(200, '/sub', 'Sub Index Page'); await testPath(200, `/api/user?name=${name}`, new RegExp(`Hello ${name}`));
await testPath(200, '/sub/another', 'Sub Another Page'); await testPath(200, `/api/date`, new RegExp(`Current date is ${year}`));
await testPath(200, '/style.css', 'body { color: green }'); await testPath(200, `/api/date.py`, new RegExp(`Current date is ${year}`));
await testPath(308, '/index.html', 'Redirecting to / (308)', { await testPath(200, `/api/headers`, (body: any, res: any) => {
Location: '/', // @ts-ignore
}); const { host } = new URL(res.url);
await testPath(308, '/about.html', 'Redirecting to /about (308)', { expect(body).toBe(host);
Location: '/about',
});
await testPath(308, '/sub/index.html', 'Redirecting to /sub (308)', {
Location: '/sub',
}); });
})
);
test(
'[vercel dev] Use custom runtime from the "functions" property',
testFixtureStdio('custom-runtime', async (testPath: any) => {
await testPath(200, `/api/user`, /Hello, from Bash!/m);
await testPath(200, `/api/user.sh`, /Hello, from Bash!/m);
})
);
test(
'[vercel dev] Should work with nested `tsconfig.json` files',
testFixtureStdio('nested-tsconfig', async (testPath: any) => {
await testPath(200, `/`, /Nested tsconfig.json test page/);
await testPath(200, `/api`, 'Nested `tsconfig.json` API endpoint');
})
);
test(
'[vercel dev] Should force `tsc` option "module: commonjs" for `startDevServer()`',
testFixtureStdio('force-module-commonjs', async (testPath: any) => {
await testPath(200, `/`, /Force &quot;module: commonjs&quot; test page/);
await testPath( await testPath(
308, 200,
'/sub/another.html', `/api`,
'Redirecting to /sub/another (308)', 'Force "module: commonjs" JavaScript with ES Modules API endpoint'
{ Location: '/sub/another' } );
await testPath(
200,
`/api/ts`,
'Force "module: commonjs" TypeScript API endpoint'
); );
}) })
); );
test( test(
'[vercel dev] test cleanUrls serve correct content when using `outputDirectory`', '[vercel dev] should prioritize index.html over other file named index.*',
testFixtureStdio( testFixtureStdio('index-html-priority', async (testPath: any) => {
'test-clean-urls-with-output-directory', await testPath(200, '/', 'This is index.html');
async (testPath: any) => { await testPath(200, '/index.css', 'This is index.css');
await testPath(200, '/', 'Index Page');
await testPath(200, '/about', 'About Page');
await testPath(200, '/sub', 'Sub Index Page');
await testPath(200, '/sub/another', 'Sub Another Page');
await testPath(200, '/style.css', 'body { color: green }');
await testPath(308, '/index.html', 'Redirecting to / (308)', {
Location: '/',
});
await testPath(308, '/about.html', 'Redirecting to /about (308)', {
Location: '/about',
});
await testPath(308, '/sub/index.html', 'Redirecting to /sub (308)', {
Location: '/sub',
});
await testPath(
308,
'/sub/another.html',
'Redirecting to /sub/another (308)',
{ Location: '/sub/another' }
);
}
)
);
test(
'[vercel dev] should serve custom 404 when `cleanUrls: true`',
testFixtureStdio('test-clean-urls-custom-404', async (testPath: any) => {
await testPath(200, '/', 'This is the home page');
await testPath(200, '/about', 'The about page');
await testPath(200, '/contact/me', 'Contact Me Subdirectory');
await testPath(404, '/nothing', 'Custom 404 Page');
await testPath(404, '/nothing/', 'Custom 404 Page');
}) })
); );
test( test(
'[vercel dev] test cleanUrls and trailingSlash serve correct content', '[vercel dev] Should support `*.go` API serverless functions',
testFixtureStdio('test-clean-urls-trailing-slash', async (testPath: any) => { testFixtureStdio('go', async (testPath: any) => {
await testPath(200, '/', 'Index Page'); await testPath(200, `/api`, 'This is the index page');
await testPath(200, '/about/', 'About Page'); await testPath(200, `/api/index`, 'This is the index page');
await testPath(200, '/sub/', 'Sub Index Page'); await testPath(200, `/api/index.go`, 'This is the index page');
await testPath(200, '/sub/another/', 'Sub Another Page'); await testPath(200, `/api/another`, 'This is another page');
await testPath(200, '/style.css', 'body { color: green }'); await testPath(200, '/api/another.go', 'This is another page');
//TODO: fix this test so that location is `/` instead of `//` await testPath(200, `/api/foo`, 'Req Path: /api/foo');
//await testPath(308, '/index.html', 'Redirecting to / (308)', { Location: '/' }); await testPath(200, `/api/bar`, 'Req Path: /api/bar');
await testPath(308, '/about.html', 'Redirecting to /about/ (308)', { })
Location: '/about/', );
});
await testPath(308, '/sub/index.html', 'Redirecting to /sub/ (308)', { test(
Location: '/sub/', '[vercel dev] Should set the `ts-node` "target" to match Node.js version',
}); testFixtureStdio('node-ts-node-target', async (testPath: any) => {
await testPath(200, `/api/subclass`, '{"ok":true}');
await testPath( await testPath(
308, 200,
'/sub/another.html', `/api/array`,
'Redirecting to /sub/another/ (308)', '{"months":[1,2,3,4,5,6,7,8,9,10,11,12]}'
{
Location: '/sub/another/',
}
); );
})
);
test( await testPath(200, `/api/dump`, (body: any, res: any, isDev: any) => {
'[vercel dev] test cors headers work with OPTIONS', // @ts-ignore
testFixtureStdio('test-cors-routes', async (testPath: any) => { const { host } = new URL(res.url);
const headers = { const { env, headers } = JSON.parse(body);
'Access-Control-Allow-Origin': '*',
'Access-Control-Allow-Headers':
'Content-Type, Authorization, Accept, Content-Length, Origin, User-Agent',
'Access-Control-Allow-Methods':
'GET, POST, OPTIONS, HEAD, PATCH, PUT, DELETE',
};
await testPath(200, '/', 'status api', headers, { method: 'GET' });
await testPath(200, '/', 'status api', headers, { method: 'POST' });
await testPath(200, '/api/status.js', 'status api', headers, {
method: 'GET',
});
await testPath(200, '/api/status.js', 'status api', headers, {
method: 'POST',
});
await testPath(204, '/', '', headers, { method: 'OPTIONS' });
await testPath(204, '/api/status.js', '', headers, { method: 'OPTIONS' });
})
);
test( // Test that the API endpoint receives the Vercel proxy request headers
'[vercel dev] test trailingSlash true serve correct content', expect(headers['x-forwarded-host']).toBe(host);
testFixtureStdio('test-trailing-slash', async (testPath: any) => { expect(headers['x-vercel-deployment-url']).toBe(host);
await testPath(200, '/', 'Index Page'); expect(isIP(headers['x-real-ip'])).toBeTruthy();
await testPath(200, '/index.html', 'Index Page'); expect(isIP(headers['x-forwarded-for'])).toBeTruthy();
await testPath(200, '/about.html', 'About Page'); expect(isIP(headers['x-vercel-forwarded-for'])).toBeTruthy();
await testPath(200, '/sub/', 'Sub Index Page');
await testPath(200, '/sub/index.html', 'Sub Index Page');
await testPath(200, '/sub/another.html', 'Sub Another Page');
await testPath(200, '/style.css', 'body { color: green }');
await testPath(308, '/about.html/', 'Redirecting to /about.html (308)', {
Location: '/about.html',
});
await testPath(308, '/style.css/', 'Redirecting to /style.css (308)', {
Location: '/style.css',
});
await testPath(308, '/sub', 'Redirecting to /sub/ (308)', {
Location: '/sub/',
});
})
);
test( // Test that the API endpoint has the Vercel platform env vars defined.
'[vercel dev] should serve custom 404 when `trailingSlash: true`', expect(env.NOW_REGION).toMatch(/^[a-z]{3}\d$/);
testFixtureStdio('test-trailing-slash-custom-404', async (testPath: any) => { if (isDev) {
await testPath(200, '/', 'This is the home page'); // Only dev is tested because in production these are opt-in.
await testPath(200, '/about.html', 'The about page'); expect(env.VERCEL_URL).toBe(host);
await testPath(200, '/contact/', 'Contact Subdirectory'); expect(env.VERCEL_REGION).toBe('dev1');
await testPath(404, '/nothing/', 'Custom 404 Page');
})
);
test(
'[vercel dev] test trailingSlash false serve correct content',
testFixtureStdio('test-trailing-slash-false', async (testPath: any) => {
await testPath(200, '/', 'Index Page');
await testPath(200, '/index.html', 'Index Page');
await testPath(200, '/about.html', 'About Page');
await testPath(200, '/sub', 'Sub Index Page');
await testPath(200, '/sub/index.html', 'Sub Index Page');
await testPath(200, '/sub/another.html', 'Sub Another Page');
await testPath(200, '/style.css', 'body { color: green }');
await testPath(308, '/about.html/', 'Redirecting to /about.html (308)', {
Location: '/about.html',
});
await testPath(308, '/sub/', 'Redirecting to /sub (308)', {
Location: '/sub',
});
await testPath(
308,
'/sub/another.html/',
'Redirecting to /sub/another.html (308)',
{
Location: '/sub/another.html',
} }
); });
}) })
); );
test( test(
'[vercel dev] throw when invalid builder routes detected', '[vercel dev] Do not fail if `src` is missing',
testFixtureStdio( testFixtureStdio('missing-src-property', async (testPath: any) => {
'invalid-builder-routes', await testPath(200, '/', /hello:index.txt/m);
async (testPath: any) => { await testPath(404, '/i-do-not-exist');
await testPath(
500,
'/',
/Route at index 0 has invalid `src` regular expression/m
);
},
{ skipDeploy: true }
)
);
test(
'[vercel dev] support legacy `@now` scope runtimes',
testFixtureStdio('legacy-now-runtime', async (testPath: any) => {
await testPath(200, '/', /A simple deployment with the Vercel API!/m);
})
);
test(
'[vercel dev] 00-list-directory',
testFixtureStdio(
'00-list-directory',
async (testPath: any) => {
await testPath(200, '/', /Files within/m);
await testPath(200, '/', /test[0-3]\.txt/m);
await testPath(200, '/', /\.well-known/m);
await testPath(200, '/.well-known/keybase.txt', 'proof goes here');
},
{ projectSettings: { directoryListing: true } }
)
);
test(
'[vercel dev] 01-node',
testFixtureStdio('01-node', async (testPath: any) => {
await testPath(200, '/', /A simple deployment with the Vercel API!/m);
})
);
test(
'[vercel dev] add a `api/fn.ts` when `api` does not exist at startup`',
testFixtureStdio('no-api', async (_testPath: any, port: any) => {
const directory = fixture('no-api');
const apiDir = join(directory, 'api');
try {
{
const response = await fetch(`http://localhost:${port}/api/new-file`);
validateResponseHeaders(response);
expect(response.status).toBe(404);
}
const fileContents = `
export const config = {
runtime: 'edge'
}
export default async function edge(request, event) {
return new Response('from new file');
}
`;
await mkdirp(apiDir);
await fs.writeFile(join(apiDir, 'new-file.js'), fileContents);
// Wait until file events have been processed
await sleep(ms('1s'));
{
const response = await fetch(`http://localhost:${port}/api/new-file`);
validateResponseHeaders(response);
const body = await response.text();
expect(body.trim()).toBe('from new file');
}
} finally {
await fs.remove(apiDir);
}
}) })
); );


@@ -1,196 +1,15 @@
-import ms from 'ms';
 import fs from 'fs-extra';
-import { isIP } from 'net';
 import { join } from 'path';
 import { Response } from 'node-fetch';
 const {
   fetch,
-  sleep,
   fixture,
   testFixture,
   testFixtureStdio,
   validateResponseHeaders,
 } = require('./utils.js');
test(
'[vercel dev] temporary directory listing',
testFixtureStdio(
'temporary-directory-listing',
async (_testPath: any, port: any) => {
const directory = fixture('temporary-directory-listing');
await fs.unlink(join(directory, 'index.txt')).catch(() => null);
await sleep(ms('20s'));
const firstResponse = await fetch(`http://localhost:${port}`);
validateResponseHeaders(firstResponse);
const body = await firstResponse.text();
console.log(body);
expect(firstResponse.status).toBe(404);
await fs.writeFile(join(directory, 'index.txt'), 'hello');
for (let i = 0; i < 20; i++) {
const response = await fetch(`http://localhost:${port}`);
validateResponseHeaders(response);
if (response.status === 200) {
const body = await response.text();
expect(body).toBe('hello');
}
await sleep(ms('1s'));
}
},
{ skipDeploy: true }
)
);
test('[vercel dev] add a `package.json` to trigger `@vercel/static-build`', async () => {
const directory = fixture('trigger-static-build');
await fs.unlink(join(directory, 'package.json')).catch(() => null);
await fs.unlink(join(directory, 'public', 'index.txt')).catch(() => null);
await fs.rmdir(join(directory, 'public')).catch(() => null);
const tester = testFixtureStdio(
'trigger-static-build',
async (_testPath: any, port: any) => {
{
const response = await fetch(`http://localhost:${port}`);
validateResponseHeaders(response);
const body = await response.text();
expect(body.trim()).toBe('hello:index.txt');
}
const rnd = Math.random().toString();
const pkg = {
private: true,
scripts: { build: `mkdir -p public && echo ${rnd} > public/index.txt` },
};
await fs.writeFile(join(directory, 'package.json'), JSON.stringify(pkg));
// Wait until file events have been processed
await sleep(ms('2s'));
{
const response = await fetch(`http://localhost:${port}`);
validateResponseHeaders(response);
const body = await response.text();
expect(body.trim()).toBe(rnd);
}
},
{ skipDeploy: true }
);
await tester();
});
test('[vercel dev] no build matches warning', async () => {
const directory = fixture('no-build-matches');
const { dev } = await testFixture(directory, {
stdio: ['ignore', 'pipe', 'pipe'],
});
try {
// start `vercel dev` detached in child_process
dev.unref();
dev.stderr.setEncoding('utf8');
await new Promise<void>(resolve => {
dev.stderr.on('data', (str: string) => {
if (str.includes('did not match any source files')) {
resolve();
}
});
});
} finally {
await dev.kill();
}
});
test(
'[vercel dev] do not recursivly check the path',
testFixtureStdio('handle-filesystem-missing', async (testPath: any) => {
await testPath(200, '/', /hello/m);
await testPath(404, '/favicon.txt');
})
);
test('[vercel dev] render warning for empty cwd dir', async () => {
const directory = fixture('empty');
const { dev, port } = await testFixture(directory, {
stdio: ['ignore', 'pipe', 'pipe'],
});
try {
dev.unref();
// Monitor `stderr` for the warning
dev.stderr.setEncoding('utf8');
const msg = 'There are no files inside your deployment.';
await new Promise<void>(resolve => {
dev.stderr.on('data', (str: string) => {
if (str.includes(msg)) {
resolve();
}
});
});
// Issue a request to ensure a 404 response
await sleep(ms('3s'));
const response = await fetch(`http://localhost:${port}`);
validateResponseHeaders(response);
expect(response.status).toBe(404);
} finally {
await dev.kill();
}
});
test('[vercel dev] do not rebuild for changes in the output directory', async () => {
const directory = fixture('output-is-source');
const { dev, port } = await testFixture(directory, {
stdio: ['ignore', 'pipe', 'pipe'],
});
try {
dev.unref();
let stderr: any = [];
const start = Date.now();
dev.stderr.on('data', (str: any) => stderr.push(str));
while (stderr.join('').includes('Ready') === false) {
await sleep(ms('3s'));
if (Date.now() - start > ms('30s')) {
console.log('stderr:', stderr.join(''));
break;
}
}
const resp1 = await fetch(`http://localhost:${port}`);
const text1 = await resp1.text();
expect(text1.trim()).toBe('hello first');
await fs.writeFile(join(directory, 'public', 'index.html'), 'hello second');
await sleep(ms('3s'));
const resp2 = await fetch(`http://localhost:${port}`);
const text2 = await resp2.text();
expect(text2.trim()).toBe('hello second');
} finally {
await dev.kill();
}
});
test( test(
'[vercel dev] 25-nextjs-src-dir', '[vercel dev] 25-nextjs-src-dir',
testFixtureStdio('25-nextjs-src-dir', async (testPath: any) => { testFixtureStdio('25-nextjs-src-dir', async (testPath: any) => {
@@ -324,117 +143,6 @@ test(
}) })
); );
test(
'[vercel dev] Use `@vercel/python` with Flask requirements.txt',
testFixtureStdio('python-flask', async (testPath: any) => {
const name = 'Alice';
const year = new Date().getFullYear();
await testPath(200, `/api/user?name=${name}`, new RegExp(`Hello ${name}`));
await testPath(200, `/api/date`, new RegExp(`Current date is ${year}`));
await testPath(200, `/api/date.py`, new RegExp(`Current date is ${year}`));
await testPath(200, `/api/headers`, (body: any, res: any) => {
// @ts-ignore
const { host } = new URL(res.url);
expect(body).toBe(host);
});
})
);
test(
'[vercel dev] Use custom runtime from the "functions" property',
testFixtureStdio('custom-runtime', async (testPath: any) => {
await testPath(200, `/api/user`, /Hello, from Bash!/m);
await testPath(200, `/api/user.sh`, /Hello, from Bash!/m);
})
);
test(
'[vercel dev] Should work with nested `tsconfig.json` files',
testFixtureStdio('nested-tsconfig', async (testPath: any) => {
await testPath(200, `/`, /Nested tsconfig.json test page/);
await testPath(200, `/api`, 'Nested `tsconfig.json` API endpoint');
})
);
test(
'[vercel dev] Should force `tsc` option "module: commonjs" for `startDevServer()`',
testFixtureStdio('force-module-commonjs', async (testPath: any) => {
await testPath(200, `/`, /Force &quot;module: commonjs&quot; test page/);
await testPath(
200,
`/api`,
'Force "module: commonjs" JavaScript with ES Modules API endpoint'
);
await testPath(
200,
`/api/ts`,
'Force "module: commonjs" TypeScript API endpoint'
);
})
);
test(
'[vercel dev] should prioritize index.html over other file named index.*',
testFixtureStdio('index-html-priority', async (testPath: any) => {
await testPath(200, '/', 'This is index.html');
await testPath(200, '/index.css', 'This is index.css');
})
);
test(
'[vercel dev] Should support `*.go` API serverless functions',
testFixtureStdio('go', async (testPath: any) => {
await testPath(200, `/api`, 'This is the index page');
await testPath(200, `/api/index`, 'This is the index page');
await testPath(200, `/api/index.go`, 'This is the index page');
await testPath(200, `/api/another`, 'This is another page');
await testPath(200, '/api/another.go', 'This is another page');
await testPath(200, `/api/foo`, 'Req Path: /api/foo');
await testPath(200, `/api/bar`, 'Req Path: /api/bar');
})
);
test(
'[vercel dev] Should set the `ts-node` "target" to match Node.js version',
testFixtureStdio('node-ts-node-target', async (testPath: any) => {
await testPath(200, `/api/subclass`, '{"ok":true}');
await testPath(
200,
`/api/array`,
'{"months":[1,2,3,4,5,6,7,8,9,10,11,12]}'
);
await testPath(200, `/api/dump`, (body: any, res: any, isDev: any) => {
// @ts-ignore
const { host } = new URL(res.url);
const { env, headers } = JSON.parse(body);
// Test that the API endpoint receives the Vercel proxy request headers
expect(headers['x-forwarded-host']).toBe(host);
expect(headers['x-vercel-deployment-url']).toBe(host);
expect(isIP(headers['x-real-ip'])).toBeTruthy();
expect(isIP(headers['x-forwarded-for'])).toBeTruthy();
expect(isIP(headers['x-vercel-forwarded-for'])).toBeTruthy();
// Test that the API endpoint has the Vercel platform env vars defined.
expect(env.NOW_REGION).toMatch(/^[a-z]{3}\d$/);
if (isDev) {
// Only dev is tested because in production these are opt-in.
expect(env.VERCEL_URL).toBe(host);
expect(env.VERCEL_REGION).toBe('dev1');
}
});
})
);
test(
'[vercel dev] Do not fail if `src` is missing',
testFixtureStdio('missing-src-property', async (testPath: any) => {
await testPath(200, '/', /hello:index.txt/m);
await testPath(404, '/i-do-not-exist');
})
);
test( test(
'[vercel dev] Middleware that returns a 200 response', '[vercel dev] Middleware that returns a 200 response',
testFixtureStdio('middleware-response', async (testPath: any) => { testFixtureStdio('middleware-response', async (testPath: any) => {


@@ -0,0 +1,449 @@
import { join } from 'path';
import ms from 'ms';
import fs, { mkdirp } from 'fs-extra';
const {
fetch,
fixture,
sleep,
testFixture,
testFixtureStdio,
validateResponseHeaders,
} = require('./utils.js');
test(
'[vercel dev] temporary directory listing',
testFixtureStdio(
'temporary-directory-listing',
async (_testPath: any, port: any) => {
const directory = fixture('temporary-directory-listing');
await fs.unlink(join(directory, 'index.txt')).catch(() => null);
await sleep(ms('20s'));
const firstResponse = await fetch(`http://localhost:${port}`);
validateResponseHeaders(firstResponse);
const body = await firstResponse.text();
console.log(body);
expect(firstResponse.status).toBe(404);
await fs.writeFile(join(directory, 'index.txt'), 'hello');
for (let i = 0; i < 20; i++) {
const response = await fetch(`http://localhost:${port}`);
validateResponseHeaders(response);
if (response.status === 200) {
const body = await response.text();
expect(body).toBe('hello');
}
await sleep(ms('1s'));
}
},
{ skipDeploy: true }
)
);
test('[vercel dev] add a `package.json` to trigger `@vercel/static-build`', async () => {
const directory = fixture('trigger-static-build');
await fs.unlink(join(directory, 'package.json')).catch(() => null);
await fs.unlink(join(directory, 'public', 'index.txt')).catch(() => null);
await fs.rmdir(join(directory, 'public')).catch(() => null);
const tester = testFixtureStdio(
'trigger-static-build',
async (_testPath: any, port: any) => {
{
const response = await fetch(`http://localhost:${port}`);
validateResponseHeaders(response);
const body = await response.text();
expect(body.trim()).toBe('hello:index.txt');
}
const rnd = Math.random().toString();
const pkg = {
private: true,
scripts: { build: `mkdir -p public && echo ${rnd} > public/index.txt` },
};
await fs.writeFile(join(directory, 'package.json'), JSON.stringify(pkg));
// Wait until file events have been processed
await sleep(ms('2s'));
{
const response = await fetch(`http://localhost:${port}`);
validateResponseHeaders(response);
const body = await response.text();
expect(body.trim()).toBe(rnd);
}
},
{ skipDeploy: true }
);
await tester();
});
test('[vercel dev] no build matches warning', async () => {
const directory = fixture('no-build-matches');
const { dev } = await testFixture(directory, {
stdio: ['ignore', 'pipe', 'pipe'],
});
try {
// start `vercel dev` detached in child_process
dev.unref();
dev.stderr.setEncoding('utf8');
await new Promise<void>(resolve => {
dev.stderr.on('data', (str: string) => {
if (str.includes('did not match any source files')) {
resolve();
}
});
});
} finally {
await dev.kill();
}
});
test(
'[vercel dev] do not recursivly check the path',
testFixtureStdio('handle-filesystem-missing', async (testPath: any) => {
await testPath(200, '/', /hello/m);
await testPath(404, '/favicon.txt');
})
);
test('[vercel dev] render warning for empty cwd dir', async () => {
const directory = fixture('empty');
const { dev, port } = await testFixture(directory, {
stdio: ['ignore', 'pipe', 'pipe'],
});
try {
dev.unref();
// Monitor `stderr` for the warning
dev.stderr.setEncoding('utf8');
const msg = 'There are no files inside your deployment.';
await new Promise<void>(resolve => {
dev.stderr.on('data', (str: string) => {
if (str.includes(msg)) {
resolve();
}
});
});
// Issue a request to ensure a 404 response
await sleep(ms('3s'));
const response = await fetch(`http://localhost:${port}`);
validateResponseHeaders(response);
expect(response.status).toBe(404);
} finally {
await dev.kill();
}
});
test('[vercel dev] do not rebuild for changes in the output directory', async () => {
const directory = fixture('output-is-source');
const { dev, port } = await testFixture(directory, {
stdio: ['ignore', 'pipe', 'pipe'],
});
try {
dev.unref();
let stderr: any = [];
const start = Date.now();
dev.stderr.on('data', (str: any) => stderr.push(str));
while (stderr.join('').includes('Ready') === false) {
await sleep(ms('3s'));
if (Date.now() - start > ms('30s')) {
console.log('stderr:', stderr.join(''));
break;
}
}
const resp1 = await fetch(`http://localhost:${port}`);
const text1 = await resp1.text();
expect(text1.trim()).toBe('hello first');
await fs.writeFile(join(directory, 'public', 'index.html'), 'hello second');
await sleep(ms('3s'));
const resp2 = await fetch(`http://localhost:${port}`);
const text2 = await resp2.text();
expect(text2.trim()).toBe('hello second');
} finally {
await dev.kill();
}
});
test(
'[vercel dev] test cleanUrls serve correct content',
testFixtureStdio('test-clean-urls', async (testPath: any) => {
await testPath(200, '/', 'Index Page');
await testPath(200, '/about', 'About Page');
await testPath(200, '/sub', 'Sub Index Page');
await testPath(200, '/sub/another', 'Sub Another Page');
await testPath(200, '/style.css', 'body { color: green }');
await testPath(308, '/index.html', 'Redirecting to / (308)', {
Location: '/',
});
await testPath(308, '/about.html', 'Redirecting to /about (308)', {
Location: '/about',
});
await testPath(308, '/sub/index.html', 'Redirecting to /sub (308)', {
Location: '/sub',
});
await testPath(
308,
'/sub/another.html',
'Redirecting to /sub/another (308)',
{ Location: '/sub/another' }
);
})
);
test(
'[vercel dev] test cleanUrls serve correct content when using `outputDirectory`',
testFixtureStdio(
'test-clean-urls-with-output-directory',
async (testPath: any) => {
await testPath(200, '/', 'Index Page');
await testPath(200, '/about', 'About Page');
await testPath(200, '/sub', 'Sub Index Page');
await testPath(200, '/sub/another', 'Sub Another Page');
await testPath(200, '/style.css', 'body { color: green }');
await testPath(308, '/index.html', 'Redirecting to / (308)', {
Location: '/',
});
await testPath(308, '/about.html', 'Redirecting to /about (308)', {
Location: '/about',
});
await testPath(308, '/sub/index.html', 'Redirecting to /sub (308)', {
Location: '/sub',
});
await testPath(
308,
'/sub/another.html',
'Redirecting to /sub/another (308)',
{ Location: '/sub/another' }
);
}
)
);
test(
'[vercel dev] should serve custom 404 when `cleanUrls: true`',
testFixtureStdio('test-clean-urls-custom-404', async (testPath: any) => {
await testPath(200, '/', 'This is the home page');
await testPath(200, '/about', 'The about page');
await testPath(200, '/contact/me', 'Contact Me Subdirectory');
await testPath(404, '/nothing', 'Custom 404 Page');
await testPath(404, '/nothing/', 'Custom 404 Page');
})
);
test(
'[vercel dev] test cleanUrls and trailingSlash serve correct content',
testFixtureStdio('test-clean-urls-trailing-slash', async (testPath: any) => {
await testPath(200, '/', 'Index Page');
await testPath(200, '/about/', 'About Page');
await testPath(200, '/sub/', 'Sub Index Page');
await testPath(200, '/sub/another/', 'Sub Another Page');
await testPath(200, '/style.css', 'body { color: green }');
//TODO: fix this test so that location is `/` instead of `//`
//await testPath(308, '/index.html', 'Redirecting to / (308)', { Location: '/' });
await testPath(308, '/about.html', 'Redirecting to /about/ (308)', {
Location: '/about/',
});
await testPath(308, '/sub/index.html', 'Redirecting to /sub/ (308)', {
Location: '/sub/',
});
await testPath(
308,
'/sub/another.html',
'Redirecting to /sub/another/ (308)',
{
Location: '/sub/another/',
}
);
})
);
test(
'[vercel dev] test cors headers work with OPTIONS',
testFixtureStdio('test-cors-routes', async (testPath: any) => {
const headers = {
'Access-Control-Allow-Origin': '*',
'Access-Control-Allow-Headers':
'Content-Type, Authorization, Accept, Content-Length, Origin, User-Agent',
'Access-Control-Allow-Methods':
'GET, POST, OPTIONS, HEAD, PATCH, PUT, DELETE',
};
await testPath(200, '/', 'status api', headers, { method: 'GET' });
await testPath(200, '/', 'status api', headers, { method: 'POST' });
await testPath(200, '/api/status.js', 'status api', headers, {
method: 'GET',
});
await testPath(200, '/api/status.js', 'status api', headers, {
method: 'POST',
});
await testPath(204, '/', '', headers, { method: 'OPTIONS' });
await testPath(204, '/api/status.js', '', headers, { method: 'OPTIONS' });
})
);
test(
'[vercel dev] test trailingSlash true serve correct content',
testFixtureStdio('test-trailing-slash', async (testPath: any) => {
await testPath(200, '/', 'Index Page');
await testPath(200, '/index.html', 'Index Page');
await testPath(200, '/about.html', 'About Page');
await testPath(200, '/sub/', 'Sub Index Page');
await testPath(200, '/sub/index.html', 'Sub Index Page');
await testPath(200, '/sub/another.html', 'Sub Another Page');
await testPath(200, '/style.css', 'body { color: green }');
await testPath(308, '/about.html/', 'Redirecting to /about.html (308)', {
Location: '/about.html',
});
await testPath(308, '/style.css/', 'Redirecting to /style.css (308)', {
Location: '/style.css',
});
await testPath(308, '/sub', 'Redirecting to /sub/ (308)', {
Location: '/sub/',
});
})
);
test(
'[vercel dev] should serve custom 404 when `trailingSlash: true`',
testFixtureStdio('test-trailing-slash-custom-404', async (testPath: any) => {
await testPath(200, '/', 'This is the home page');
await testPath(200, '/about.html', 'The about page');
await testPath(200, '/contact/', 'Contact Subdirectory');
await testPath(404, '/nothing/', 'Custom 404 Page');
})
);
test(
'[vercel dev] test trailingSlash false serve correct content',
testFixtureStdio('test-trailing-slash-false', async (testPath: any) => {
await testPath(200, '/', 'Index Page');
await testPath(200, '/index.html', 'Index Page');
await testPath(200, '/about.html', 'About Page');
await testPath(200, '/sub', 'Sub Index Page');
await testPath(200, '/sub/index.html', 'Sub Index Page');
await testPath(200, '/sub/another.html', 'Sub Another Page');
await testPath(200, '/style.css', 'body { color: green }');
await testPath(308, '/about.html/', 'Redirecting to /about.html (308)', {
Location: '/about.html',
});
await testPath(308, '/sub/', 'Redirecting to /sub (308)', {
Location: '/sub',
});
await testPath(
308,
'/sub/another.html/',
'Redirecting to /sub/another.html (308)',
{
Location: '/sub/another.html',
}
);
})
);
test(
'[vercel dev] throw when invalid builder routes detected',
testFixtureStdio(
'invalid-builder-routes',
async (testPath: any) => {
await testPath(
500,
'/',
/Route at index 0 has invalid `src` regular expression/m
);
},
{ skipDeploy: true }
)
);
test(
'[vercel dev] support legacy `@now` scope runtimes',
testFixtureStdio('legacy-now-runtime', async (testPath: any) => {
await testPath(200, '/', /A simple deployment with the Vercel API!/m);
})
);
test(
'[vercel dev] 00-list-directory',
testFixtureStdio(
'00-list-directory',
async (testPath: any) => {
await testPath(200, '/', /Files within/m);
await testPath(200, '/', /test[0-3]\.txt/m);
await testPath(200, '/', /\.well-known/m);
await testPath(200, '/.well-known/keybase.txt', 'proof goes here');
},
{ projectSettings: { directoryListing: true } }
)
);
test(
'[vercel dev] 01-node',
testFixtureStdio('01-node', async (testPath: any) => {
await testPath(200, '/', /A simple deployment with the Vercel API!/m);
})
);
test(
'[vercel dev] add a `api/fn.ts` when `api` does not exist at startup`',
testFixtureStdio('no-api', async (_testPath: any, port: any) => {
const directory = fixture('no-api');
const apiDir = join(directory, 'api');
try {
{
const response = await fetch(`http://localhost:${port}/api/new-file`);
validateResponseHeaders(response);
expect(response.status).toBe(404);
}
const fileContents = `
export const config = {
runtime: 'edge'
}
export default async function edge(request, event) {
return new Response('from new file');
}
`;
await mkdirp(apiDir);
await fs.writeFile(join(apiDir, 'new-file.js'), fileContents);
// Wait until file events have been processed
await sleep(ms('1s'));
{
const response = await fetch(`http://localhost:${port}/api/new-file`);
validateResponseHeaders(response);
const body = await response.text();
expect(body.trim()).toBe('from new file');
}
} finally {
await fs.remove(apiDir);
}
})
);


@@ -0,0 +1,5 @@
{
"private": true,
"name": "cli-test-fixtures",
"description": "We created package.json here to avoid reading the monorepo package.json during testing"
}


@@ -0,0 +1,4 @@
{
"name": "next",
"version": "13.0.4"
}


@@ -0,0 +1,5 @@
{
"dependencies": {
"next": "13.0.4"
}
}

Some files were not shown because too many files have changed in this diff.