Compare commits

...

56 Commits

Author SHA1 Message Date
Leo Lamprecht
d3ef240f6e Publish Canary
- @vercel/build-utils@2.12.3-canary.42
 - vercel@23.1.3-canary.67
 - @vercel/client@10.2.3-canary.45
 - vercel-plugin-middleware@0.0.0-canary.19
 - vercel-plugin-go@1.0.0-canary.30
 - vercel-plugin-node@1.12.2-canary.34
 - vercel-plugin-python@1.0.0-canary.31
 - vercel-plugin-ruby@1.0.0-canary.30
 - @vercel/python@2.1.2-canary.2
2021-12-08 15:53:14 +01:00
Leo Lamprecht
5b26ebc7b8 Make Python CLI Plugin work (#7155) 2021-12-08 15:52:43 +01:00
Leo Lamprecht
3427ad6ce0 Publish Canary
- vercel@23.1.3-canary.66
2021-12-08 12:50:58 +01:00
Leo Lamprecht
4ab5e4326b Improved Vercel CLI link (#7151) 2021-12-08 12:50:27 +01:00
Leo Lamprecht
d24a3ce3ab Publish Canary
- @vercel/client@10.2.3-canary.44
2021-12-08 12:10:44 +01:00
Steven
29a44db8d9 [client] Fix duplicate files when analyzing nft.json (#7150)
This PR fixes a regression from #7144 where a duplicate file was added if `nft.json` referenced an existing file.

I also refactored the prepared files logic to avoid cloning the array in every loop iteration.
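
For illustration, a minimal sketch of the pattern change (hypothetical types and names, not the actual implementation): building the result with `reduce` and a spread clones the accumulator on every iteration, while pushing into a single array is linear.

```ts
interface PreparedFile {
  sha: string;
  size: number;
}

// Quadratic: `[...acc, next]` copies the accumulator on every iteration.
function prepareSlow(shas: string[]): PreparedFile[] {
  return shas.reduce<PreparedFile[]>(
    (acc, sha) => [...acc, { sha, size: 0 }],
    []
  );
}

// Linear: push into one array instead of cloning it each time.
function prepareFast(shas: string[]): PreparedFile[] {
  const prepared: PreparedFile[] = [];
  for (const sha of shas) {
    prepared.push({ sha, size: 0 });
  }
  return prepared;
}
```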

Reviewing [without whitespace](https://github.com/vercel/vercel/pull/7150/files?diff=split&w=1) will make it easier to understand.

- Related to https://github.com/vercel/runtimes/issues/304
2021-12-08 01:40:49 +00:00
Steven
695f3a9212 Publish Canary
- vercel@23.1.3-canary.65
 - @vercel/client@10.2.3-canary.43
 - vercel-plugin-middleware@0.0.0-canary.18
2021-12-07 18:25:02 -05:00
Steven
3ff777b8ed [client] Resolve .nft.json files when vc deploy --prebuilt (#7144)
This ensures that `vc deploy --prebuilt` also uploads any files that `.output/**/.nft.json` points to, and that the Root Directory is handled correctly, since `vc build` emits `rootdir/.output`.
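
For illustration, a hypothetical `.output/server/pages/api/hello.js.nft.json` (paths are made up): with `--prebuilt`, the files it references, resolved relative to the `.nft.json` file itself, are uploaded in addition to the contents of `.output`.

```json
{
  "version": 2,
  "files": [
    "../../../../node_modules/some-dep/index.js",
    "../../../../lib/util.js"
  ]
}
```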

- Related to https://github.com/vercel/runtimes/issues/304.
2021-12-07 18:17:58 -05:00
Tommaso De Rossi
d94b9806ab [middleware] Define env vars when building _middleware.js with esbuild (#7087)
### Related Issues

> Fixes #7086
> Related to #7086
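
A minimal sketch of defining environment variables through esbuild's `define` option (entry and output paths are hypothetical, not the plugin's actual configuration):

```ts
import { build } from 'esbuild';

build({
  entryPoints: ['pages/_middleware.js'],
  bundle: true,
  outfile: '.output/server/pages/_middleware.js',
  // Inline env vars as compile-time constants; values must be valid JS expressions.
  define: {
    'process.env.NODE_ENV': JSON.stringify('production'),
    'process.env.MY_FLAG': JSON.stringify('enabled'),
  },
}).catch(() => process.exit(1));
```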

### 📋 Checklist


#### Tests

- [x] The code changed/added as part of this PR has been covered with tests
- [x] All tests pass locally with `yarn test-unit`

#### Code Review

- [x] This PR has a concise title and thorough description useful to a reviewer
- [ ] Issue from task tracker has a link to this PR
2021-12-07 22:16:46 +00:00
Leo Lamprecht
35c8fc2729 Publish Canary
- @vercel/build-utils@2.12.3-canary.41
 - vercel@23.1.3-canary.64
 - @vercel/client@10.2.3-canary.42
 - vercel-plugin-middleware@0.0.0-canary.17
 - vercel-plugin-go@1.0.0-canary.29
 - vercel-plugin-node@1.12.2-canary.33
 - vercel-plugin-python@1.0.0-canary.30
 - vercel-plugin-ruby@1.0.0-canary.29
2021-12-07 21:13:41 +01:00
Leo Lamprecht
0a468fd6d7 Correctly clean up files for CLI Plugins (#7149)
* Correctly clean up files for CLI Plugins

* Cleaned up the code
2021-12-07 21:13:29 +01:00
Leo Lamprecht
d31ebbabe4 Publish Canary
- @vercel/build-utils@2.12.3-canary.40
 - vercel@23.1.3-canary.63
 - @vercel/client@10.2.3-canary.41
 - vercel-plugin-middleware@0.0.0-canary.16
 - vercel-plugin-go@1.0.0-canary.28
 - vercel-plugin-node@1.12.2-canary.32
 - vercel-plugin-python@1.0.0-canary.29
 - vercel-plugin-ruby@1.0.0-canary.28
2021-12-07 17:46:08 +01:00
Leo Lamprecht
09c9b71adb Adjust import statements inside Runtime launchers (#7148)
* Added basic logic

* Polished basic logic

* Made logic actually replace content

* Perfected the logic

* Added comment

* Simplified logic

* Added another comment

* Added debug log

* More detailed debug log

* Update packages/build-utils/src/convert-runtime-to-plugin.ts

Co-authored-by: Andy <AndyBitz@users.noreply.github.com>

* Update packages/build-utils/src/convert-runtime-to-plugin.ts

Co-authored-by: Steven <steven@ceriously.com>

* Simpler logic

Co-authored-by: Andy <AndyBitz@users.noreply.github.com>
Co-authored-by: Steven <steven@ceriously.com>
2021-12-07 17:44:59 +01:00
Leo Lamprecht
5975db4d66 Fixed middleware tests (#7146) 2021-12-07 01:10:58 +01:00
Leo Lamprecht
2c86ac654c Publish Canary
- @vercel/build-utils@2.12.3-canary.39
 - vercel@23.1.3-canary.62
 - @vercel/client@10.2.3-canary.40
 - vercel-plugin-middleware@0.0.0-canary.15
 - vercel-plugin-go@1.0.0-canary.27
 - vercel-plugin-node@1.12.2-canary.31
 - vercel-plugin-python@1.0.0-canary.28
 - vercel-plugin-ruby@1.0.0-canary.27
2021-12-06 23:37:05 +01:00
Leo Lamprecht
ca5f066eb9 Simplify NFT output logic for CLI and CLI Plugins (#7143)
* Simplify NFT output logic for CLI Plugins

* Made tests pass

* Remove useless logic from Vercel CLI

* Update packages/build-utils/src/convert-runtime-to-plugin.ts

Co-authored-by: Steven <steven@ceriously.com>

* Simplified CLI code

* Removed useless file

Co-authored-by: Steven <steven@ceriously.com>
2021-12-06 23:35:07 +01:00
Leo Lamprecht
410ef86102 Support nested API Routes and fix handler for CLI Plugins (#7141)
We have identified that the `handler` for Lambdas does not support a dot-preceded path for Ruby, Python, and probably other languages, so we're adjusting the File System API to change `.output` inside the Lambda to something else, which requires version `2` of `functions-manifest.json`.

Furthermore, we're also bumping the `.nft.json` files to version `2`, which allows `output` to be relative to the NFT file itself, so that, inside the Lambda, the behavior mentioned at the top can be applied by the File System API.

As a nice side effect, this will also support nested API Routes, because it'll place all the dependencies next to every API Route, meaning that the launcher will have access to all of them (bundling multiple API Routes or Pages into the same Lambda currently doesn't work for non-Next.js anyway, because of https://github.com/vercel/runtimes/issues/305).
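
For reference, a version-2 `functions-manifest.json` entry for a Python API Route might look like the following (values mirror the test expectations further down in this diff; treat it as illustrative, not a spec):

```json
{
  "version": 2,
  "pages": {
    "api/index.py": {
      "handler": "index.vc_handler",
      "runtime": "python3.9",
      "memory": 512,
      "maxDuration": 5
    }
  }
}
```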

#### Tests

- [x] The code changed/added as part of this PR has been covered with tests
- [x] All tests pass locally with `yarn test-unit`

#### Code Review

- [x] This PR has a concise title and thorough description useful to a reviewer
- [x] Issue from task tracker has a link to this PR
2021-12-06 20:53:07 +00:00
Steven
6792edf32a Publish Canary
- @vercel/build-utils@2.12.3-canary.38
 - vercel@23.1.3-canary.61
 - @vercel/client@10.2.3-canary.39
 - @vercel/frameworks@0.5.1-canary.17
 - vercel-plugin-middleware@0.0.0-canary.14
 - vercel-plugin-go@1.0.0-canary.26
 - vercel-plugin-node@1.12.2-canary.30
 - vercel-plugin-python@1.0.0-canary.27
 - vercel-plugin-ruby@1.0.0-canary.26
2021-12-06 14:36:06 -05:00
Steven
67de167a7e [frameworks][cli] Remove duplicate getFsOutputDir() definitions (#7124) 2021-12-06 14:34:30 -05:00
Leo Lamprecht
0c5c05d90b Prevent CLI Plugins from overwriting files (#7140)
* Prevent CLI Plugins from overwriting files

* Revert "Temporarily remove CLI Plugin linking (#7138)"

This reverts commit d6a5aa4f6d.
2021-12-06 17:22:55 +01:00
Leo Lamprecht
fe43c9c4b2 Publish Canary
- @vercel/build-utils@2.12.3-canary.37
 - vercel@23.1.3-canary.60
 - @vercel/client@10.2.3-canary.38
 - vercel-plugin-middleware@0.0.0-canary.13
 - vercel-plugin-go@1.0.0-canary.25
 - vercel-plugin-node@1.12.2-canary.29
 - vercel-plugin-python@1.0.0-canary.26
 - vercel-plugin-ruby@1.0.0-canary.25
2021-12-06 14:28:12 +01:00
Leo Lamprecht
d6a5aa4f6d Temporarily remove CLI Plugin linking (#7138) 2021-12-06 14:25:22 +01:00
Leo Lamprecht
1c3701628d Publish Canary
- @vercel/build-utils@2.12.3-canary.36
 - vercel@23.1.3-canary.59
 - @vercel/client@10.2.3-canary.37
 - vercel-plugin-middleware@0.0.0-canary.12
 - vercel-plugin-go@1.0.0-canary.24
 - vercel-plugin-node@1.12.2-canary.28
 - vercel-plugin-python@1.0.0-canary.25
 - vercel-plugin-ruby@1.0.0-canary.24
2021-12-04 17:01:20 +01:00
Leo Lamprecht
45689f22ab Correctly position dependencies for CLI Plugins (#7133)
Previously, CLI Plugins would try to mount the user-provided request handler (e.g. `api/test.rb`) at the same position inside the Lambda at which the launcher was located (e.g. `api/test.rb`), which would cause the launcher to be overwritten.

With this change, all the destination mounting points for NFT input files are becoming relative to the `.output/server/pages/api` directory instead of `.output/server/pages`, so they should no longer overwrite the launcher, and instead be loaded from the launcher, like normal.
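
A rough sketch of the path change described above (file names are hypothetical): traced dependencies are mounted relative to `.output/server/pages/api` rather than `.output/server/pages`, so they land next to the launcher instead of on top of it.

```ts
import { join } from 'path';

// Hypothetical entrypoint and one of its traced dependencies.
const entrypoint = 'api/test.rb';
const dependency = 'lib/helper.rb';

// Before: mounting relative to `.output/server/pages` meant the traced copy of
// `api/test.rb` resolved to the exact path of the launcher and overwrote it.
const launcher = join('.output/server/pages', entrypoint);

// After: mounting relative to `.output/server/pages/api` keeps dependencies
// beside the launcher so it can load them like normal.
const mounted = join('.output/server/pages/api', dependency);

console.log(launcher); // .output/server/pages/api/test.rb
console.log(mounted); // .output/server/pages/api/lib/helper.rb
```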

This PR might have two problems:

- If imports in Ruby/Python/etc. are relative to `cwd` rather than to the file from which the import is executed (which I doubt), this would fail.
- We might have to replace `api` with the exact subfolder of the API Route within `.output/server/pages/api` if the current change doesn't yet work for nested paths. That would also mean repeating all of the other dependencies (not just the user-provided request handler) in a different location for every single API Route.

The two above will be tested after this PR is merged, as there currently isn't a way to test `vercel-plugin-go`, `vercel-plugin-python`, and `vercel-plugin-ruby` without publishing a canary, because they don't bundle `@vercel/build-utils`, which is the package that was just updated.

#### Tests

- [x] The code changed/added as part of this PR has been covered with tests
- [x] All tests pass locally with `yarn test-unit`

#### Code Review

- [x] This PR has a concise title and thorough description useful to a reviewer
- [x] Issue from task tracker has a link to this PR
2021-12-04 16:00:15 +00:00
Leo Lamprecht
2c3ddffaac Publish Canary
- @vercel/build-utils@2.12.3-canary.35
 - vercel@23.1.3-canary.58
 - @vercel/client@10.2.3-canary.36
 - vercel-plugin-middleware@0.0.0-canary.11
 - vercel-plugin-go@1.0.0-canary.23
 - vercel-plugin-node@1.12.2-canary.27
 - vercel-plugin-python@1.0.0-canary.24
 - vercel-plugin-ruby@1.0.0-canary.23
 - @vercel/ruby@1.2.10-canary.0
2021-12-04 01:30:37 +01:00
Leo Lamprecht
c3ea0195c2 Fixed Lambda handler for compiled languages (#7129)
* Use correct Lambda handler for compiled languages

* Use the correct entrypoint

* Fixed the logic

* Perfected it

* Simpler code

* Update packages/build-utils/src/convert-runtime-to-plugin.ts

Co-authored-by: Andy <AndyBitz@users.noreply.github.com>

* Handle edge cases and simplify logic

* Fixed the logic yet again to work perfectly

* Normalize Page path

* Simplified everything

* Fixed the tests

Co-authored-by: Andy <AndyBitz@users.noreply.github.com>
2021-12-04 01:29:46 +01:00
Gary Borton
5f5e50cff0 [middleware] Make edge functions be in "strict mode" (#7118)
Co-authored-by: Leo Lamprecht <leo@vercel.com>
Co-authored-by: Steven <steven@ceriously.com>
Co-authored-by: Nathan Rajlich <n@n8.io>
2021-12-03 15:22:21 -08:00
Steven
160f4d46d9 Publish Stable
- @vercel/ruby@1.2.9
2021-12-03 16:39:00 -05:00
Steven
8d619bd7cc Publish Canary
- vercel@23.1.3-canary.57
 - vercel-plugin-ruby@1.0.0-canary.22
 - @vercel/ruby@1.2.8-canary.7
2021-12-03 15:43:53 -05:00
Steven
b94337d842 [ruby] Show error when Ruby 2.5.x detected (#7126)
* [ruby] Show error when Ruby 2.5.x detected

* Add test with ruby 2.5.x
2021-12-03 14:34:01 -05:00
Leo Lamprecht
34f4222ca2 Publish Canary
- @vercel/build-utils@2.12.3-canary.34
 - vercel@23.1.3-canary.56
 - @vercel/client@10.2.3-canary.35
 - vercel-plugin-middleware@0.0.0-canary.10
 - vercel-plugin-go@1.0.0-canary.22
 - vercel-plugin-node@1.12.2-canary.26
 - vercel-plugin-python@1.0.0-canary.23
 - vercel-plugin-ruby@1.0.0-canary.21
2021-12-03 20:15:25 +01:00
Leo Lamprecht
5de045edd7 Use correct Lambda handler for CLI Plugins (#7128)
* Use correct Lambda handler for CLI Plugins

* Tweaked comment

* Fixed tests
2021-12-03 20:15:05 +01:00
Leo Lamprecht
5efd3b98de Publish Canary
- @vercel/build-utils@2.12.3-canary.33
 - vercel@23.1.3-canary.55
 - @vercel/client@10.2.3-canary.34
 - vercel-plugin-middleware@0.0.0-canary.9
 - vercel-plugin-go@1.0.0-canary.21
 - vercel-plugin-node@1.12.2-canary.25
 - vercel-plugin-python@1.0.0-canary.22
 - vercel-plugin-ruby@1.0.0-canary.20
2021-12-03 19:10:29 +01:00
Leo Lamprecht
82c83312c7 Fixed Legacy Runtime output generation (#7127)
* Corrected CLI Plugin output creation

* Wait for the linking to be done before removing anything

* Make it work

* Renamed variables

* Made everything work

* Removed debugging

* Fixed comment typos
2021-12-03 19:09:59 +01:00
Andy Bitz
5ccb983007 Publish Canary
- vercel@23.1.3-canary.54
 - vercel-plugin-middleware@0.0.0-canary.8
2021-12-03 12:12:03 +01:00
Steven
7a921399be [middleware] Fix dependencies (#7125) 2021-12-02 20:38:07 -05:00
Andy
3900f2f982 [cli] Support for next export with vercel build (#7122)
* [cli] Support `next export` in `vercel build`

* Add debug line

* Remove unused import

* Ensure file is copied

* Return in `getNextExportStatus`

* Include dotNextDir

Co-authored-by: Steven <steven@ceriously.com>
2021-12-02 20:31:17 -05:00
Steven
09939f1e07 Update github codeowners (#7123)
* Add gary and javi

* Remove timothy and coetry

* Remove rdev

* Add jaredpalmer

Co-authored-by: Nathan Rajlich <n@n8.io>
2021-12-02 19:20:18 -05:00
Leo Lamprecht
fc3a3ca81f Use Functions Manifest for Middleware (#7119)
### Related Issues

This fixes https://github.com/vercel/runtimes/issues/299.

#### Tests

- [x] The code changed/added as part of this PR has been covered with tests
- [x] All tests pass locally with `yarn test-unit`

#### Code Review

- [x] This PR has a concise title and thorough description useful to a reviewer
- [x] Issue from task tracker has a link to this PR
2021-12-03 00:12:01 +00:00
Leo Lamprecht
ba7bf2e4a6 Publish Canary
- @vercel/build-utils@2.12.3-canary.32
 - vercel@23.1.3-canary.53
 - @vercel/client@10.2.3-canary.33
 - vercel-plugin-go@1.0.0-canary.20
 - vercel-plugin-node@1.12.2-canary.24
 - vercel-plugin-python@1.0.0-canary.21
 - vercel-plugin-ruby@1.0.0-canary.19
2021-12-02 23:51:51 +01:00
Leo Lamprecht
00641037fc Corrected input paths for CLI Plugins (#7121)
* Use correct paths for outputs

* Fixed tests

This reverts commit 7c4baeaafaf41609f47c97a09f5e9647fd8b89ee.

* Revert "Fixed tests"

This reverts commit 59c10d18c63f8404c3b0c361c3769b62524316f1.

* Revert "Use correct paths for outputs"

This reverts commit 23a0b34fad1e4932755a39975ae1dfa07acb2dd9.

* Corrected input paths for CLI Plugins

* Fixed tests

This reverts commit 7c4baeaafaf41609f47c97a09f5e9647fd8b89ee.

* Revert "Fixed tests"

This reverts commit 9612d2a9eb19240a5a4489406ada17a6a5bb3806.

* Fixed tests

* Delete vc__handler__python.py
2021-12-02 23:51:39 +01:00
Leo Lamprecht
6f4a1b527b Publish Canary
- @vercel/build-utils@2.12.3-canary.31
 - vercel@23.1.3-canary.52
 - @vercel/client@10.2.3-canary.32
 - vercel-plugin-go@1.0.0-canary.19
 - vercel-plugin-node@1.12.2-canary.23
 - vercel-plugin-python@1.0.0-canary.20
 - vercel-plugin-ruby@1.0.0-canary.18
2021-12-02 22:41:17 +01:00
Leo Lamprecht
1b95576dd2 Fixed Go, Ruby, and Python CLI Plugin output generation (#7117)
* Fixed error with API directory

* Made output work

* Use handler as API Route

* Correctly find the handler

* Fixed a missing instance

* Made handler logic work

* Made it work as expected

* Exclude unnecessary files

* Use a method that always works

* Additional comment

* Made everything work

* Cleaner tests

* Clean up all the useless files

* Fixed missing instance

* Speed up the code

* Removed useless lines

* Update packages/build-utils/src/convert-runtime-to-plugin.ts

Co-authored-by: Andy <AndyBitz@users.noreply.github.com>

* Clarified comment

* Use relative logic again

* Fixed tests

* Deleted useless file

Co-authored-by: Andy <AndyBitz@users.noreply.github.com>
2021-12-02 22:18:43 +01:00
Leo Lamprecht
9227471aca Publish Canary
- @vercel/build-utils@2.12.3-canary.30
 - vercel@23.1.3-canary.51
 - @vercel/client@10.2.3-canary.31
 - vercel-plugin-go@1.0.0-canary.18
 - vercel-plugin-node@1.12.2-canary.22
 - vercel-plugin-python@1.0.0-canary.19
 - vercel-plugin-ruby@1.0.0-canary.17
2021-12-02 15:24:34 +01:00
Leo Lamprecht
bf060296eb Fixed typo (#7115) 2021-12-02 15:19:32 +01:00
Leo Lamprecht
9b3aa41f2e Publish Canary
- @vercel/build-utils@2.12.3-canary.29
 - vercel@23.1.3-canary.50
 - @vercel/client@10.2.3-canary.30
 - vercel-plugin-go@1.0.0-canary.17
 - vercel-plugin-node@1.12.2-canary.21
 - vercel-plugin-python@1.0.0-canary.18
 - vercel-plugin-ruby@1.0.0-canary.16
2021-12-02 15:02:00 +01:00
Leo Lamprecht
ae36585cdb Filter CLI Plugin output (#7114)
* Filter CLI Plugin output

* Only apply ignorefile check in the beginning
2021-12-02 15:01:17 +01:00
Leo Lamprecht
e4c636ddd2 Publish Canary
- vercel-plugin-go@1.0.0-canary.16
 - vercel-plugin-python@1.0.0-canary.17
 - vercel-plugin-ruby@1.0.0-canary.15
2021-12-02 14:14:14 +01:00
Leo Lamprecht
ae3b25be4b Made CLI Plugin publishing work (#7113) 2021-12-02 14:10:23 +01:00
Andy Bitz
a64ed13a40 Publish Canary
- vercel@23.1.3-canary.49
2021-12-02 12:55:04 +01:00
Andy
6c1c0e6676 [cli] Ignore required-server-files.json if it does not exist (#7111) 2021-12-02 12:54:30 +01:00
Leo Lamprecht
82fdd5d121 Publish Canary
- vercel@23.1.3-canary.48
 - vercel-plugin-go@1.0.0-canary.15
 - vercel-plugin-node@1.12.2-canary.20
 - vercel-plugin-python@1.0.0-canary.16
 - vercel-plugin-ruby@1.0.0-canary.14
 - @vercel/static-config@0.0.1-canary.1
2021-12-02 11:46:28 +01:00
Nathan Rajlich
8b40f4435e [api] Use the new File System API (#7108)
Co-authored-by: kodiakhq[bot] <49736102+kodiakhq[bot]@users.noreply.github.com>
2021-12-02 11:45:57 +01:00
Leo Lamprecht
38c87602bb Renamed runtime to use in static JS config (#7106)
### Related Issues

This applies what was mentioned in https://github.com/vercel/runtimes/issues/288#issuecomment-984101750.

#### Tests

- [x] The code changed/added as part of this PR has been covered with tests
- [x] All tests pass locally with `yarn test-unit`

#### Code Review

- [x] This PR has a concise title and thorough description useful to a reviewer
- [x] Issue from task tracker has a link to this PR
2021-12-02 00:37:09 +00:00
Nathan Rajlich
7aef3013e7 [cli] Use "127.0.0.1" instead of "localhost" in vc dev (#7094)
Node.js doesn't handle it well when a hostname resolves to an IPv6 address (https://stackoverflow.com/a/15244890/376773), so use the IPv4 loopback address instead. Specifically, this fixes `vc dev` on Node.js 17, which now prefers IPv6 by default.

Slack thread: https://vercel.slack.com/archives/C01A2M9R8RZ/p1638330248263400
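
A small illustration of the underlying behavior (assuming `localhost` resolves to both `::1` and `127.0.0.1` on the machine): Node.js 17 stopped reordering lookup results to put IPv4 first, so connecting to `localhost` may hit `::1` while the dev process only listens on IPv4.

```ts
import { promises as dns } from 'dns';

async function main() {
  // On Node.js 17+, results are returned in resolver order (IPv6 often first);
  // earlier versions sorted IPv4 addresses to the front by default.
  const addresses = await dns.lookup('localhost', { all: true });
  console.log(addresses);
  // e.g. [ { address: '::1', family: 6 }, { address: '127.0.0.1', family: 4 } ]
}

main();
```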
2021-12-01 23:29:26 +00:00
Nathan Rajlich
c18676ab4d Publish Canary
- vercel-plugin-go@1.0.0-canary.14
 - vercel-plugin-python@1.0.0-canary.15
 - vercel-plugin-ruby@1.0.0-canary.13
2021-12-01 14:28:40 -08:00
74 changed files with 1021 additions and 642 deletions

.github/CODEOWNERS (vendored): 14 changed lines
View File

@@ -4,24 +4,26 @@
* @TooTallNate
/.github/workflows @AndyBitz @styfle
/packages/frameworks @AndyBitz
/packages/cli/src/commands/build @TooTallNate @styfle @AndyBitz @gdborton @jaredpalmer
/packages/cli/src/commands/dev @TooTallNate @styfle @AndyBitz
/packages/cli/src/util/dev @TooTallNate @styfle @AndyBitz
/packages/cli/src/commands/domains @javivelasco @mglagola @anatrajkovska
/packages/cli/src/commands/certs @javivelasco @mglagola @anatrajkovska
/packages/cli/src/commands/env @styfle @lucleray
/packages/client @rdev @styfle @TooTallNate
/packages/client @styfle @TooTallNate
/packages/build-utils @styfle @AndyBitz @TooTallNate
/packages/middleware @gdborton @javivelasco
/packages/node @styfle @TooTallNate @lucleray
/packages/node-bridge @styfle @TooTallNate @lucleray
/packages/next @Timer @ijjk
/packages/go @styfle @TooTallNate
/packages/python @styfle @TooTallNate
/packages/ruby @styfle @coetry @TooTallNate
/packages/ruby @styfle @TooTallNate
/packages/static-build @styfle @AndyBitz
/packages/routing-utils @styfle @dav-is @ijjk
/examples @mcsdevv @timothyis
/examples @mcsdevv
/examples/create-react-app @Timer
/examples/nextjs @timneutkens @Timer
/examples/hugo @mcsdevv @timothyis @styfle
/examples/jekyll @mcsdevv @timothyis @styfle
/examples/zola @mcsdevv @timothyis @styfle
/examples/hugo @mcsdevv @styfle
/examples/jekyll @mcsdevv @styfle
/examples/zola @mcsdevv @styfle

View File

@@ -5,7 +5,7 @@
"description": "API for the vercel/vercel repo",
"main": "index.js",
"scripts": {
"vercel-build": "yarn --cwd .. && node ../utils/run.js build all"
"vercel-build": "node ../utils/run.js build all"
},
"dependencies": {
"@sentry/node": "5.11.1",

View File

@@ -1,6 +1,6 @@
{
"name": "@vercel/build-utils",
"version": "2.12.3-canary.28",
"version": "2.12.3-canary.42",
"license": "MIT",
"main": "./dist/index.js",
"types": "./dist/index.d.js",
@@ -30,7 +30,7 @@
"@types/node-fetch": "^2.1.6",
"@types/semver": "6.0.0",
"@types/yazl": "^2.4.1",
"@vercel/frameworks": "0.5.1-canary.16",
"@vercel/frameworks": "0.5.1-canary.17",
"@vercel/ncc": "0.24.0",
"aggregate-error": "3.0.1",
"async-retry": "1.2.3",

View File

@@ -1,11 +1,51 @@
import fs from 'fs-extra';
import { join, dirname, relative } from 'path';
import { join, parse, relative, dirname, basename, extname } from 'path';
import glob from './fs/glob';
import { normalizePath } from './fs/normalize-path';
import { FILES_SYMBOL, Lambda } from './lambda';
import type FileBlob from './file-blob';
import type { BuildOptions, Files } from './types';
import { getIgnoreFilter } from '.';
import { debug, getIgnoreFilter } from '.';
// `.output` was already created by the Build Command, so we have
// to ensure its contents don't get bundled into the Lambda. Similarly,
// we don't want to bundle anything from `.vercel` either. Lastly,
// Builders/Runtimes didn't have `vercel.json` or `now.json`.
const ignoredPaths = ['.output', '.vercel', 'vercel.json', 'now.json'];
const shouldIgnorePath = (
file: string,
ignoreFilter: any,
ignoreFile: boolean
) => {
const isNative = ignoredPaths.some(item => {
return file.startsWith(item);
});
if (!ignoreFile) {
return isNative;
}
return isNative || ignoreFilter(file);
};
const getSourceFiles = async (workPath: string, ignoreFilter: any) => {
const list = await glob('**', {
cwd: workPath,
});
// We're not passing this as an `ignore` filter to the `glob` function above,
// so that we can re-use exactly the same `getIgnoreFilter` method that the
// Build Step uses (literally the same code). Note that this exclusion only applies
// when deploying. Locally, another exclusion is needed, which is handled
// further below in the `convertRuntimeToPlugin` function.
for (const file in list) {
if (shouldIgnorePath(file, ignoreFilter, true)) {
delete list[file];
}
}
return list;
};
/**
* Convert legacy Runtime to a Plugin.
@@ -20,40 +60,36 @@ export function convertRuntimeToPlugin(
) {
// This `build()` signature should match `plugin.build()` signature in `vercel build`.
return async function build({ workPath }: { workPath: string }) {
const opts = { cwd: workPath };
const files = await glob('**', opts);
// `.output` was already created by the Build Command, so we have
// to ensure its contents don't get bundled into the Lambda. Similarly,
// we don't want to bundle anything from `.vercel` either. Lastly,
// Builders/Runtimes didn't have `vercel.json` or `now.json`.
const ignoredPaths = ['.output', '.vercel', 'vercel.json', 'now.json'];
// We also don't want to provide any files to Runtimes that were ignored
// through `.vercelignore` or `.nowignore`, because the Build Step does the same.
const ignoreFilter = await getIgnoreFilter(workPath);
// We're not passing this as an `ignore` filter to the `glob` function above,
// so that we can re-use exactly the same `getIgnoreFilter` method that the
// Build Step uses (literally the same code).
for (const file in files) {
const isNative = ignoredPaths.some(item => {
return file.startsWith(item);
});
// Retrieve the files that are currently available on the File System,
// before the Legacy Runtime has even started to build.
const sourceFilesPreBuild = await getSourceFiles(workPath, ignoreFilter);
if (isNative || ignoreFilter(file)) {
delete files[file];
// Instead of doing another `glob` to get all the matching source files,
// we'll filter the list of existing files down to only the ones
// that are matching the entrypoint pattern, so we're first creating
// a clean new list to begin.
const entrypoints = Object.assign({}, sourceFilesPreBuild);
const entrypointMatch = new RegExp(`^api/.*${ext}$`);
// Up next, we'll strip out the files from the list of entrypoints
// that aren't actually considered entrypoints.
for (const file in entrypoints) {
if (!entrypointMatch.test(file)) {
delete entrypoints[file];
}
}
const entrypointPattern = `api/**/*${ext}`;
const entrypoints = await glob(entrypointPattern, opts);
const pages: { [key: string]: any } = {};
const pluginName = packageName.replace('vercel-plugin-', '');
const outputPath = join(workPath, '.output');
const traceDir = join(
workPath,
`.output`,
outputPath,
`inputs`,
// Legacy Runtimes can only provide API Routes, so that's
// why we can use this prefix for all of them. Here, we have to
@@ -64,9 +100,11 @@ export function convertRuntimeToPlugin(
await fs.ensureDir(traceDir);
const entryRoot = join(outputPath, 'server', 'pages');
for (const entrypoint of Object.keys(entrypoints)) {
const { output } = await buildRuntime({
files,
files: sourceFilesPreBuild,
entrypoint,
workPath,
config: {
@@ -74,76 +112,185 @@ export function convertRuntimeToPlugin(
},
meta: {
avoidTopLevelInstall: true,
skipDownload: true,
},
});
pages[entrypoint] = {
handler: output.handler,
// @ts-ignore This symbol is a private API
const lambdaFiles: Files = output[FILES_SYMBOL];
// When deploying, the `files` that are passed to the Legacy Runtimes already
// have certain files that are ignored stripped, but locally, that list of
// files isn't used by the Legacy Runtimes, so we need to apply the filters
// to the outputs that they are returning instead.
for (const file in lambdaFiles) {
if (shouldIgnorePath(file, ignoreFilter, false)) {
delete lambdaFiles[file];
}
}
let handlerFileBase = output.handler;
let handlerFile = lambdaFiles[handlerFileBase];
let handlerHasImport = false;
const { handler } = output;
const handlerMethod = handler.split('.').pop();
const handlerFileName = handler.replace(`.${handlerMethod}`, '');
// For compiled languages, the launcher file for the Lambda generated
// by the Legacy Runtime matches the `handler` defined for it, but for
// interpreted languages, the `handler` consists of the launcher file name
// without an extension, plus the name of the method inside of that file
// that should be invoked, so we have to construct the file path explicitly.
if (!handlerFile) {
handlerFileBase = handlerFileName + ext;
handlerFile = lambdaFiles[handlerFileBase];
handlerHasImport = true;
}
if (!handlerFile || !handlerFile.fsPath) {
throw new Error(
`Could not find a handler file. Please ensure that \`files\` for the returned \`Lambda\` contains an \`FileFsRef\` named "${handlerFileBase}" with a valid \`fsPath\`.`
);
}
const handlerExtName = extname(handlerFile.fsPath);
const entryBase = basename(entrypoint).replace(ext, handlerExtName);
const entryPath = join(dirname(entrypoint), entryBase);
const entry = join(entryRoot, entryPath);
// Create the parent directory of the API Route that will be created
// for the current entrypoint inside of `.output/server/pages/api`.
await fs.ensureDir(dirname(entry));
// For compiled languages, the launcher file will be binary and therefore
// won't try to import a user-provided request handler (instead, it will
// contain it). But for interpreted languages, the launcher might try to
// load a user-provided request handler from the source file instead of bundling
// it, so we have to adjust the import statement inside the launcher to point
// to the respective source file. Previously, Legacy Runtimes simply expected
// the user-provided request-handler to be copied right next to the launcher,
// but with the new File System API, files won't be moved around unnecessarily.
if (handlerHasImport) {
const { fsPath } = handlerFile;
const encoding = 'utf-8';
// This is the true directory of the user-provided request handler in the
// source files, so that's what we will use as an import path in the launcher.
const locationPrefix = relative(entry, outputPath);
let handlerContent = await fs.readFile(fsPath, encoding);
const importPaths = [
// This is the full entrypoint path, like `./api/test.py`. In our tests
// Python didn't support importing from a parent directory without using different
// code in the launcher that registers it as a location for modules and then changing
// the importing syntax, but continuing to import it like before seems to work. If
// other languages need this, we should consider excluding Python explicitly.
// `./${entrypoint}`,
// This is the entrypoint path without extension, like `api/test`
entrypoint.slice(0, -ext.length),
];
// Generate a list of regular expressions that we can use for
// finding matches, but only allow matches if the import path is
// wrapped inside single (') or double quotes (").
const patterns = importPaths.map(path => {
// eslint-disable-next-line no-useless-escape
return new RegExp(`('|")(${path.replace(/\./g, '\\.')})('|")`, 'g');
});
let replacedMatch = null;
for (const pattern of patterns) {
const newContent = handlerContent.replace(
pattern,
(_, p1, p2, p3) => {
return `${p1}${join(locationPrefix, p2)}${p3}`;
}
);
if (newContent !== handlerContent) {
debug(
`Replaced "${pattern}" inside "${entry}" to ensure correct import of user-provided request handler`
);
handlerContent = newContent;
replacedMatch = true;
}
}
if (!replacedMatch) {
throw new Error(
`No replaceable matches for "${importPaths[0]}" or "${importPaths[1]}" found in "${fsPath}"`
);
}
await fs.writeFile(entry, handlerContent, encoding);
} else {
await fs.copy(handlerFile.fsPath, entry);
}
// Legacy Runtimes based on interpreted languages will create a new launcher file
// for every entrypoint, but they will create each one inside `workPath`, which means that
// the launcher for one entrypoint will overwrite the launcher provided for the previous
// entrypoint. That's why, above, we copy the file contents into the new destination (and
// optionally transform them along the way), instead of linking. We then also want to remove
// the copy origin right here, so that the `workPath` doesn't contain a useless launcher file
// once the build has finished running.
await fs.remove(handlerFile.fsPath);
debug(`Removed temporary file "${handlerFile.fsPath}"`);
const nft = `${entry}.nft.json`;
const json = JSON.stringify({
version: 2,
files: Object.keys(lambdaFiles)
.map(file => {
const { fsPath } = lambdaFiles[file];
if (!fsPath) {
throw new Error(
`File "${file}" is missing valid \`fsPath\` property`
);
}
// The handler was already moved into position above.
if (file === handlerFileBase) {
return;
}
return normalizePath(relative(dirname(nft), fsPath));
})
.filter(Boolean),
});
await fs.writeFile(nft, json);
// Add an entry that will later on be added to the `functions-manifest.json`
// file that is placed inside of the `.output` directory.
pages[normalizePath(entryPath)] = {
// Because the underlying file used as a handler was placed
// inside `.output/server/pages/api`, it no longer has the name it originally
// had and is now named after the API Route that it's responsible for,
// so we have to adjust the name of the Lambda handler accordingly.
handler: handler.replace(handlerFileName, parse(entry).name),
runtime: output.runtime,
memory: output.memory,
maxDuration: output.maxDuration,
environment: output.environment,
allowQuery: output.allowQuery,
};
// @ts-ignore This symbol is a private API
const lambdaFiles: Files = output[FILES_SYMBOL];
const entry = join(workPath, '.output', 'server', 'pages', entrypoint);
await fs.ensureDir(dirname(entry));
await linkOrCopy(files[entrypoint].fsPath, entry);
const tracedFiles: {
absolutePath: string;
relativePath: string;
}[] = [];
Object.entries(lambdaFiles).forEach(async ([relPath, file]) => {
const newPath = join(traceDir, relPath);
tracedFiles.push({ absolutePath: newPath, relativePath: relPath });
if (file.fsPath) {
await linkOrCopy(file.fsPath, newPath);
} else if (file.type === 'FileBlob') {
const { data, mode } = file as FileBlob;
await fs.writeFile(newPath, data, { mode });
} else {
throw new Error(`Unknown file type: ${file.type}`);
}
});
const nft = join(
workPath,
'.output',
'server',
'pages',
`${entrypoint}.nft.json`
);
const json = JSON.stringify({
version: 1,
files: tracedFiles.map(f => ({
input: normalizePath(relative(nft, f.absolutePath)),
output: normalizePath(f.relativePath),
})),
});
await fs.ensureDir(dirname(nft));
await fs.writeFile(nft, json);
}
// Add any Serverless Functions that were exposed by the Legacy Runtime
// to the `functions-manifest.json` file provided in `.output`.
await updateFunctionsManifest({ workPath, pages });
};
}
async function linkOrCopy(existingPath: string, newPath: string) {
try {
await fs.createLink(existingPath, newPath);
} catch (err: any) {
if (err.code !== 'EEXIST') {
await fs.copyFile(existingPath, newPath);
}
}
}
async function readJson(filePath: string): Promise<{ [key: string]: any }> {
try {
const str = await fs.readFile(filePath, 'utf8');
@@ -174,7 +321,7 @@ export async function updateFunctionsManifest({
);
const functionsManifest = await readJson(functionsManifestPath);
if (!functionsManifest.version) functionsManifest.version = 1;
if (!functionsManifest.version) functionsManifest.version = 2;
if (!functionsManifest.pages) functionsManifest.pages = {};
for (const [pageKey, pageConfig] of Object.entries(pages)) {

View File

@@ -1,6 +1,6 @@
import { join } from 'path';
import fs from 'fs-extra';
import { BuildOptions, createLambda } from '../src';
import { BuildOptions, createLambda, FileFsRef } from '../src';
import { convertRuntimeToPlugin } from '../src/convert-runtime-to-plugin';
async function fsToJson(dir: string, output: Record<string, any> = {}) {
@@ -32,9 +32,13 @@ describe('convert-runtime-to-plugin', () => {
});
it('should create correct fileystem for python', async () => {
const ext = '.py';
const workPath = pythonApiWorkpath;
const handlerName = 'vc__handler__python';
const handlerFileName = handlerName + ext;
const lambdaOptions = {
handler: 'index.handler',
handler: `${handlerName}.vc_handler`,
runtime: 'python3.9',
memory: 512,
maxDuration: 5,
@@ -42,6 +46,15 @@ describe('convert-runtime-to-plugin', () => {
};
const buildRuntime = async (opts: BuildOptions) => {
const handlerPath = join(workPath, handlerFileName);
// This is the usual time at which a Legacy Runtime writes its Lambda launcher.
await fs.writeFile(handlerPath, '# handler');
opts.files[handlerFileName] = new FileFsRef({
fsPath: handlerPath,
});
const lambda = await createLambda({
files: opts.files,
...lambdaOptions,
@@ -49,10 +62,6 @@ describe('convert-runtime-to-plugin', () => {
return { output: lambda };
};
const lambdaFiles = await fsToJson(workPath);
delete lambdaFiles['vercel.json'];
const ext = '.py';
const packageName = 'vercel-plugin-python';
const build = await convertRuntimeToPlugin(buildRuntime, packageName, ext);
@@ -62,18 +71,15 @@ describe('convert-runtime-to-plugin', () => {
expect(output).toMatchObject({
'functions-manifest.json': expect.stringContaining('{'),
inputs: {
'api-routes-python': lambdaFiles,
},
server: {
pages: {
api: {
'index.py': expect.stringContaining('index'),
'index.py': expect.stringContaining('handler'),
'index.py.nft.json': expect.stringContaining('{'),
users: {
'get.py': expect.stringContaining('get'),
'get.py': expect.stringContaining('handler'),
'get.py.nft.json': expect.stringContaining('{'),
'post.py': expect.stringContaining('post'),
'post.py': expect.stringContaining('handler'),
'post.py.nft.json': expect.stringContaining('{'),
},
},
@@ -83,50 +89,30 @@ describe('convert-runtime-to-plugin', () => {
const funcManifest = JSON.parse(output['functions-manifest.json']);
expect(funcManifest).toMatchObject({
version: 1,
version: 2,
pages: {
'api/index.py': lambdaOptions,
'api/users/get.py': lambdaOptions,
'api/users/post.py': { ...lambdaOptions, memory: 512 },
'api/index.py': { ...lambdaOptions, handler: 'index.vc_handler' },
'api/users/get.py': { ...lambdaOptions, handler: 'get.vc_handler' },
'api/users/post.py': {
...lambdaOptions,
handler: 'post.vc_handler',
memory: 512,
},
},
});
const indexJson = JSON.parse(output.server.pages.api['index.py.nft.json']);
expect(indexJson).toMatchObject({
version: 1,
version: 2,
files: [
{
input: `../../../../inputs/api-routes-python/api/db/[id].py`,
output: 'api/db/[id].py',
},
{
input: `../../../../inputs/api-routes-python/api/index.py`,
output: 'api/index.py',
},
{
input: `../../../../inputs/api-routes-python/api/project/[aid]/[bid]/index.py`,
output: 'api/project/[aid]/[bid]/index.py',
},
{
input: `../../../../inputs/api-routes-python/api/users/get.py`,
output: 'api/users/get.py',
},
{
input: `../../../../inputs/api-routes-python/api/users/post.py`,
output: 'api/users/post.py',
},
{
input: `../../../../inputs/api-routes-python/file.txt`,
output: 'file.txt',
},
{
input: `../../../../inputs/api-routes-python/util/date.py`,
output: 'util/date.py',
},
{
input: `../../../../inputs/api-routes-python/util/math.py`,
output: 'util/math.py',
},
'../../../../api/db/[id].py',
'../../../../api/index.py',
'../../../../api/project/[aid]/[bid]/index.py',
'../../../../api/users/get.py',
'../../../../api/users/post.py',
'../../../../file.txt',
'../../../../util/date.py',
'../../../../util/math.py',
],
});
@@ -134,40 +120,16 @@ describe('convert-runtime-to-plugin', () => {
output.server.pages.api.users['get.py.nft.json']
);
expect(getJson).toMatchObject({
version: 1,
version: 2,
files: [
{
input: `../../../../../inputs/api-routes-python/api/db/[id].py`,
output: 'api/db/[id].py',
},
{
input: `../../../../../inputs/api-routes-python/api/index.py`,
output: 'api/index.py',
},
{
input: `../../../../../inputs/api-routes-python/api/project/[aid]/[bid]/index.py`,
output: 'api/project/[aid]/[bid]/index.py',
},
{
input: `../../../../../inputs/api-routes-python/api/users/get.py`,
output: 'api/users/get.py',
},
{
input: `../../../../../inputs/api-routes-python/api/users/post.py`,
output: 'api/users/post.py',
},
{
input: `../../../../../inputs/api-routes-python/file.txt`,
output: 'file.txt',
},
{
input: `../../../../../inputs/api-routes-python/util/date.py`,
output: 'util/date.py',
},
{
input: `../../../../../inputs/api-routes-python/util/math.py`,
output: 'util/math.py',
},
'../../../../../api/db/[id].py',
'../../../../../api/index.py',
'../../../../../api/project/[aid]/[bid]/index.py',
'../../../../../api/users/get.py',
'../../../../../api/users/post.py',
'../../../../../file.txt',
'../../../../../util/date.py',
'../../../../../util/math.py',
],
});
@@ -175,40 +137,16 @@ describe('convert-runtime-to-plugin', () => {
output.server.pages.api.users['post.py.nft.json']
);
expect(postJson).toMatchObject({
version: 1,
version: 2,
files: [
{
input: `../../../../../inputs/api-routes-python/api/db/[id].py`,
output: 'api/db/[id].py',
},
{
input: `../../../../../inputs/api-routes-python/api/index.py`,
output: 'api/index.py',
},
{
input: `../../../../../inputs/api-routes-python/api/project/[aid]/[bid]/index.py`,
output: 'api/project/[aid]/[bid]/index.py',
},
{
input: `../../../../../inputs/api-routes-python/api/users/get.py`,
output: 'api/users/get.py',
},
{
input: `../../../../../inputs/api-routes-python/api/users/post.py`,
output: 'api/users/post.py',
},
{
input: `../../../../../inputs/api-routes-python/file.txt`,
output: 'file.txt',
},
{
input: `../../../../../inputs/api-routes-python/util/date.py`,
output: 'util/date.py',
},
{
input: `../../../../../inputs/api-routes-python/util/math.py`,
output: 'util/math.py',
},
'../../../../../api/db/[id].py',
'../../../../../api/index.py',
'../../../../../api/project/[aid]/[bid]/index.py',
'../../../../../api/users/get.py',
'../../../../../api/users/post.py',
'../../../../../file.txt',
'../../../../../util/date.py',
'../../../../../util/math.py',
],
});

View File

@@ -34,7 +34,7 @@ Finally, [connect your Git repository to Vercel](https://vercel.com/docs/git) an
## Documentation
For details on how to use Vercel CLI, check out our [documentation](https://vercel.com/docs).
For details on how to use Vercel CLI, check out our [documentation](https://vercel.com/docs/cli).
## Local Development

View File

@@ -1,6 +1,6 @@
{
"name": "vercel",
"version": "23.1.3-canary.47",
"version": "23.1.3-canary.67",
"preferGlobal": true,
"license": "Apache-2.0",
"description": "The command-line interface for Vercel",
@@ -43,14 +43,14 @@
"node": ">= 12"
},
"dependencies": {
"@vercel/build-utils": "2.12.3-canary.28",
"@vercel/build-utils": "2.12.3-canary.42",
"@vercel/go": "1.2.4-canary.4",
"@vercel/node": "1.12.2-canary.7",
"@vercel/python": "2.1.2-canary.1",
"@vercel/ruby": "1.2.8-canary.6",
"@vercel/python": "2.1.2-canary.2",
"@vercel/ruby": "1.2.10-canary.0",
"update-notifier": "4.1.0",
"vercel-plugin-middleware": "0.0.0-canary.7",
"vercel-plugin-node": "1.12.2-canary.19"
"vercel-plugin-middleware": "0.0.0-canary.19",
"vercel-plugin-node": "1.12.2-canary.34"
},
"devDependencies": {
"@next/env": "11.1.2",
@@ -90,7 +90,7 @@
"@types/update-notifier": "5.1.0",
"@types/which": "1.3.2",
"@types/write-json-file": "2.2.1",
"@vercel/frameworks": "0.5.1-canary.16",
"@vercel/frameworks": "0.5.1-canary.17",
"@vercel/ncc": "0.24.0",
"@vercel/nft": "0.17.0",
"@zeit/fun": "0.11.2",

View File

@@ -5,16 +5,16 @@ import {
GlobOptions,
scanParentDirs,
spawnAsync,
glob as buildUtilsGlob,
} from '@vercel/build-utils';
import { nodeFileTrace } from '@vercel/nft';
import Sema from 'async-sema';
import chalk from 'chalk';
import { SpawnOptions } from 'child_process';
import { assert } from 'console';
import { createHash } from 'crypto';
import fs from 'fs-extra';
import ogGlob from 'glob';
import { dirname, isAbsolute, join, parse, relative, resolve } from 'path';
import { dirname, isAbsolute, join, parse, relative } from 'path';
import pluralize from 'pluralize';
import Client from '../util/client';
import { VercelConfig } from '../util/dev/types';
@@ -353,13 +353,19 @@ export default async function main(client: Client) {
}
// We cannot rely on the `framework` alone, as it might be a static export,
// and the current build might use a differnt project that's not in the settings.
// and the current build might use a different project that's not in the settings.
const isNextOutput = Boolean(dotNextDir);
const outputDir = isNextOutput ? OUTPUT_DIR : join(OUTPUT_DIR, 'static');
const nextExport = await getNextExportStatus(dotNextDir);
const outputDir =
isNextOutput && !nextExport ? OUTPUT_DIR : join(OUTPUT_DIR, 'static');
const getDistDir = framework.getFsOutputDir || framework.getOutputDirName;
const distDir =
(nextExport?.exportDetail.outDirectory
? relative(cwd, nextExport.exportDetail.outDirectory)
: false) ||
dotNextDir ||
userOutputDirectory ||
(await framework.getFsOutputDir(cwd));
(await getDistDir(cwd));
await fs.ensureDir(join(cwd, outputDir));
@@ -443,7 +449,53 @@ export default async function main(client: Client) {
}
// Special Next.js processing.
if (isNextOutput) {
if (nextExport) {
client.output.debug('Found `next export` output.');
const htmlFiles = await buildUtilsGlob(
'**/*.html',
join(cwd, OUTPUT_DIR, 'static')
);
if (nextExport.exportDetail.success !== true) {
client.output.error(
`Export of Next.js app failed. Please check your build logs.`
);
process.exit(1);
}
await fs.mkdirp(join(cwd, OUTPUT_DIR, 'server', 'pages'));
await fs.mkdirp(join(cwd, OUTPUT_DIR, 'static'));
await Promise.all(
Object.keys(htmlFiles).map(async fileName => {
await sema.acquire();
const input = join(cwd, OUTPUT_DIR, 'static', fileName);
const target = join(cwd, OUTPUT_DIR, 'server', 'pages', fileName);
await fs.mkdirp(dirname(target));
await fs.promises.rename(input, target).finally(() => {
sema.release();
});
})
);
for (const file of [
'BUILD_ID',
'images-manifest.json',
'routes-manifest.json',
'build-manifest.json',
]) {
const input = join(nextExport.dotNextDir, file);
if (fs.existsSync(input)) {
// Do not use `smartCopy`, since we want to overwrite if they already exist.
await fs.copyFile(input, join(OUTPUT_DIR, file));
}
}
} else if (isNextOutput) {
// The contents of `.output/static` should be placed inside of `.output/static/_next/static`
const tempStatic = '___static';
await fs.rename(
@@ -584,42 +636,31 @@ export default async function main(client: Client) {
],
});
fileList.delete(relative(cwd, f));
await resolveNftToOutput({
client,
baseDir,
outputDir: OUTPUT_DIR,
nftFileName: f.replace(ext, '.js.nft.json'),
distDir,
nft: {
version: 1,
const nftFileName = f.replace(ext, '.js.nft.json');
client.output.debug(`Creating ${nftFileName}`);
await fs.writeJSON(nftFileName, {
version: 2,
files: Array.from(fileList).map(fileListEntry =>
relative(dir, fileListEntry)
),
},
});
}
} else {
for (let f of nftFiles) {
const json = await fs.readJson(f);
await resolveNftToOutput({
client,
baseDir,
outputDir: OUTPUT_DIR,
nftFileName: f,
nft: json,
distDir,
});
}
}
client.output.debug(`Resolve ${param('required-server-files.json')}.`);
const requiredServerFilesPath = join(
OUTPUT_DIR,
'required-server-files.json'
);
if (fs.existsSync(requiredServerFilesPath)) {
client.output.debug(`Resolve ${param('required-server-files.json')}.`);
const requiredServerFilesJson = await fs.readJSON(
requiredServerFilesPath
);
await fs.writeJSON(requiredServerFilesPath, {
...requiredServerFilesJson,
appDir: '.',
@@ -627,19 +668,12 @@ export default async function main(client: Client) {
const originalPath = join(requiredServerFilesJson.appDir, i);
const relPath = join(OUTPUT_DIR, relative(distDir, originalPath));
const absolutePath = join(cwd, relPath);
const output = relative(baseDir, absolutePath);
return relPath === output
? relPath
: {
input: relPath,
output,
};
return relPath;
}),
});
}
}
}
// Build Plugins
if (plugins?.buildPlugins && plugins.buildPlugins.length > 0) {
@@ -792,83 +826,51 @@ async function glob(pattern: string, options: GlobOptions): Promise<string[]> {
}
/**
* Computes a hash for the given buf.
*
* @param {Buffer} file data
* @return {String} hex digest
* Files will only exist when `next export` was used.
*/
function hash(buf: Buffer): string {
return createHash('sha1').update(buf).digest('hex');
}
interface NftFile {
version: number;
files: (string | { input: string; output: string })[];
}
// resolveNftToOutput takes nft file and moves all of its trace files
// into the specified directory + `inputs`, (renaming them to their hash + ext) and
// subsequently updating the original nft file accordingly. This is done
// to make the `.output` directory be self-contained, so that it works
// properly with `vc --prebuilt`.
async function resolveNftToOutput({
client,
baseDir,
outputDir,
nftFileName,
distDir,
nft,
}: {
client: Client;
baseDir: string;
outputDir: string;
nftFileName: string;
distDir: string;
nft: NftFile;
}) {
client.output.debug(`Processing and resolving ${nftFileName}`);
await fs.ensureDir(join(outputDir, 'inputs'));
const newFilesList: NftFile['files'] = [];
// If `distDir` is a subdirectory, then the input has to be resolved to where the `.output` directory will be.
const relNftFileName = relative(outputDir, nftFileName);
const origNftFilename = join(distDir, relNftFileName);
if (relNftFileName.startsWith('cache/')) {
// No need to process the `cache/` directory.
// Paths in it might also not be relative to `cache` itself.
return;
async function getNextExportStatus(dotNextDir: string | null) {
if (!dotNextDir) {
return null;
}
for (let fileEntity of nft.files) {
const relativeInput =
typeof fileEntity === 'string' ? fileEntity : fileEntity.input;
const fullInput = resolve(join(parse(origNftFilename).dir, relativeInput));
const exportDetail: {
success: boolean;
outDirectory: string;
} | null = await fs
.readJson(join(dotNextDir, 'export-detail.json'))
.catch(error => {
if (error.code === 'ENOENT') {
return null;
}
// if the resolved path is NOT in the .output directory we move in it there
if (!fullInput.includes(distDir)) {
const { ext } = parse(fullInput);
const raw = await fs.readFile(fullInput);
const newFilePath = join(outputDir, 'inputs', hash(raw) + ext);
smartCopy(client, fullInput, newFilePath);
// We have to use `baseDir` instead of `cwd`, because we want to
// mount everything from there (especially `node_modules`).
// This is important for NPM Workspaces where `node_modules` is not
// in the directory of the workspace.
const output = relative(baseDir, fullInput).replace('.output', '.next');
newFilesList.push({
input: relative(parse(nftFileName).dir, newFilePath),
output,
throw error;
});
} else {
newFilesList.push(relativeInput);
if (!exportDetail) {
return null;
}
const exportMarker: {
version: 1;
exportTrailingSlash: boolean;
hasExportPathMap: boolean;
} | null = await fs
.readJSON(join(dotNextDir, 'export-marker.json'))
.catch(error => {
if (error.code === 'ENOENT') {
return null;
}
// Update the .nft.json with new input and output mapping
await fs.writeJSON(nftFileName, {
...nft,
files: newFilesList,
throw error;
});
return {
dotNextDir,
exportDetail,
exportMarker: {
trailingSlash: exportMarker?.hasExportPathMap
? exportMarker.exportTrailingSlash
: false,
},
};
}

View File

@@ -447,6 +447,7 @@ export default async (client: Client) => {
forceNew: argv['--force'],
withCache: argv['--with-cache'],
prebuilt: argv['--prebuilt'],
rootDirectory,
quiet,
wantsPublic: argv['--public'] || localConfig.public,
isFile,

View File

@@ -52,6 +52,7 @@ export default async function processDeployment({
isSettingUpProject: boolean;
skipAutoDetectionConfirmation?: boolean;
cwd?: string;
rootDirectory?: string;
}) {
let {
now,
@@ -64,6 +65,7 @@ export default async function processDeployment({
nowConfig,
quiet,
prebuilt,
rootDirectory,
} = args;
const { debug } = output;
@@ -86,6 +88,7 @@ export default async function processDeployment({
force,
withCache,
prebuilt,
rootDirectory,
skipAutoDetectionConfirmation,
};

View File

@@ -968,7 +968,7 @@ export default class DevServer {
socket.destroy();
return;
}
const target = `http://localhost:${this.devProcessPort}`;
const target = `http://127.0.0.1:${this.devProcessPort}`;
this.output.debug(`Detected "upgrade" event, proxying to ${target}`);
this.proxy.ws(req, socket, head, { target });
});
@@ -1663,7 +1663,7 @@ export default class DevServer {
if (!match) {
// If the dev command is started, then proxy to it
if (this.devProcessPort) {
const upstream = `http://localhost:${this.devProcessPort}`;
const upstream = `http://127.0.0.1:${this.devProcessPort}`;
debug(`Proxying to frontend dev server: ${upstream}`);
// Add the Vercel platform proxy request headers
@@ -1810,7 +1810,7 @@ export default class DevServer {
return proxyPass(
req,
res,
`http://localhost:${port}`,
`http://127.0.0.1:${port}`,
this,
requestId,
false
@@ -1847,7 +1847,7 @@ export default class DevServer {
return proxyPass(
req,
res,
`http://localhost:${this.devProcessPort}`,
`http://127.0.0.1:${this.devProcessPort}`,
this,
requestId,
false

View File

@@ -37,6 +37,7 @@ export interface CreateOptions {
project?: string;
wantsPublic: boolean;
prebuilt?: boolean;
rootDirectory?: string;
meta: Dictionary<string>;
regions?: string[];
quiet?: boolean;
@@ -113,6 +114,7 @@ export default class Now extends EventEmitter {
name,
project,
prebuilt = false,
rootDirectory,
wantsPublic,
meta,
regions,
@@ -168,6 +170,7 @@ export default class Now extends EventEmitter {
skipAutoDetectionConfirmation,
cwd,
prebuilt,
rootDirectory,
});
if (deployment && deployment.warnings) {

View File

@@ -0,0 +1,6 @@
export default function (req) {
const isStrict = (function () {
return !this;
})();
return new Response('is strict mode? ' + (isStrict ? 'yes' : 'no'));
}

View File

@@ -385,4 +385,13 @@ describe('DevServer', () => {
);
})
);
it(
'should run middleware in strict mode',
testFixture('edge-middleware-strict', async server => {
const response = await fetch(`${server.address}/index.html`);
const body = await response.text();
expect(body).toStrictEqual('is strict mode? yes');
})
);
});

View File

@@ -6,3 +6,5 @@ node_modules
!tests/fixtures/nowignore/node_modules
!tests/fixtures/vercelignore-allow-nodemodules/node_modules
!tests/fixtures/vercelignore-allow-nodemodules/sub/node_modules
!tests/fixtures/file-system-api/.output
!tests/fixtures/file-system-api-root-directory/**/.output

View File

@@ -1,6 +1,6 @@
{
"name": "@vercel/client",
"version": "10.2.3-canary.29",
"version": "10.2.3-canary.45",
"main": "dist/index.js",
"typings": "dist/index.d.ts",
"homepage": "https://vercel.com",
@@ -40,7 +40,7 @@
]
},
"dependencies": {
"@vercel/build-utils": "2.12.3-canary.28",
"@vercel/build-utils": "2.12.3-canary.42",
"@zeit/fetch": "5.2.0",
"async-retry": "1.2.3",
"async-sema": "3.0.0",

View File

@@ -1,12 +1,12 @@
import { lstatSync } from 'fs-extra';
import { relative, isAbsolute } from 'path';
import hashes, { mapToObject } from './utils/hashes';
import { hashes, mapToObject, resolveNftJsonFiles } from './utils/hashes';
import { upload } from './upload';
import { buildFileTree, createDebug, parseVercelConfig } from './utils';
import { DeploymentError } from './errors';
import {
NowConfig,
VercelConfig,
VercelClientOptions,
DeploymentOptions,
DeploymentEventType,
@@ -16,7 +16,7 @@ export default function buildCreateDeployment() {
return async function* createDeployment(
clientOptions: VercelClientOptions,
deploymentOptions: DeploymentOptions = {},
nowConfig: NowConfig = {}
nowConfig: VercelConfig = {}
): AsyncIterableIterator<{ type: DeploymentEventType; payload: any }> {
const { path } = clientOptions;
@@ -74,12 +74,7 @@ export default function buildCreateDeployment() {
debug(`Provided 'path' is a single file`);
}
let { fileList } = await buildFileTree(
path,
clientOptions.isDirectory,
debug,
clientOptions.prebuilt
);
let { fileList } = await buildFileTree(path, clientOptions, debug);
let configPath: string | undefined;
if (!nowConfig) {
@@ -114,7 +109,11 @@ export default function buildCreateDeployment() {
};
}
const files = await hashes(fileList);
const hashedFileMap = await hashes(fileList);
const nftFileList = clientOptions.prebuilt
? await resolveNftJsonFiles(hashedFileMap)
: [];
const files = await hashes(nftFileList, hashedFileMap);
debug(`Yielding a 'hashes-calculated' event with ${files.size} hashes`);
yield { type: 'hashes-calculated', payload: mapToObject(files) };

View File

@@ -15,6 +15,7 @@ export interface VercelClientOptions {
apiUrl?: string;
force?: boolean;
prebuilt?: boolean;
rootDirectory?: string;
withCache?: boolean;
userAgent?: string;
defaultName?: string;

View File

@@ -1,6 +1,7 @@
import { createHash } from 'crypto';
import fs from 'fs-extra';
import { Sema } from 'async-sema';
import { join, dirname } from 'path';
export interface DeploymentFile {
names: string[];
@@ -15,9 +16,7 @@ export interface DeploymentFile {
* @return {String} hex digest
*/
function hash(buf: Buffer): string {
return createHash('sha1')
.update(buf)
.digest('hex');
return createHash('sha1').update(buf).digest('hex');
}
/**
@@ -39,16 +38,18 @@ export const mapToObject = (
/**
* Computes hashes for the contents of each file given.
*
* @param {Array} of {String} full paths
* @return {Map}
* @param files - absolute file paths
* @param map - optional map of files to append
* @return Map of hash digest to file object
*/
async function hashes(files: string[]): Promise<Map<string, DeploymentFile>> {
const map = new Map<string, DeploymentFile>();
export async function hashes(
files: string[],
map = new Map<string, DeploymentFile>()
): Promise<Map<string, DeploymentFile>> {
const semaphore = new Sema(100);
await Promise.all(
files.map(
async (name: string): Promise<void> => {
files.map(async (name: string): Promise<void> => {
await semaphore.acquire();
const data = await fs.readFile(name);
const { mode } = await fs.stat(name);
@@ -57,16 +58,48 @@ async function hashes(files: string[]): Promise<Map<string, DeploymentFile>> {
const entry = map.get(h);
if (entry) {
if (entry.names[0] !== name) {
entry.names.push(name);
}
} else {
map.set(h, { names: [name], data, mode });
}
semaphore.release();
}
)
})
);
return map;
}
export default hashes;
export async function resolveNftJsonFiles(
hashedFiles: Map<string, DeploymentFile>
): Promise<string[]> {
const semaphore = new Sema(100);
const existingFiles = Array.from(hashedFiles.values());
const resolvedFiles = new Set<string>();
await Promise.all(
existingFiles.map(async file => {
await semaphore.acquire();
const fsPath = file.names[0];
if (fsPath.endsWith('.nft.json')) {
const json = file.data.toString('utf8');
const { version, files } = JSON.parse(json) as {
version: number;
files: string[] | { input: string; output: string }[];
};
if (version === 1 || version === 2) {
for (let f of files) {
const relPath = typeof f === 'string' ? f : f.input;
resolvedFiles.add(join(dirname(fsPath), relPath));
}
} else {
console.error(`Invalid nft.json version: ${version}`);
}
}
semaphore.release();
})
);
return Array.from(resolvedFiles);
}
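For context, `resolveNftJsonFiles` only inspects files whose path ends in `.nft.json`, parses the trace, and resolves every referenced path relative to the trace file's own directory. A minimal worked example with a hypothetical trace file (illustrative only, not part of this diff):
// Hypothetical trace: /work/.output/server/pages/index.nft.json containing
//   { "version": 1, "files": ["../chunks/shared.js", "../../static/logo.png"] }
// Each entry is resolved against the trace file's directory:
//   join(dirname(fsPath), '../chunks/shared.js')   -> /work/.output/server/chunks/shared.js
//   join(dirname(fsPath), '../../static/logo.png') -> /work/.output/static/logo.png
// For version 2 traces the entries are { input, output } objects and the `input` path is used.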

View File

@@ -1,7 +1,7 @@
import { DeploymentFile } from './hashes';
import { FetchOptions } from '@zeit/fetch';
import { nodeFetch, zeitFetch } from './fetch';
import { join, sep, relative } from 'path';
import { join, sep, relative, posix } from 'path';
import { URL } from 'url';
import ignore from 'ignore';
type Ignore = ReturnType<typeof ignore>;
@@ -81,13 +81,16 @@ const maybeRead = async function <T>(path: string, default_: T) {
export async function buildFileTree(
path: string | string[],
isDirectory: boolean,
debug: Debug,
prebuilt?: boolean
{
isDirectory,
prebuilt,
rootDirectory,
}: Pick<VercelClientOptions, 'isDirectory' | 'prebuilt' | 'rootDirectory'>,
debug: Debug
): Promise<{ fileList: string[]; ignoreList: string[] }> {
const ignoreList: string[] = [];
let fileList: string[];
let { ig, ignores } = await getVercelIgnore(path, prebuilt);
let { ig, ignores } = await getVercelIgnore(path, prebuilt, rootDirectory);
debug(`Found ${ignores.length} rules in .vercelignore`);
debug('Building file tree...');
@@ -119,11 +122,23 @@ export async function buildFileTree(
export async function getVercelIgnore(
cwd: string | string[],
prebuilt?: boolean
prebuilt?: boolean,
rootDirectory?: string
): Promise<{ ig: Ignore; ignores: string[] }> {
const ignores: string[] = prebuilt
? ['*', '!.output', '!.output/**']
: [
let ignores: string[] = [];
const outputDir = posix.join(rootDirectory || '', '.output');
if (prebuilt) {
ignores.push('*');
const parts = outputDir.split('/');
parts.forEach((_, i) => {
const level = parts.slice(0, i + 1).join('/');
ignores.push(`!${level}`);
});
ignores.push(`!${outputDir}/**`);
} else {
ignores = [
'.hg',
'.git',
'.gitmodules',
@@ -148,8 +163,9 @@ export async function getVercelIgnore(
'__pycache__',
'venv',
'CVS',
'.output',
`.output`,
];
}
const cwds = Array.isArray(cwd) ? cwd : [cwd];
const files = await Promise.all(
@@ -250,12 +266,8 @@ export const prepareFiles = (
files: Map<string, DeploymentFile>,
clientOptions: VercelClientOptions
): PreparedFile[] => {
const preparedFiles = [...files.keys()].reduce(
(acc: PreparedFile[], sha: string): PreparedFile[] => {
const next = [...acc];
const file = files.get(sha) as DeploymentFile;
const preparedFiles: PreparedFile[] = [];
for (const [sha, file] of files) {
for (const name of file.names) {
let fileName: string;
@@ -271,18 +283,14 @@ export const prepareFiles = (
fileName = segments[segments.length - 1];
}
next.push({
preparedFiles.push({
file: isWin ? fileName.replace(/\\/g, '/') : fileName,
size: file.data.byteLength || file.data.length,
mode: file.mode,
sha,
});
}
return next;
},
[]
);
}
return preparedFiles;
};
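As a worked illustration of the `getVercelIgnore` change above (not taken from the diff itself): with `prebuilt: true`, the rules ignore everything and then re-allow each path segment leading down to the output directory, plus its contents.
// With a hypothetical rootDirectory of 'root':
//   outputDir = posix.join('root', '.output') === 'root/.output'
//   ignores => ['*', '!root', '!root/.output', '!root/.output/**']
// With no rootDirectory, outputDir === '.output' and the result matches the old hard-coded list:
//   ignores => ['*', '!.output', '!.output/**']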

View File

@@ -0,0 +1 @@
foo

View File

@@ -0,0 +1 @@
bar

View File

@@ -0,0 +1 @@
bar

View File

@@ -0,0 +1 @@
baz

View File

@@ -0,0 +1 @@
qux

View File

@@ -0,0 +1 @@
foo

View File

@@ -0,0 +1 @@
bar

View File

@@ -0,0 +1,4 @@
{
"extends": "../tsconfig.json",
"include": ["*.test.ts"]
}

View File

@@ -17,7 +17,11 @@ const toAbsolutePaths = (cwd: string, files: string[]) =>
describe('buildFileTree()', () => {
it('should exclude files using `.nowignore` blocklist', async () => {
const cwd = fixture('nowignore');
const { fileList, ignoreList } = await buildFileTree(cwd, true, noop);
const { fileList, ignoreList } = await buildFileTree(
cwd,
{ isDirectory: true },
noop
);
const expectedFileList = toAbsolutePaths(cwd, ['.nowignore', 'index.txt']);
expect(normalizeWindowsPaths(expectedFileList).sort()).toEqual(
@@ -36,7 +40,11 @@ describe('buildFileTree()', () => {
it('should include the node_modules using `.vercelignore` allowlist', async () => {
const cwd = fixture('vercelignore-allow-nodemodules');
const { fileList, ignoreList } = await buildFileTree(cwd, true, noop);
const { fileList, ignoreList } = await buildFileTree(
cwd,
{ isDirectory: true },
noop
);
const expected = toAbsolutePaths(cwd, [
'node_modules/one.txt',
@@ -54,4 +62,90 @@ describe('buildFileTree()', () => {
normalizeWindowsPaths(ignoreList).sort()
);
});
it('should find root files but ignore .output files when prebuilt=false', async () => {
const cwd = fixture('file-system-api');
const { fileList, ignoreList } = await buildFileTree(
cwd,
{ isDirectory: true, prebuilt: false },
noop
);
const expectedFileList = toAbsolutePaths(cwd, ['foo.txt', 'sub/bar.txt']);
expect(normalizeWindowsPaths(expectedFileList).sort()).toEqual(
normalizeWindowsPaths(fileList).sort()
);
const expectedIgnoreList = ['.output'];
expect(normalizeWindowsPaths(expectedIgnoreList).sort()).toEqual(
normalizeWindowsPaths(ignoreList).sort()
);
});
it('should find .output files but ignore other files when prebuilt=true', async () => {
const cwd = fixture('file-system-api');
const { fileList, ignoreList } = await buildFileTree(
cwd,
{ isDirectory: true, prebuilt: true },
noop
);
const expectedFileList = toAbsolutePaths(cwd, [
'.output/baz.txt',
'.output/sub/qux.txt',
]);
expect(normalizeWindowsPaths(expectedFileList).sort()).toEqual(
normalizeWindowsPaths(fileList).sort()
);
const expectedIgnoreList = ['foo.txt', 'sub'];
expect(normalizeWindowsPaths(expectedIgnoreList).sort()).toEqual(
normalizeWindowsPaths(ignoreList).sort()
);
});
it('should find root files but ignore all .output files when prebuilt=false and rootDirectory=root', async () => {
const cwd = fixture('file-system-api-root-directory');
const { fileList, ignoreList } = await buildFileTree(
cwd,
{ isDirectory: true, prebuilt: false, rootDirectory: 'root' },
noop
);
const expectedFileList = toAbsolutePaths(cwd, [
'foo.txt',
'root/bar.txt',
'someother/bar.txt',
]);
expect(normalizeWindowsPaths(expectedFileList).sort()).toEqual(
normalizeWindowsPaths(fileList).sort()
);
const expectedIgnoreList = ['root/.output', 'someother/.output'];
expect(normalizeWindowsPaths(expectedIgnoreList).sort()).toEqual(
normalizeWindowsPaths(ignoreList).sort()
);
});
it('should find root/.output files but ignore other files when prebuilt=true and rootDirectory=root', async () => {
const cwd = fixture('file-system-api-root-directory');
const { fileList, ignoreList } = await buildFileTree(
cwd,
{ isDirectory: true, prebuilt: true, rootDirectory: 'root' },
noop
);
const expectedFileList = toAbsolutePaths(cwd, [
'root/.output/baz.txt',
'root/.output/sub/qux.txt',
]);
expect(normalizeWindowsPaths(expectedFileList).sort()).toEqual(
normalizeWindowsPaths(fileList).sort()
);
const expectedIgnoreList = ['foo.txt', 'root/bar.txt', 'someother'];
expect(normalizeWindowsPaths(expectedIgnoreList).sort()).toEqual(
normalizeWindowsPaths(ignoreList).sort()
);
});
});

View File

@@ -1,6 +1,6 @@
{
"name": "@vercel/frameworks",
"version": "0.5.1-canary.16",
"version": "0.5.1-canary.17",
"main": "./dist/frameworks.js",
"types": "./dist/frameworks.d.ts",
"files": [

View File

@@ -141,7 +141,6 @@ export const frameworks = [
},
dependency: 'gatsby',
getOutputDirName: async () => 'public',
getFsOutputDir: async () => 'public',
defaultRoutes: async (dirPrefix: string) => {
// This file could be generated by gatsby-plugin-now or gatsby-plugin-zeit-now
try {
@@ -226,7 +225,6 @@ export const frameworks = [
},
},
dependency: 'remix',
getFsOutputDir: async () => 'public',
getOutputDirName: async () => 'public',
defaultRoutes: [
{
@@ -254,10 +252,13 @@ export const frameworks = [
source: '/build/(.*)',
regex: '/build/(.*)',
headers: [
{ key: 'cache-control', value: 'public, max-age=31536000, immutable' },
{
key: 'cache-control',
value: 'public, max-age=31536000, immutable',
},
],
},
]
],
},
{
name: 'Hexo',
@@ -294,7 +295,6 @@ export const frameworks = [
},
},
dependency: 'hexo',
getFsOutputDir: async () => 'public',
getOutputDirName: async () => 'public',
},
{
@@ -332,7 +332,6 @@ export const frameworks = [
},
},
dependency: '@11ty/eleventy',
getFsOutputDir: async () => '_site',
getOutputDirName: async () => '_site',
cachePattern: '.cache/**',
},
@@ -372,22 +371,6 @@ export const frameworks = [
},
},
dependency: '@docusaurus/core',
getFsOutputDir: async (dirPrefix: string) => {
const base = 'build';
try {
const location = join(dirPrefix, base);
const content = await readdir(location, { withFileTypes: true });
// If there is only one file in it that is a dir we'll use it as dist dir
if (content.length === 1 && content[0].isDirectory()) {
return join(base, content[0].name);
}
} catch (error) {
console.error(`Error detecting output directory: `, error);
}
return base;
},
getOutputDirName: async (dirPrefix: string) => {
const base = 'build';
try {
@@ -527,21 +510,6 @@ export const frameworks = [
},
},
dependency: 'docusaurus',
getFsOutputDir: async (dirPrefix: string) => {
const base = 'build';
try {
const location = join(dirPrefix, base);
const content = await readdir(location, { withFileTypes: true });
// If there is only one file in it that is a dir we'll use it as dist dir
if (content.length === 1 && content[0].isDirectory()) {
return join(base, content[0].name);
}
} catch (error) {
console.error(`Error detecting output directory: `, error);
}
return base;
},
getOutputDirName: async (dirPrefix: string) => {
const base = 'build';
try {
@@ -593,7 +561,6 @@ export const frameworks = [
},
},
dependency: 'preact-cli',
getFsOutputDir: async () => 'build',
getOutputDirName: async () => 'build',
defaultRoutes: [
{
@@ -650,7 +617,6 @@ export const frameworks = [
},
},
dependency: '@dojo/cli',
getFsOutputDir: async () => 'output/dist',
getOutputDirName: async () => join('output', 'dist'),
defaultRoutes: [
{
@@ -717,7 +683,6 @@ export const frameworks = [
},
},
dependency: 'ember-cli',
getFsOutputDir: async () => 'dist',
getOutputDirName: async () => 'dist',
defaultRoutes: [
{
@@ -772,7 +737,6 @@ export const frameworks = [
},
},
dependency: '@vue/cli-service',
getFsOutputDir: async () => 'dist',
getOutputDirName: async () => 'dist',
defaultRoutes: [
{
@@ -849,7 +813,6 @@ export const frameworks = [
},
},
dependency: '@scullyio/init',
getFsOutputDir: async () => 'dist',
getOutputDirName: async () => 'dist/static',
},
{
@@ -886,7 +849,6 @@ export const frameworks = [
},
},
dependency: '@ionic/angular',
getFsOutputDir: async () => 'www',
getOutputDirName: async () => 'www',
defaultRoutes: [
{
@@ -940,7 +902,6 @@ export const frameworks = [
},
},
dependency: '@angular/cli',
getFsOutputDir: async () => 'dist',
getOutputDirName: async (dirPrefix: string) => {
const base = 'dist';
try {
@@ -1008,7 +969,6 @@ export const frameworks = [
},
},
dependency: 'polymer-cli',
getFsOutputDir: async () => 'build',
getOutputDirName: async (dirPrefix: string) => {
const base = 'build';
try {
@@ -1078,7 +1038,6 @@ export const frameworks = [
},
},
dependency: 'sirv-cli',
getFsOutputDir: async () => 'public',
getOutputDirName: async () => 'public',
defaultRoutes: [
{
@@ -1128,10 +1087,9 @@ export const frameworks = [
placeholder: 'svelte-kit dev',
},
outputDirectory: {
placeholder: 'public',
value: 'public',
},
},
getFsOutputDir: async () => '.output',
getOutputDirName: async () => 'public',
},
{
@@ -1168,7 +1126,6 @@ export const frameworks = [
},
},
dependency: '@ionic/react',
getFsOutputDir: async () => 'build',
getOutputDirName: async () => 'build',
defaultRoutes: [
{
@@ -1276,7 +1233,6 @@ export const frameworks = [
},
},
dependency: 'react-scripts',
getFsOutputDir: async () => 'build',
getOutputDirName: async () => 'build',
defaultRoutes: [
{
@@ -1378,7 +1334,6 @@ export const frameworks = [
},
},
dependency: 'gridsome',
getFsOutputDir: async () => 'dist',
getOutputDirName: async () => 'dist',
},
{
@@ -1416,7 +1371,6 @@ export const frameworks = [
},
},
dependency: 'umi',
getFsOutputDir: async () => 'dist',
getOutputDirName: async () => 'dist',
defaultRoutes: [
{
@@ -1470,7 +1424,6 @@ export const frameworks = [
},
},
dependency: 'sapper',
getFsOutputDir: async () => '__sapper__/export',
getOutputDirName: async () => '__sapper__/export',
},
{
@@ -1508,7 +1461,6 @@ export const frameworks = [
},
},
dependency: 'saber',
getFsOutputDir: async () => 'public',
getOutputDirName: async () => 'public',
defaultRoutes: [
{
@@ -1577,7 +1529,6 @@ export const frameworks = [
},
},
dependency: '@stencil/core',
getFsOutputDir: async () => 'www',
getOutputDirName: async () => 'www',
defaultRoutes: [
{
@@ -1666,7 +1617,6 @@ export const frameworks = [
},
},
dependency: 'nuxt',
getFsOutputDir: async () => '.output',
getOutputDirName: async () => 'dist',
cachePattern: '.nuxt/**',
defaultRoutes: [
@@ -1724,7 +1674,6 @@ export const frameworks = [
placeholder: 'RedwoodJS default',
},
},
getFsOutputDir: async () => 'public',
getOutputDirName: async () => 'public',
},
{
@@ -1768,16 +1717,6 @@ export const frameworks = [
placeholder: '`public` or `publishDir` from the `config` file',
},
},
getFsOutputDir: async (dirPrefix: string): Promise<string> => {
type HugoConfig = { publishDir?: string };
const config = await readConfigFile<HugoConfig>(
['config.json', 'config.yaml', 'config.toml'].map(fileName => {
return join(dirPrefix, fileName);
})
);
return (config && config.publishDir) || 'public';
},
getOutputDirName: async (dirPrefix: string): Promise<string> => {
type HugoConfig = { publishDir?: string };
const config = await readConfigFile<HugoConfig>(
@@ -1822,13 +1761,6 @@ export const frameworks = [
placeholder: '`_site` or `destination` from `_config.yml`',
},
},
getFsOutputDir: async (dirPrefix: string): Promise<string> => {
type JekyllConfig = { destination?: string };
const config = await readConfigFile<JekyllConfig>(
join(dirPrefix, '_config.yml')
);
return (config && config.destination) || '_site';
},
getOutputDirName: async (dirPrefix: string): Promise<string> => {
type JekyllConfig = { destination?: string };
const config = await readConfigFile<JekyllConfig>(
@@ -1870,7 +1802,6 @@ export const frameworks = [
value: 'public',
},
},
getFsOutputDir: async () => 'public',
getOutputDirName: async () => 'public',
},
{
@@ -1905,7 +1836,6 @@ export const frameworks = [
value: 'build',
},
},
getFsOutputDir: async () => 'build',
getOutputDirName: async () => 'build',
cachePattern: '{vendor/bin,vendor/cache,vendor/bundle}/**',
},
@@ -1940,7 +1870,6 @@ export const frameworks = [
value: 'public',
},
},
getFsOutputDir: async () => 'public',
getOutputDirName: async () => 'public',
defaultVersion: '0.13.0',
},
@@ -1980,7 +1909,6 @@ export const frameworks = [
},
},
dependency: 'vite',
getFsOutputDir: async () => 'dist',
getOutputDirName: async () => 'dist',
},
{
@@ -2018,7 +1946,6 @@ export const frameworks = [
},
},
dependency: 'parcel',
getFsOutputDir: async () => 'dist',
getOutputDirName: async () => 'dist',
defaultRoutes: [
{

View File

@@ -162,9 +162,9 @@ export interface Framework {
dependency?: string;
/**
* Function that returns the name of the directory that the framework outputs
* its build results to. In some cases this is read from a configuration file.
* its File System API build results to, usually called `.output`.
*/
getFsOutputDir: (dirPrefix: string) => Promise<string>;
getFsOutputDir?: (dirPrefix: string) => Promise<string>;
/**
* Function that returns the name of the directory that the framework outputs
* its STATIC build results to. In some cases this is read from a configuration file.
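Because `getFsOutputDir` is now optional, call sites have to guard for frameworks that do not define it. A minimal sketch of the expected pattern, where `framework`, `dirPrefix`, and the `.output` fallback are assumptions for illustration rather than code from this diff:
// Prefer the framework-specific File System API directory, otherwise assume `.output`.
const fsOutputDir = framework.getFsOutputDir
  ? await framework.getFsOutputDir(dirPrefix)
  : '.output';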

View File

@@ -1,6 +1,6 @@
{
"name": "vercel-plugin-middleware",
"version": "0.0.0-canary.7",
"version": "0.0.0-canary.19",
"license": "MIT",
"main": "./dist/index",
"homepage": "",
@@ -30,6 +30,7 @@
"@types/node-fetch": "^2",
"@types/ua-parser-js": "0.7.36",
"@types/uuid": "8.3.1",
"@vercel/build-utils": "2.12.3-canary.42",
"@vercel/ncc": "0.24.0",
"cookie": "0.4.1",
"formdata-node": "4.3.1",

View File

@@ -1,4 +1,4 @@
import * as middleware from './_middleware';
import * as middleware from './_temp_middleware';
_ENTRIES = typeof _ENTRIES === 'undefined' ? {} : _ENTRIES;
_ENTRIES['middleware_pages/_middleware'] = {
default: async function (ev) {

View File

@@ -0,0 +1,52 @@
import path from 'path';
import * as esbuild from 'esbuild';
const processInjectFile = `
// envOverride is passed by esbuild plugin
const env = envOverride
function cwd() {
return '/'
}
function chdir(dir) {
throw new Error('process.chdir is not supported')
}
export const process = {
argv: [],
env,
chdir,
cwd,
};
`;
export function nodeProcessPolyfillPlugin({ env = {} } = {}): esbuild.Plugin {
return {
name: 'node-process-polyfill',
setup({ initialOptions, onResolve, onLoad }) {
onResolve({ filter: /_virtual-process-polyfill_\.js/ }, ({ path }) => {
return {
path,
sideEffects: false,
};
});
onLoad({ filter: /_virtual-process-polyfill_\.js/ }, () => {
const contents = `const envOverride = ${JSON.stringify(
env
)};\n${processInjectFile}`;
return {
loader: 'js',
contents,
};
});
const polyfills = [
path.resolve(__dirname, '_virtual-process-polyfill_.js'),
];
if (initialOptions.inject) {
initialOptions.inject.push(...polyfills);
} else {
initialOptions.inject = [...polyfills];
}
},
};
}
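A minimal usage sketch for the plugin above, mirroring how the middleware build below invokes it (the entry point and output path here are hypothetical):
import * as esbuild from 'esbuild';
import { nodeProcessPolyfillPlugin } from './esbuild-plugins';
await esbuild.build({
  entryPoints: ['_middleware.ts'], // hypothetical entry point
  bundle: true,
  format: 'cjs',
  outfile: '.output/server/pages/_middleware.js',
  // Inlines the current env vars so `process.env.*` reads resolve inside the sandboxed bundle.
  plugins: [nodeProcessPolyfillPlugin({ env: process.env })],
});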

View File

@@ -5,6 +5,7 @@ import { promises as fsp } from 'fs';
import { IncomingMessage, ServerResponse } from 'http';
import libGlob from 'glob';
import Proxy from 'http-proxy';
import { updateFunctionsManifest } from '@vercel/build-utils';
import { run } from './websandbox';
import type { FetchEventResult } from './websandbox/types';
@@ -16,13 +17,15 @@ import {
UrlWithParsedQuery,
} from 'url';
import { toNodeHeaders } from './websandbox/utils';
import { nodeProcessPolyfillPlugin } from './esbuild-plugins';
const glob = util.promisify(libGlob);
const SUPPORTED_EXTENSIONS = ['.js', '.ts'];
// File name of the `entries.js` file that gets copied into the
// project directory. Use a name that is unlikely to conflict.
const ENTRIES_NAME = '___vc_entries.js';
const TMP_ENTRIES_NAME = '.output/inputs/middleware/___vc_entries.js';
const TMP_MIDDLEWARE_BUNDLE = '.output/inputs/middleware/_temp_middleware.js';
async function getMiddlewareFile(workingDirectory: string) {
// Only the root-level `_middleware.*` files are considered.
@@ -52,17 +55,37 @@ async function getMiddlewareFile(workingDirectory: string) {
}
export async function build({ workPath }: { workPath: string }) {
const entriesPath = join(workPath, ENTRIES_NAME);
const entriesPath = join(workPath, TMP_ENTRIES_NAME);
const transientFilePath = join(workPath, TMP_MIDDLEWARE_BUNDLE);
const middlewareFile = await getMiddlewareFile(workPath);
if (!middlewareFile) return;
console.log('Compiling middleware file: %j', middlewareFile);
/**
* Two builds happen here, because esbuild doesn't offer a way to add a banner
* to individual input files, and the entries wrapper relies on running in
* non-strict mode to access the ENTRIES global.
*
* To work around this, we bundle the middleware directly and add
* 'use strict'; to make the entire bundle run in strict mode. We then bundle
* a second time, adding the global ENTRIES wrapper and preserving the
* 'use strict' for the entire scope of the original bundle.
*/
try {
await esbuild.build({
entryPoints: [middlewareFile],
bundle: true,
absWorkingDir: workPath,
outfile: transientFilePath,
banner: {
js: '"use strict";',
},
plugins: [nodeProcessPolyfillPlugin({ env: process.env })],
format: 'cjs',
});
// Create `_ENTRIES` wrapper
await fsp.copyFile(join(__dirname, 'entries.js'), entriesPath);
// Build
try {
await esbuild.build({
entryPoints: [entriesPath],
bundle: true,
@@ -70,29 +93,24 @@ export async function build({ workPath }: { workPath: string }) {
outfile: join(workPath, '.output/server/pages/_middleware.js'),
});
} finally {
await fsp.unlink(transientFilePath);
await fsp.unlink(entriesPath);
}
// Write middleware manifest
const middlewareManifest = {
version: 1,
sortedMiddleware: ['/'],
middleware: {
'/': {
const fileName = basename(middlewareFile);
const pages: { [key: string]: any } = {};
pages[fileName] = {
runtime: 'web',
env: [],
files: ['server/pages/_middleware.js'],
name: 'pages/_middleware',
page: '/',
regexp: '^/.*$',
},
},
sortingIndex: 1,
};
const middlewareManifestData = JSON.stringify(middlewareManifest, null, 2);
const middlewareManifestPath = join(
workPath,
'.output/server/middleware-manifest.json'
);
await fsp.writeFile(middlewareManifestPath, middlewareManifestData);
await updateFunctionsManifest({ workPath, pages });
}
const stringifyQuery = (req: IncomingMessage, query: ParsedUrlQuery) => {
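The two-pass build is needed because the `_ENTRIES` wrapper shown earlier (`entries.js`) assigns to an undeclared global, which only works in sloppy mode, while the bundled middleware itself should run under `'use strict'`. A tiny illustration of the constraint (not part of the diff):
// Sloppy mode: assigning to an undeclared identifier creates a global binding.
_ENTRIES = typeof _ENTRIES === 'undefined' ? {} : _ENTRIES;
// Strict mode: the same assignment throws a ReferenceError, which is why the
// 'use strict' banner is applied to the inner middleware bundle only, not to
// the outer entries wrapper.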

View File

@@ -114,6 +114,7 @@ export async function run(params: {
const content = readFileSync(params.path, 'utf-8');
const esBuildResult = esbuild.transformSync(content, {
format: 'cjs',
banner: '"use strict";',
});
const x = vm.runInNewContext(m.wrap(esBuildResult.code), cache.sandbox, {
filename: params.path,
@@ -163,6 +164,7 @@ function sandboxRequire(referrer: string, specifier: string) {
const transformOptions: esbuild.TransformOptions = {
format: 'cjs',
banner: '"use strict";',
};
if (extname(resolved) === '.json') {
transformOptions.loader = 'json';

View File

@@ -2,8 +2,8 @@
exports[`build() should build simple middleware 1`] = `
Object {
"middleware": Object {
"/": Object {
"pages": Object {
"_middleware.js": Object {
"env": Array [],
"files": Array [
"server/pages/_middleware.js",
@@ -11,11 +11,10 @@ Object {
"name": "pages/_middleware",
"page": "/",
"regexp": "^/.*$",
"runtime": "web",
"sortingIndex": 1,
},
},
"sortedMiddleware": Array [
"/",
],
"version": 1,
"version": 2,
}
`;

View File

@@ -3,6 +3,30 @@ import { promises as fsp } from 'fs';
import { build } from '../src';
import { Response } from 'node-fetch';
const setupFixture = async (fixture: string) => {
const fixturePath = join(__dirname, `fixtures/${fixture}`);
await build({
workPath: fixturePath,
});
const functionsManifest = JSON.parse(
await fsp.readFile(
join(fixturePath, '.output/functions-manifest.json'),
'utf8'
)
);
const outputFile = join(fixturePath, '.output/server/pages/_middleware.js');
expect(await fsp.stat(outputFile)).toBeTruthy();
require(outputFile);
//@ts-ignore
const middleware = global._ENTRIES['middleware_pages/_middleware'].default;
return {
middleware,
functionsManifest,
};
};
describe('build()', () => {
beforeEach(() => {
//@ts-ignore
@@ -15,25 +39,9 @@ describe('build()', () => {
delete global._ENTRIES;
});
it('should build simple middleware', async () => {
const fixture = join(__dirname, 'fixtures/simple');
await build({
workPath: fixture,
});
const { functionsManifest, middleware } = await setupFixture('simple');
const middlewareManifest = JSON.parse(
await fsp.readFile(
join(fixture, '.output/server/middleware-manifest.json'),
'utf8'
)
);
expect(middlewareManifest).toMatchSnapshot();
const outputFile = join(fixture, '.output/server/pages/_middleware.js');
expect(await fsp.stat(outputFile)).toBeTruthy();
require(outputFile);
//@ts-ignore
const middleware = global._ENTRIES['middleware_pages/_middleware'].default;
expect(functionsManifest).toMatchSnapshot();
expect(typeof middleware).toStrictEqual('function');
const handledResponse = await middleware({
request: {
@@ -54,4 +62,38 @@ describe('build()', () => {
(unhandledResponse.response as Response).headers.get('x-middleware-next')
).toEqual('1');
});
it('should build simple middleware with env vars', async () => {
const expectedEnvVar = 'expected-env-var';
const fixture = join(__dirname, 'fixtures/env');
process.env.ENV_VAR_SHOULD_BE_DEFINED = expectedEnvVar;
await build({
workPath: fixture,
});
// env var should be inlined in the output
delete process.env.ENV_VAR_SHOULD_BE_DEFINED;
const outputFile = join(fixture, '.output/server/pages/_middleware.js');
expect(await fsp.stat(outputFile)).toBeTruthy();
require(outputFile);
//@ts-ignore
const middleware = global._ENTRIES['middleware_pages/_middleware'].default;
expect(typeof middleware).toStrictEqual('function');
const handledResponse = await middleware({
request: {},
});
expect(String(handledResponse.response.body)).toEqual(expectedEnvVar);
expect(
(handledResponse.response as Response).headers.get('x-middleware-next')
).toEqual(null);
});
it('should create a middleware that runs in strict mode', async () => {
const { middleware } = await setupFixture('use-strict');
const response = await middleware({
request: {},
});
expect(String(response.response.body)).toEqual('is strict mode? yes');
});
});

View File

@@ -0,0 +1,3 @@
export default req => {
return new Response(process.env.ENV_VAR_SHOULD_BE_DEFINED);
};

View File

@@ -0,0 +1,6 @@
export default function (req) {
const isStrict = (function () {
return !this;
})();
return new Response('is strict mode? ' + (isStrict ? 'yes' : 'no'));
}

View File

@@ -1,8 +1,8 @@
{
"private": false,
"name": "vercel-plugin-go",
"version": "1.0.0-canary.13",
"main": "dist/src/index.js",
"version": "1.0.0-canary.30",
"main": "dist/index.js",
"license": "MIT",
"files": [
"dist"
@@ -17,7 +17,7 @@
"prepublishOnly": "tsc"
},
"dependencies": {
"@vercel/build-utils": "2.12.3-canary.28",
"@vercel/build-utils": "2.12.3-canary.42",
"@vercel/go": "1.2.4-canary.4"
},
"devDependencies": {

View File

@@ -1,7 +1,6 @@
import { convertRuntimeToPlugin } from '@vercel/build-utils';
import * as go from '@vercel/go';
import { name } from '../package.json';
export const build = convertRuntimeToPlugin(go.build, name, '.go');
export const build = convertRuntimeToPlugin(go.build, 'vercel-plugin-go', '.go');
export const startDevServer = go.startDevServer;

View File

@@ -12,7 +12,6 @@
"noUnusedParameters": true,
"outDir": "dist",
"strict": true,
"target": "esnext",
"resolveJsonModule": true
"target": "esnext"
}
}

View File

@@ -1,6 +1,6 @@
{
"name": "vercel-plugin-node",
"version": "1.12.2-canary.19",
"version": "1.12.2-canary.34",
"license": "MIT",
"main": "./dist/index",
"homepage": "https://vercel.com/docs/runtimes#official-runtimes/node-js",
@@ -34,12 +34,12 @@
"@types/node-fetch": "2",
"@types/test-listen": "1.1.0",
"@types/yazl": "2.4.2",
"@vercel/build-utils": "2.12.3-canary.28",
"@vercel/build-utils": "2.12.3-canary.42",
"@vercel/fun": "1.0.3",
"@vercel/ncc": "0.24.0",
"@vercel/nft": "0.14.0",
"@vercel/node-bridge": "2.1.1-canary.2",
"@vercel/static-config": "0.0.1-canary.0",
"@vercel/static-config": "0.0.1-canary.1",
"abort-controller": "3.0.0",
"content-type": "1.0.4",
"cookie": "0.4.0",

View File

@@ -401,9 +401,9 @@ export async function build({ workPath }: { workPath: string }) {
getConfig(project, absEntrypoint, FunctionConfigSchema) || {};
// No config exported means "node", but if there is a config
// and "runtime" is defined, but it is not "node" then don't
// and "use" is defined, but it is not "node" then don't
// compile this file.
if (config.runtime && config.runtime !== 'node') {
if (config.use && config.use !== 'node') {
continue;
}
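For reference, the `use` value checked here comes from the static `config` export that `@vercel/static-config` reads out of each function file; a hypothetical entrypoint that opts into the Node runtime looks like the fixture further below:
// Hypothetical api/handler.ts
export const config = {
  use: 'node',
  memory: 1024,
};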

View File

@@ -1,8 +1,8 @@
{
"private": false,
"name": "vercel-plugin-python",
"version": "1.0.0-canary.14",
"main": "dist/src/index.js",
"version": "1.0.0-canary.31",
"main": "dist/index.js",
"license": "MIT",
"files": [
"dist"
@@ -17,8 +17,8 @@
"prepublishOnly": "tsc"
},
"dependencies": {
"@vercel/build-utils": "2.12.3-canary.28",
"@vercel/python": "2.1.2-canary.1"
"@vercel/build-utils": "2.12.3-canary.42",
"@vercel/python": "2.1.2-canary.2"
},
"devDependencies": {
"@types/node": "*",

View File

@@ -1,7 +1,6 @@
import { convertRuntimeToPlugin } from '@vercel/build-utils';
import * as python from '@vercel/python';
import { name } from '../package.json';
export const build = convertRuntimeToPlugin(python.build, name, '.py');
export const build = convertRuntimeToPlugin(python.build, 'vercel-plugin-python', '.py');
//export const startDevServer = python.startDevServer;

View File

@@ -13,7 +13,6 @@
"outDir": "dist",
"types": ["node"],
"strict": true,
"target": "esnext",
"resolveJsonModule": true
"target": "esnext"
}
}

View File

@@ -1,8 +1,8 @@
{
"private": false,
"name": "vercel-plugin-ruby",
"version": "1.0.0-canary.12",
"main": "dist/src/index.js",
"version": "1.0.0-canary.30",
"main": "dist/index.js",
"license": "MIT",
"files": [
"dist"
@@ -17,8 +17,8 @@
"prepublishOnly": "tsc"
},
"dependencies": {
"@vercel/build-utils": "2.12.3-canary.28",
"@vercel/ruby": "1.2.8-canary.6"
"@vercel/build-utils": "2.12.3-canary.42",
"@vercel/ruby": "1.2.10-canary.0"
},
"devDependencies": {
"@types/node": "*",

View File

@@ -1,7 +1,6 @@
import { convertRuntimeToPlugin } from '@vercel/build-utils';
import * as ruby from '@vercel/ruby';
import { name } from '../package.json';
export const build = convertRuntimeToPlugin(ruby.build, name, '.rb');
export const build = convertRuntimeToPlugin(ruby.build, 'vercel-plugin-ruby', '.rb');
//export const startDevServer = ruby.startDevServer;

View File

@@ -13,7 +13,6 @@
"outDir": "dist",
"types": ["node"],
"strict": true,
"target": "esnext",
"resolveJsonModule": true
"target": "esnext"
}
}

View File

@@ -1,6 +1,6 @@
{
"name": "@vercel/python",
"version": "2.1.2-canary.1",
"version": "2.1.2-canary.2",
"main": "./dist/index.js",
"license": "MIT",
"homepage": "https://vercel.com/docs/runtimes#official-runtimes/python",

View File

@@ -1,4 +1,3 @@
import { relative, basename } from 'path';
import execa from 'execa';
import { Meta, debug } from '@vercel/build-utils';
@@ -136,17 +135,10 @@ export async function installRequirementsFile({
meta,
args = [],
}: InstallRequirementsFileArg) {
const fileAtRoot = relative(workPath, filePath) === basename(filePath);
// If the `requirements.txt` file is located in the Root Directory of the project and
// the new File System API is used (`avoidTopLevelInstall`), the Install Command
// will have already installed its dependencies, so we don't need to do it again.
if (meta.avoidTopLevelInstall && fileAtRoot) {
debug(
`Skipping requirements file installation, already installed by Install Command`
);
return;
}
// The Vercel platform already handles `requirements.txt` for frontend projects,
// but the installation logic there is different, because it seems to install all
// of the dependencies globally, whereas, for this Runtime, we want it to happen only
// locally, so we'll run a separate installation.
if (
meta.isDev &&

View File

@@ -9,13 +9,24 @@ interface RubyVersion extends NodeVersion {
const allOptions: RubyVersion[] = [
{ major: 2, minor: 7, range: '2.7.x', runtime: 'ruby2.7' },
{ major: 2, minor: 5, range: '2.5.x', runtime: 'ruby2.5' },
{
major: 2,
minor: 5,
range: '2.5.x',
runtime: 'ruby2.5',
discontinueDate: new Date('2021-11-30'),
},
];
function getLatestRubyVersion(): RubyVersion {
return allOptions[0];
}
function isDiscontinued({ discontinueDate }: RubyVersion): boolean {
const today = Date.now();
return discontinueDate !== undefined && discontinueDate.getTime() <= today;
}
function getRubyPath(meta: Meta, gemfileContents: string) {
let selection = getLatestRubyVersion();
if (meta.isDev) {
@@ -37,8 +48,20 @@ function getRubyPath(meta: Meta, gemfileContents: string) {
if (!found) {
throw new NowBuildError({
code: 'RUBY_INVALID_VERSION',
message: 'Found `Gemfile` with invalid Ruby version: `' + line + '`.',
link: 'https://vercel.com/docs/runtimes#official-runtimes/ruby/ruby-version',
message: `Found \`Gemfile\` with invalid Ruby version: \`${line}.\``,
link: 'http://vercel.link/ruby-version',
});
}
if (isDiscontinued(selection)) {
const latest = getLatestRubyVersion();
const intro = `Found \`Gemfile\` with discontinued Ruby version: \`${line}.\``;
const hint = `Please set \`ruby "~> ${latest.range}"\` in your \`Gemfile\` to use Ruby ${latest.range}.`;
const upstream =
'This change is the result of a decision made by an upstream infrastructure provider (AWS).';
throw new NowBuildError({
code: 'RUBY_DISCONTINUED_VERSION',
link: 'http://vercel.link/ruby-version',
message: `${intro} ${hint} ${upstream}`,
});
}
}
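With the `discontinueDate` of 2021-11-30 on the `ruby2.5` entry above, any build after that date that pins Ruby 2.5 now fails with `RUBY_DISCONTINUED_VERSION`. A small illustration using the values from this diff:
// isDiscontinued({ ...ruby2.5, discontinueDate: new Date('2021-11-30') })
//   => true once Date.now() is on or after 2021-11-30
// Error message (asserted by the integration test further below, truncated here):
//   Found `Gemfile` with discontinued Ruby version: `ruby "~> 2.5.x".`
//   Please set `ruby "~> 2.7.x"` in your `Gemfile` to use Ruby 2.7.x. ...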

View File

@@ -1,7 +1,7 @@
{
"name": "@vercel/ruby",
"author": "Nathan Cahill <nathan@nathancahill.com>",
"version": "1.2.8-canary.6",
"version": "1.2.10-canary.0",
"license": "MIT",
"main": "./dist/index",
"homepage": "https://vercel.com/docs/runtimes#official-runtimes/ruby",

View File

@@ -1,6 +1,5 @@
{
"version": 2,
"builds": [{ "src": "index.rb", "use": "@vercel/ruby" }],
"build": { "env": { "RUBY_VERSION": "2.7.x" } },
"probes": [{ "path": "/", "mustContain": "gem:RANDOMNESS_PLACEHOLDER" }]
}

View File

@@ -1,7 +1,6 @@
{
"version": 2,
"builds": [{ "src": "project/index.rb", "use": "@vercel/ruby" }],
"build": { "env": { "RUBY_VERSION": "2.7.x" } },
"probes": [
{ "path": "/project/", "mustContain": "gem:RANDOMNESS_PLACEHOLDER" }
]

View File

@@ -1,6 +1,5 @@
{
"version": 2,
"builds": [{ "src": "index.ru", "use": "@vercel/ruby" }],
"build": { "env": { "RUBY_VERSION": "2.7.x" } },
"probes": [{ "path": "/", "mustContain": "gem:RANDOMNESS_PLACEHOLDER" }]
}

View File

@@ -0,0 +1,7 @@
# frozen_string_literal: true
source "https://rubygems.org"
ruby "~> 2.5.x"
gem "cowsay", "~> 0.3.0"

View File

@@ -0,0 +1,16 @@
GEM
remote: https://rubygems.org/
specs:
cowsay (0.3.0)
PLATFORMS
x86_64-linux
DEPENDENCIES
cowsay (~> 0.3.0)
RUBY VERSION
ruby 2.5.5p157
BUNDLED WITH
2.2.22

View File

@@ -0,0 +1,9 @@
require 'cowsay'
Handler = Proc.new do |req, res|
name = req.query['name'] || 'World'
res.status = 200
res['Content-Type'] = 'text/text; charset=utf-8'
res.body = Cowsay.say("Hello #{name}", 'cow')
end

View File

@@ -0,0 +1,4 @@
{
"version": 2,
"builds": [{ "src": "index.rb", "use": "@vercel/ruby" }]
}

View File

@@ -23,8 +23,32 @@ beforeAll(async () => {
const fixturesPath = path.resolve(__dirname, 'fixtures');
const testsThatFailToBuild = new Map([
[
'11-version-2-5-error',
'Found `Gemfile` with discontinued Ruby version: `ruby "~> 2.5.x".` Please set `ruby "~> 2.7.x"` in your `Gemfile` to use Ruby 2.7.x. This change is the result of a decision made by an upstream infrastructure provider (AWS).',
],
]);
// eslint-disable-next-line no-restricted-syntax
for (const fixture of fs.readdirSync(fixturesPath)) {
const errMsg = testsThatFailToBuild.get(fixture);
if (errMsg) {
// eslint-disable-next-line no-loop-func
it(`should fail to build ${fixture}`, async () => {
try {
await testDeployment(
{ builderUrl, buildUtilsUrl },
path.join(fixturesPath, fixture)
);
} catch (err) {
expect(err).toBeTruthy();
expect(err.deployment).toBeTruthy();
expect(err.deployment.errorMessage).toBe(errMsg);
}
});
continue; //eslint-disable-line
}
// eslint-disable-next-line no-loop-func
it(`should build ${fixture}`, async () => {
await expect(

View File

@@ -1,6 +1,6 @@
{
"name": "@vercel/static-config",
"version": "0.0.1-canary.0",
"version": "0.0.1-canary.1",
"license": "MIT",
"main": "./dist/index",
"repository": {

View File

@@ -15,7 +15,7 @@ const ajv = new Ajv();
export const BaseFunctionConfigSchema = {
type: 'object',
properties: {
runtime: { type: 'string' },
use: { type: 'string' },
memory: { type: 'number' },
maxDuration: { type: 'number' },
regions: {

View File

@@ -2,7 +2,7 @@ import ms from 'https://denopkg.com/TooTallNate/ms';
import { readerFromStreamReader } from 'https://deno.land/std@0.107.0/io/streams.ts';
export const config = {
runtime: 'deno',
use: 'deno',
location: 'https://example.com/page',
};

View File

@@ -1,3 +1,3 @@
export const config = {
runtime: 0,
use: 0,
};

View File

@@ -1,7 +1,7 @@
import fs from 'fs';
export const config = {
runtime: 'node',
use: 'node',
memory: 1024,
};

View File

@@ -10,7 +10,7 @@ describe('getConfig()', () => {
expect(config).toMatchInlineSnapshot(`
Object {
"memory": 1024,
"runtime": "node",
"use": "node",
}
`);
});
@@ -27,7 +27,7 @@ describe('getConfig()', () => {
expect(config).toMatchInlineSnapshot(`
Object {
"location": "https://example.com/page",
"runtime": "deno",
"use": "deno",
}
`);
});