Compare commits

...

46 Commits

Author SHA1 Message Date
Leo Lamprecht
34f4222ca2 Publish Canary
- @vercel/build-utils@2.12.3-canary.34
 - vercel@23.1.3-canary.56
 - @vercel/client@10.2.3-canary.35
 - vercel-plugin-middleware@0.0.0-canary.10
 - vercel-plugin-go@1.0.0-canary.22
 - vercel-plugin-node@1.12.2-canary.26
 - vercel-plugin-python@1.0.0-canary.23
 - vercel-plugin-ruby@1.0.0-canary.21
2021-12-03 20:15:25 +01:00
Leo Lamprecht
5de045edd7 Use correct Lambda handler for CLI Plugins (#7128)
* Use correct Lambda handler for CLI Plugins

* Tweaked comment

* Fixed tests
2021-12-03 20:15:05 +01:00
Leo Lamprecht
5efd3b98de Publish Canary
- @vercel/build-utils@2.12.3-canary.33
 - vercel@23.1.3-canary.55
 - @vercel/client@10.2.3-canary.34
 - vercel-plugin-middleware@0.0.0-canary.9
 - vercel-plugin-go@1.0.0-canary.21
 - vercel-plugin-node@1.12.2-canary.25
 - vercel-plugin-python@1.0.0-canary.22
 - vercel-plugin-ruby@1.0.0-canary.20
2021-12-03 19:10:29 +01:00
Leo Lamprecht
82c83312c7 Fixed Legacy Runtime output generation (#7127)
* Corrected CLI Plugin output creation

* Wait for the linking to be done before removing anything

* Make it work

* Renamed variables

* Made everything work

* Removed debugging

* Fixed comment typos
2021-12-03 19:09:59 +01:00
Andy Bitz
5ccb983007 Publish Canary
- vercel@23.1.3-canary.54
 - vercel-plugin-middleware@0.0.0-canary.8
2021-12-03 12:12:03 +01:00
Steven
7a921399be [middleware] Fix dependencies (#7125) 2021-12-02 20:38:07 -05:00
Andy
3900f2f982 [cli] Support for next export with vercel build (#7122)
* [cli] Support `next export` in `vercel build`

* Add debug line

* Remove unused import

* Ensure file is copied

* Return in `getNextExportStatus`

* Include dotNextDir

Co-authored-by: Steven <steven@ceriously.com>
2021-12-02 20:31:17 -05:00
Steven
09939f1e07 Update github codeowners (#7123)
* Add gary and javi

* Remove timothy and coetry

* Remove rdev

* Add jaredpalmer

Co-authored-by: Nathan Rajlich <n@n8.io>
2021-12-02 19:20:18 -05:00
Leo Lamprecht
fc3a3ca81f Use Functions Manifest for Middleware (#7119)
### Related Issues

This fixes https://github.com/vercel/runtimes/issues/299.

#### Tests

- [x] The code changed/added as part of this PR has been covered with tests
- [x] All tests pass locally with `yarn test-unit`

#### Code Review

- [x] This PR has a concise title and thorough description useful to a reviewer
- [x] Issue from task tracker has a link to this PR
2021-12-03 00:12:01 +00:00
Leo Lamprecht
ba7bf2e4a6 Publish Canary
- @vercel/build-utils@2.12.3-canary.32
 - vercel@23.1.3-canary.53
 - @vercel/client@10.2.3-canary.33
 - vercel-plugin-go@1.0.0-canary.20
 - vercel-plugin-node@1.12.2-canary.24
 - vercel-plugin-python@1.0.0-canary.21
 - vercel-plugin-ruby@1.0.0-canary.19
2021-12-02 23:51:51 +01:00
Leo Lamprecht
00641037fc Corrected input paths for CLI Plugins (#7121)
* Use correct paths for outputs

* Fixed tests

This reverts commit 7c4baeaafaf41609f47c97a09f5e9647fd8b89ee.

* Revert "Fixed tests"

This reverts commit 59c10d18c63f8404c3b0c361c3769b62524316f1.

* Revert "Use correct paths for outputs"

This reverts commit 23a0b34fad1e4932755a39975ae1dfa07acb2dd9.

* Corrected input paths for CLI Plugins

* Fixed tests

This reverts commit 7c4baeaafaf41609f47c97a09f5e9647fd8b89ee.

* Revert "Fixed tests"

This reverts commit 9612d2a9eb19240a5a4489406ada17a6a5bb3806.

* Fixed tests

* Delete vc__handler__python.py
2021-12-02 23:51:39 +01:00
Leo Lamprecht
6f4a1b527b Publish Canary
- @vercel/build-utils@2.12.3-canary.31
 - vercel@23.1.3-canary.52
 - @vercel/client@10.2.3-canary.32
 - vercel-plugin-go@1.0.0-canary.19
 - vercel-plugin-node@1.12.2-canary.23
 - vercel-plugin-python@1.0.0-canary.20
 - vercel-plugin-ruby@1.0.0-canary.18
2021-12-02 22:41:17 +01:00
Leo Lamprecht
1b95576dd2 Fixed Go, Ruby, and Python CLI Plugin output generation (#7117)
* Fixed error with API directory

* Made output work

* Use handler as API Route

* Correctly find the handler

* Fixed a missing instance

* Made handler logic work

* Made it work as expected

* Exclude unnecessary files

* Use a method that always works

* Additional comment

* Made everything work

* Cleaner tests

* Clean up all the useless files

* Fixed missing instance

* Speed up the code

* Removed useless lines

* Update packages/build-utils/src/convert-runtime-to-plugin.ts

Co-authored-by: Andy <AndyBitz@users.noreply.github.com>

* Clarified comment

* Use relative logic again

* Fixed tests

* Deleted useless file

Co-authored-by: Andy <AndyBitz@users.noreply.github.com>
2021-12-02 22:18:43 +01:00
Leo Lamprecht
9227471aca Publish Canary
- @vercel/build-utils@2.12.3-canary.30
 - vercel@23.1.3-canary.51
 - @vercel/client@10.2.3-canary.31
 - vercel-plugin-go@1.0.0-canary.18
 - vercel-plugin-node@1.12.2-canary.22
 - vercel-plugin-python@1.0.0-canary.19
 - vercel-plugin-ruby@1.0.0-canary.17
2021-12-02 15:24:34 +01:00
Leo Lamprecht
bf060296eb Fixed typo (#7115) 2021-12-02 15:19:32 +01:00
Leo Lamprecht
9b3aa41f2e Publish Canary
- @vercel/build-utils@2.12.3-canary.29
 - vercel@23.1.3-canary.50
 - @vercel/client@10.2.3-canary.30
 - vercel-plugin-go@1.0.0-canary.17
 - vercel-plugin-node@1.12.2-canary.21
 - vercel-plugin-python@1.0.0-canary.18
 - vercel-plugin-ruby@1.0.0-canary.16
2021-12-02 15:02:00 +01:00
Leo Lamprecht
ae36585cdb Filter CLI Plugin output (#7114)
* Filter CLI Plugin output

* Only apply ignorefile check in the beginning
2021-12-02 15:01:17 +01:00
Leo Lamprecht
e4c636ddd2 Publish Canary
- vercel-plugin-go@1.0.0-canary.16
 - vercel-plugin-python@1.0.0-canary.17
 - vercel-plugin-ruby@1.0.0-canary.15
2021-12-02 14:14:14 +01:00
Leo Lamprecht
ae3b25be4b Made CLI Plugin publishing work (#7113) 2021-12-02 14:10:23 +01:00
Andy Bitz
a64ed13a40 Publish Canary
- vercel@23.1.3-canary.49
2021-12-02 12:55:04 +01:00
Andy
6c1c0e6676 [cli] Ignore required-server-files.json if it does not exist (#7111) 2021-12-02 12:54:30 +01:00
Leo Lamprecht
82fdd5d121 Publish Canary
- vercel@23.1.3-canary.48
 - vercel-plugin-go@1.0.0-canary.15
 - vercel-plugin-node@1.12.2-canary.20
 - vercel-plugin-python@1.0.0-canary.16
 - vercel-plugin-ruby@1.0.0-canary.14
 - @vercel/static-config@0.0.1-canary.1
2021-12-02 11:46:28 +01:00
Nathan Rajlich
8b40f4435e [api] Use the new File System API (#7108)
Co-authored-by: kodiakhq[bot] <49736102+kodiakhq[bot]@users.noreply.github.com>
2021-12-02 11:45:57 +01:00
Leo Lamprecht
38c87602bb Renamed runtime to use in static JS config (#7106)
### Related Issues

This applies what was mentioned in https://github.com/vercel/runtimes/issues/288#issuecomment-984101750.

#### Tests

- [x] The code changed/added as part of this PR has been covered with tests
- [x] All tests pass locally with `yarn test-unit`

#### Code Review

- [x] This PR has a concise title and thorough description useful to a reviewer
- [x] Issue from task tracker has a link to this PR
2021-12-02 00:37:09 +00:00
Nathan Rajlich
7aef3013e7 [cli] Use "127.0.0.1" instead of "localhost" in vc dev (#7094)
Node.js doesn't like it when a hostname resolves to an IPv6 address (https://stackoverflow.com/a/15244890/376773), so use the IPv4 localhost IP address instead. Specifically, this fixes `vc dev` on Node.js 17, which now prefers IPv6 by default.

Slack thread: https://vercel.slack.com/archives/C01A2M9R8RZ/p1638330248263400
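
A minimal sketch of the workaround, assuming a plain Node.js HTTP server rather than the actual `vc dev` implementation: pinning the host to the IPv4 loopback address skips hostname resolution entirely, so the IPv6-first ordering described above never routes traffic to `::1`.

```ts
import { createServer } from 'http';

// Hypothetical example, not the `vc dev` code: bind the dev server to the
// IPv4 loopback address explicitly instead of the "localhost" hostname.
const server = createServer((_req, res) => {
  res.end('ok');
});

server.listen(3000, '127.0.0.1', () => {
  // Clients should likewise dial http://127.0.0.1:3000 rather than
  // http://localhost:3000, which may resolve to ::1 on Node.js 17.
  console.log('listening on http://127.0.0.1:3000');
});
```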
2021-12-01 23:29:26 +00:00
Nathan Rajlich
c18676ab4d Publish Canary
- vercel-plugin-go@1.0.0-canary.14
 - vercel-plugin-python@1.0.0-canary.15
 - vercel-plugin-ruby@1.0.0-canary.13
2021-12-01 14:28:40 -08:00
Leo Lamprecht
df450c815d Properly publish CLI Plugins (#7103)
After https://github.com/vercel/vercel/pull/7088, `dist` now contains `package.json`, which made `tsc` move `index.js` a level deeper, effectively breaking the `main` property of all of the affected CLI Plugins.

This change makes them work again. 
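A hedged illustration of the failure mode (the guard below is hypothetical and not part of this PR): once `tsc` emits the entry point one directory deeper than expected, the `main` field stops resolving, which a simple publish-time check would surface immediately.

```ts
import { existsSync, readFileSync } from 'fs';
import { resolve } from 'path';

// Hypothetical publish-time guard, not from this repository: verify that the
// `main` field of package.json points at a file the build actually emitted.
// If `tsc` moved `index.js` a level deeper, `dist/index.js` no longer exists
// and the check fails before a broken package is published.
const pkg = JSON.parse(readFileSync('package.json', 'utf8'));
const entry = resolve(pkg.main ?? 'index.js');

if (!existsSync(entry)) {
  throw new Error(`"main" points at ${entry}, which was not emitted by the build`);
}
```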

#### Tests

- [x] The code changed/added as part of this PR has been covered with tests
- [x] All tests pass locally with `yarn test-unit`

#### Code Review

- [x] This PR has a concise title and thorough description useful to a reviewer
- [x] Issue from task tracker has a link to this PR
2021-12-01 20:59:46 +00:00
Andy Bitz
792ab38760 Publish Canary
- @vercel/build-utils@2.12.3-canary.28
 - vercel@23.1.3-canary.47
 - @vercel/client@10.2.3-canary.29
 - vercel-plugin-go@1.0.0-canary.13
 - vercel-plugin-node@1.12.2-canary.19
 - vercel-plugin-python@1.0.0-canary.14
 - vercel-plugin-ruby@1.0.0-canary.12
 - @vercel/ruby@1.2.8-canary.6
2021-12-01 15:32:05 +01:00
Leo Lamprecht
0bba3e76c1 Corrected dependency installation systems (#7088)
* Avoid unnecessary Gemfile installations

* Do not bundle `.output` or `.vercel` into Lambdas

* Use the same input hash format for all CLI Plugins and `vercel build`

* Fixed unit tests

* Fixed the unit tests again

* Fixed the unit tests

* Fixed all the tests

* Exclude useless files

* Consider `.vercelignore` and `.nowignore`

* Fixed error

* Reverted changes

* Deleted useless file

* Fixed tests

* Share input hash format with `vercel-plugin-node`

* Make output inspectable

* Fixed build error

* Extended comment

* Bump Ruby version

* Update Gemfiles

* Update bundles

* Fixed tests

Co-authored-by: Andy Bitz <artzbitz@gmail.com>
2021-12-01 15:31:27 +01:00
Andy Bitz
3d961ffbb9 Publish Canary
- vercel@23.1.3-canary.46
2021-11-30 19:25:13 +01:00
Andy
a3039f57bb [cli] Fix vc build with nested app dir (#7083)
* [cli] Fix `vc build` with nested app dir

* Set NEXT_PRIVATE_OUTPUT_TRACE_ROOT
2021-11-30 18:30:51 +01:00
Andy Bitz
5499fa9a04 Publish Canary
- @vercel/build-utils@2.12.3-canary.27
 - vercel@23.1.3-canary.45
 - @vercel/client@10.2.3-canary.28
 - vercel-plugin-go@1.0.0-canary.12
 - vercel-plugin-node@1.12.2-canary.18
 - vercel-plugin-python@1.0.0-canary.13
 - vercel-plugin-ruby@1.0.0-canary.11
2021-11-30 16:17:47 +01:00
Leo Lamprecht
b9fd64faff Stop passing functions and regions to Runtimes (#7085)
* Stop passing `functions` and `regions` to Runtimes

* Added required types back

* Removed `vercelConfig` from `vercel-plugin-node`

* Fixed unit test

* Fixed test
2021-11-30 16:16:53 +01:00
Andy Bitz
1202ff7b2b Publish Canary
- @vercel/build-utils@2.12.3-canary.26
 - vercel@23.1.3-canary.44
 - @vercel/client@10.2.3-canary.27
 - @vercel/frameworks@0.5.1-canary.16
 - vercel-plugin-go@1.0.0-canary.11
 - vercel-plugin-node@1.12.2-canary.17
 - vercel-plugin-python@1.0.0-canary.12
 - vercel-plugin-ruby@1.0.0-canary.10
 - @vercel/python@2.1.2-canary.1
 - @vercel/ruby@1.2.8-canary.5
2021-11-30 14:37:53 +01:00
Leo Lamprecht
abd9f019f1 Don't install Ruby or Python dependencies unnecessarily (#7084)
* Don't install Ruby or Python dependencies unnecessarily

* Less repeated code

* Cleaner code
2021-11-30 14:36:27 +01:00
Leo Lamprecht
edb5eead81 Speed up Remix Template (#7077)
* Replaced Remix Template

* Added npm lockfile for Remix Template

* Added npm lockfile for Next.js Template

* Added Remix logo
2021-11-29 16:01:23 +01:00
Andy Bitz
6b865ff753 Publish Canary
- @vercel/build-utils@2.12.3-canary.25
 - vercel@23.1.3-canary.43
 - @vercel/client@10.2.3-canary.26
 - @vercel/frameworks@0.5.1-canary.15
 - vercel-plugin-go@1.0.0-canary.10
 - vercel-plugin-node@1.12.2-canary.16
 - vercel-plugin-python@1.0.0-canary.11
 - vercel-plugin-ruby@1.0.0-canary.9
2021-11-29 14:26:12 +01:00
Andy
4fd0734c48 [cli] Consider envPrefix and outputDirectory for vercel build (#7069)
* [cli] Consider `envPrefix` for the framework

* Fix env

* Remove type

* Resolve .nft.json files correctly

* Fix public and static directory handling

* Do not use .replace

* Consider the output directory
2021-11-29 14:23:10 +01:00
Leo Lamprecht
f815421acb Renamed the Remix logo file (#7074) 2021-11-29 12:14:09 +01:00
Logan McAnsh
5da926fee1 chore: update remix logo (#7070) 2021-11-29 11:37:46 +01:00
Andy Bitz
3559531e4c Publish Canary
- @vercel/build-utils@2.12.3-canary.24
 - vercel@23.1.3-canary.42
 - @vercel/client@10.2.3-canary.25
 - @vercel/frameworks@0.5.1-canary.14
 - vercel-plugin-go@1.0.0-canary.9
 - vercel-plugin-node@1.12.2-canary.15
 - vercel-plugin-python@1.0.0-canary.10
 - vercel-plugin-ruby@1.0.0-canary.8
2021-11-25 12:01:09 +01:00
Leo Lamprecht
449a3b3648 Updated Remix template and ensured correct headers (#7064)
* Updated Remix Example

* Adjusted config as necessary

* Updated gitignore

* Fixed default Remix headers

* Removed useless files

* Fixed README

* Added newline
2021-11-25 11:57:57 +01:00
Steven
7bd338618c Publish Canary
- @vercel/build-utils@2.12.3-canary.23
 - vercel@23.1.3-canary.41
 - @vercel/client@10.2.3-canary.24
 - vercel-plugin-go@1.0.0-canary.8
 - vercel-plugin-node@1.12.2-canary.14
 - vercel-plugin-python@1.0.0-canary.9
 - vercel-plugin-ruby@1.0.0-canary.7
2021-11-24 22:03:54 -05:00
Steven
9048a6f584 [build-utils] Fix zero config routes for vercel build (#7063) 2021-11-24 22:00:49 -05:00
Steven
0cacb1bdac Publish Canary
- @vercel/build-utils@2.12.3-canary.22
 - vercel@23.1.3-canary.40
 - @vercel/client@10.2.3-canary.23
 - vercel-plugin-go@1.0.0-canary.7
 - vercel-plugin-node@1.12.2-canary.13
 - vercel-plugin-python@1.0.0-canary.8
 - vercel-plugin-ruby@1.0.0-canary.6
2021-11-24 18:12:26 -05:00
Steven
318bf35f82 [build-utils] Add support for writing routes-manifest.json (#7062)
* [build-utils] Add support for writing routes-manifest.json

* Add support for dynamicRoutes

* Add another test with multiple named params
2021-11-24 18:00:12 -05:00
143 changed files with 25468 additions and 3748 deletions

.github/CODEOWNERS (40 lines changed)
View File

@@ -4,24 +4,26 @@
* @TooTallNate
/.github/workflows @AndyBitz @styfle
/packages/frameworks @AndyBitz
/packages/cli/src/commands/dev @TooTallNate @styfle @AndyBitz
/packages/cli/src/util/dev @TooTallNate @styfle @AndyBitz
/packages/cli/src/commands/domains @javivelasco @mglagola @anatrajkovska
/packages/cli/src/commands/certs @javivelasco @mglagola @anatrajkovska
/packages/cli/src/commands/env @styfle @lucleray
/packages/client @rdev @styfle @TooTallNate
/packages/build-utils @styfle @AndyBitz @TooTallNate
/packages/node @styfle @TooTallNate @lucleray
/packages/node-bridge @styfle @TooTallNate @lucleray
/packages/next @Timer @ijjk
/packages/go @styfle @TooTallNate
/packages/python @styfle @TooTallNate
/packages/ruby @styfle @coetry @TooTallNate
/packages/static-build @styfle @AndyBitz
/packages/routing-utils @styfle @dav-is @ijjk
/examples @mcsdevv @timothyis
/packages/cli/src/commands/build @TooTallNate @styfle @AndyBitz @gdborton @jaredpalmer
/packages/cli/src/commands/dev @TooTallNate @styfle @AndyBitz
/packages/cli/src/util/dev @TooTallNate @styfle @AndyBitz
/packages/cli/src/commands/domains @javivelasco @mglagola @anatrajkovska
/packages/cli/src/commands/certs @javivelasco @mglagola @anatrajkovska
/packages/cli/src/commands/env @styfle @lucleray
/packages/client @styfle @TooTallNate
/packages/build-utils @styfle @AndyBitz @TooTallNate
/packages/middleware @gdborton @javivelasco
/packages/node @styfle @TooTallNate @lucleray
/packages/node-bridge @styfle @TooTallNate @lucleray
/packages/next @Timer @ijjk
/packages/go @styfle @TooTallNate
/packages/python @styfle @TooTallNate
/packages/ruby @styfle @TooTallNate
/packages/static-build @styfle @AndyBitz
/packages/routing-utils @styfle @dav-is @ijjk
/examples @mcsdevv
/examples/create-react-app @Timer
/examples/nextjs @timneutkens @Timer
/examples/hugo @mcsdevv @timothyis @styfle
/examples/jekyll @mcsdevv @timothyis @styfle
/examples/zola @mcsdevv @timothyis @styfle
/examples/hugo @mcsdevv @styfle
/examples/jekyll @mcsdevv @styfle
/examples/zola @mcsdevv @styfle

View File

@@ -5,7 +5,7 @@
"description": "API for the vercel/vercel repo",
"main": "index.js",
"scripts": {
"vercel-build": "yarn --cwd .. && node ../utils/run.js build all"
"vercel-build": "node ../utils/run.js build all"
},
"dependencies": {
"@sentry/node": "5.11.1",

examples/nextjs/package-lock.json (generated, 15787 lines changed)

File diff suppressed because it is too large

File diff suppressed because it is too large

View File

@@ -1,8 +1,8 @@
node_modules
/.cache
/.vercel
/.output
.cache
.vercel
.output
/public/build
/api/build
public/build
api/_build

View File

@@ -2,56 +2,33 @@
- [Remix Docs](https://remix.run/docs)
## Vercel Setup
## Deployment
First you'll need the [Vercel CLI](https://vercel.com/docs/cli):
After having run the `create-remix` command and selected "Vercel" as a deployment target, you only need to [import your Git repository](https://vercel.com/new) into Vercel, and it will be deployed.
If you'd like to avoid using a Git repository, you can also deploy the directory by running [Vercel CLI](https://vercel.com/cli):
```sh
npm i -g vercel
vercel
```
Before you can run the app in development, you need to link this project to a new Vercel project on your account.
**It is important that you use a new project. If you try to link this project to an existing project (like a Next.js site) you will have problems.**
```sh
$ vercel link
```
Follow the prompts, and when it's done you should be able to get started.
It is generally recommended to use a Git repository, because future commits will then automatically be deployed by Vercel, through its [Git Integration](https://vercel.com/docs/concepts/git).
## Development
You will be running two processes during development when using Vercel as your server.
- Your Vercel server in one
- The Remix development server in another
To run your Remix app locally, make sure your project's local dependencies are installed:
```sh
# in one tab
$ vercel dev
npm install
```
# in another
$ npm run dev
Afterwards, start the Remix development server like so:
```sh
npm run dev
```
Open up [http://localhost:3000](http://localhost:3000) and you should be ready to go!
If you'd rather run everything in a single tab, you can look at [concurrently](https://npm.im/concurrently) or similar tools to run both processes in one tab.
## Deploying
```sh
$ npm run build
# preview deployment
$ vercel
# production deployment
$ vercel --prod
```
### GitHub Automatic Deployments
For some reason the GitHub integration doesn't deploy the public folder. We're working with Vercel to figure this out.
For now, [you can set up a GitHub action with this config](https://gist.github.com/mcansh/91f8effda798b41bb373351fad217070) from our friend [@mcansh](https://github.com/mcansh).
If you're used to using the `vercel dev` command provided by [Vercel CLI](https://vercel.com/cli) instead, you can also use that, but it's not needed.

View File

@@ -1,5 +1,5 @@
const { createRequestHandler } = require("@remix-run/vercel");
module.exports = createRequestHandler({
build: require("./build")
build: require("./_build")
});

View File

@@ -1,4 +1,3 @@
import * as React from "react";
import {
Link,
Links,
@@ -7,23 +6,14 @@ import {
Outlet,
Scripts,
ScrollRestoration,
useCatch,
useLocation
useCatch
} from "remix";
import type { LinksFunction } from "remix";
import deleteMeRemixStyles from "~/styles/demos/remix.css";
import globalStylesUrl from "~/styles/global.css";
import darkStylesUrl from "~/styles/dark.css";
/**
* The `links` export is a function that returns an array of objects that map to
* the attributes for an HTML `<link>` element. These will load `<link>` tags on
* every route in the app, but individual routes can include their own links
* that are automatically unloaded when a user navigates away from the route.
*
* https://remix.run/api/app#links
*/
// https://remix.run/api/app#links
export let links: LinksFunction = () => {
return [
{ rel: "stylesheet", href: globalStylesUrl },
@@ -31,16 +21,12 @@ export let links: LinksFunction = () => {
rel: "stylesheet",
href: darkStylesUrl,
media: "(prefers-color-scheme: dark)"
},
{ rel: "stylesheet", href: deleteMeRemixStyles }
}
];
};
/**
* The root module's default export is a component that renders the current
* route via the `<Outlet />` component. Think of this as the global layout
* component for your app.
*/
// https://remix.run/api/conventions#default-export
// https://remix.run/api/conventions#route-filenames
export default function App() {
return (
<Document>
@@ -51,68 +37,27 @@ export default function App() {
);
}
function Document({
children,
title
}: {
children: React.ReactNode;
title?: string;
}) {
// https://remix.run/docs/en/v1/api/conventions#errorboundary
export function ErrorBoundary({ error }: { error: Error }) {
console.error(error);
return (
<html lang="en">
<head>
<meta charSet="utf-8" />
<meta name="viewport" content="width=device-width,initial-scale=1" />
{title ? <title>{title}</title> : null}
<Meta />
<Links />
</head>
<body>
{children}
<RouteChangeAnnouncement />
<ScrollRestoration />
<Scripts />
{process.env.NODE_ENV === "development" && <LiveReload />}
</body>
</html>
);
}
function Layout({ children }: React.PropsWithChildren<{}>) {
return (
<div className="remix-app">
<header className="remix-app__header">
<div className="container remix-app__header-content">
<Link to="/" title="Remix" className="remix-app__header-home-link">
<RemixLogo />
</Link>
<nav aria-label="Main navigation" className="remix-app__header-nav">
<ul>
<li>
<Link to="/">Home</Link>
</li>
<li>
<a href="https://remix.run/docs">Remix Docs</a>
</li>
<li>
<a href="https://github.com/remix-run/remix">GitHub</a>
</li>
</ul>
</nav>
</div>
</header>
<div className="remix-app__main">
<div className="container remix-app__main-content">{children}</div>
</div>
<footer className="remix-app__footer">
<div className="container remix-app__footer-content">
<p>&copy; You!</p>
</div>
</footer>
</div>
<Document title="Error!">
<Layout>
<div>
<h1>There was an error</h1>
<p>{error.message}</p>
<hr />
<p>
Hey, developer, you should replace this with what you want your
users to see.
</p>
</div>
</Layout>
</Document>
);
}
// https://remix.run/docs/en/v1/api/conventions#catchboundary
export function CatchBoundary() {
let caught = useCatch();
@@ -148,26 +93,68 @@ export function CatchBoundary() {
);
}
export function ErrorBoundary({ error }: { error: Error }) {
console.error(error);
function Document({
children,
title
}: {
children: React.ReactNode;
title?: string;
}) {
return (
<Document title="Error!">
<Layout>
<div>
<h1>There was an error</h1>
<p>{error.message}</p>
<hr />
<p>
Hey, developer, you should replace this with what you want your
users to see.
</p>
</div>
</Layout>
</Document>
<html lang="en">
<head>
<meta charSet="utf-8" />
<meta name="viewport" content="width=device-width,initial-scale=1" />
{title ? <title>{title}</title> : null}
<Meta />
<Links />
</head>
<body>
{children}
<ScrollRestoration />
<Scripts />
{process.env.NODE_ENV === "development" && <LiveReload />}
</body>
</html>
);
}
function RemixLogo(props: React.ComponentPropsWithoutRef<"svg">) {
function Layout({ children }: { children: React.ReactNode }) {
return (
<div className="remix-app">
<header className="remix-app__header">
<div className="container remix-app__header-content">
<Link to="/" title="Remix" className="remix-app__header-home-link">
<RemixLogo />
</Link>
<nav aria-label="Main navigation" className="remix-app__header-nav">
<ul>
<li>
<Link to="/">Home</Link>
</li>
<li>
<a href="https://remix.run/docs">Remix Docs</a>
</li>
<li>
<a href="https://github.com/remix-run/remix">GitHub</a>
</li>
</ul>
</nav>
</div>
</header>
<div className="remix-app__main">
<div className="container remix-app__main-content">{children}</div>
</div>
<footer className="remix-app__footer">
<div className="container remix-app__footer-content">
<p>&copy; You!</p>
</div>
</footer>
</div>
);
}
function RemixLogo() {
return (
<svg
viewBox="0 0 659 165"
@@ -179,7 +166,6 @@ function RemixLogo(props: React.ComponentPropsWithoutRef<"svg">) {
width="106"
height="30"
fill="currentColor"
{...props}
>
<title id="remix-run-logo-title">Remix Logo</title>
<path d="M0 161V136H45.5416C53.1486 136 54.8003 141.638 54.8003 145V161H0Z M133.85 124.16C135.3 142.762 135.3 151.482 135.3 161H92.2283C92.2283 158.927 92.2653 157.03 92.3028 155.107C92.4195 149.128 92.5411 142.894 91.5717 130.304C90.2905 111.872 82.3473 107.776 67.7419 107.776H54.8021H0V74.24H69.7918C88.2407 74.24 97.4651 68.632 97.4651 53.784C97.4651 40.728 88.2407 32.816 69.7918 32.816H0V0H77.4788C119.245 0 140 19.712 140 51.2C140 74.752 125.395 90.112 105.665 92.672C122.32 96 132.057 105.472 133.85 124.16Z" />
@@ -190,58 +176,3 @@ function RemixLogo(props: React.ComponentPropsWithoutRef<"svg">) {
</svg>
);
}
/**
* Provides an alert for screen reader users when the route changes.
*/
const RouteChangeAnnouncement = React.memo(() => {
let [hydrated, setHydrated] = React.useState(false);
let [innerHtml, setInnerHtml] = React.useState("");
let location = useLocation();
React.useEffect(() => {
setHydrated(true);
}, []);
let firstRenderRef = React.useRef(true);
React.useEffect(() => {
// Skip the first render because we don't want an announcement on the
// initial page load.
if (firstRenderRef.current) {
firstRenderRef.current = false;
return;
}
let pageTitle = location.pathname === "/" ? "Home page" : document.title;
setInnerHtml(`Navigated to ${pageTitle}`);
}, [location.pathname]);
// Render nothing on the server. The live region provides no value unless
// scripts are loaded and the browser takes over normal routing.
if (!hydrated) {
return null;
}
return (
<div
aria-live="assertive"
aria-atomic
id="route-change-region"
style={{
border: "0",
clipPath: "inset(100%)",
clip: "rect(0 0 0 0)",
height: "1px",
margin: "-1px",
overflow: "hidden",
padding: "0",
position: "absolute",
width: "1px",
whiteSpace: "nowrap",
wordWrap: "normal"
}}
>
{innerHtml}
</div>
);
});

View File

@@ -1,120 +0,0 @@
/*
* You probably want to just delete this file; it's just for the demo pages.
*/
.remix-app {
display: flex;
flex-direction: column;
min-height: 100vh;
min-height: calc(100vh - env(safe-area-inset-bottom));
}
.remix-app > * {
width: 100%;
}
.remix-app__header {
padding-top: 1rem;
padding-bottom: 1rem;
border-bottom: 1px solid var(--color-border);
}
.remix-app__header-content {
display: flex;
justify-content: space-between;
align-items: center;
}
.remix-app__header-home-link {
width: 106px;
height: 30px;
color: var(--color-foreground);
}
.remix-app__header-nav ul {
list-style: none;
margin: 0;
display: flex;
align-items: center;
gap: 1.5em;
}
.remix-app__header-nav li {
font-weight: bold;
}
.remix-app__main {
flex: 1 1 100%;
}
.remix-app__footer {
padding-top: 1rem;
padding-bottom: 1rem;
border-top: 1px solid var(--color-border);
}
.remix-app__footer-content {
display: flex;
justify-content: center;
align-items: center;
}
.remix__page {
--gap: 1rem;
--space: 2rem;
display: grid;
grid-auto-rows: min-content;
gap: var(--gap);
padding-top: var(--space);
padding-bottom: var(--space);
}
@media print, screen and (min-width: 640px) {
.remix__page {
--gap: 2rem;
grid-auto-rows: unset;
grid-template-columns: repeat(2, 1fr);
}
}
@media screen and (min-width: 1024px) {
.remix__page {
--gap: 4rem;
}
}
.remix__page > main > :first-child {
margin-top: 0;
}
.remix__page > main > :last-child {
margin-bottom: 0;
}
.remix__page > aside {
margin: 0;
padding: 1.5ch 2ch;
border: solid 1px var(--color-border);
border-radius: 0.5rem;
}
.remix__page > aside > :first-child {
margin-top: 0;
}
.remix__page > aside > :last-child {
margin-bottom: 0;
}
.remix__form {
display: flex;
flex-direction: column;
gap: 1rem;
padding: 1rem;
border: 1px solid var(--color-border);
border-radius: 0.5rem;
}
.remix__form > * {
margin-top: 0;
margin-bottom: 0;
}

View File

@@ -96,3 +96,121 @@ input:where([type="search"]) {
margin-right: auto;
margin-left: auto;
}
.remix-app {
display: flex;
flex-direction: column;
min-height: 100vh;
min-height: calc(100vh - env(safe-area-inset-bottom));
}
.remix-app > * {
width: 100%;
}
.remix-app__header {
padding-top: 1rem;
padding-bottom: 1rem;
border-bottom: 1px solid var(--color-border);
}
.remix-app__header-content {
display: flex;
justify-content: space-between;
align-items: center;
}
.remix-app__header-home-link {
width: 106px;
height: 30px;
color: var(--color-foreground);
}
.remix-app__header-nav ul {
list-style: none;
margin: 0;
display: flex;
align-items: center;
gap: 1.5em;
}
.remix-app__header-nav li {
font-weight: bold;
}
.remix-app__main {
flex: 1 1 100%;
}
.remix-app__footer {
padding-top: 1rem;
padding-bottom: 1rem;
border-top: 1px solid var(--color-border);
}
.remix-app__footer-content {
display: flex;
justify-content: center;
align-items: center;
}
.remix__page {
--gap: 1rem;
--space: 2rem;
display: grid;
grid-auto-rows: min-content;
gap: var(--gap);
padding-top: var(--space);
padding-bottom: var(--space);
}
@media print, screen and (min-width: 640px) {
.remix__page {
--gap: 2rem;
grid-auto-rows: unset;
grid-template-columns: repeat(2, 1fr);
}
}
@media screen and (min-width: 1024px) {
.remix__page {
--gap: 4rem;
}
}
.remix__page > main > :first-child {
margin-top: 0;
}
.remix__page > main > :last-child {
margin-bottom: 0;
}
.remix__page > aside {
margin: 0;
padding: 1.5ch 2ch;
border: solid 1px var(--color-border);
border-radius: 0.5rem;
}
.remix__page > aside > :first-child {
margin-top: 0;
}
.remix__page > aside > :last-child {
margin-bottom: 0;
}
.remix__form {
display: flex;
flex-direction: column;
gap: 1rem;
padding: 1rem;
border: 1px solid var(--color-border);
border-radius: 0.5rem;
}
.remix__form > * {
margin-top: 0;
margin-bottom: 0;
}

examples/remix/package-lock.json (generated, 8345 lines changed)

File diff suppressed because it is too large

View File

@@ -9,14 +9,15 @@
"postinstall": "remix setup node"
},
"dependencies": {
"@remix-run/react": "^1.0.4",
"@remix-run/react": "^1.0.6",
"react": "^17.0.2",
"react-dom": "^17.0.2",
"remix": "^1.0.4",
"@remix-run/vercel": "^1.0.4"
"remix": "^1.0.6",
"@remix-run/serve": "^1.0.6",
"@remix-run/vercel": "^1.0.6"
},
"devDependencies": {
"@remix-run/dev": "^1.0.4",
"@remix-run/dev": "^1.0.6",
"@types/react": "^17.0.24",
"@types/react-dom": "^17.0.9",
"typescript": "^4.1.2"

View File

@@ -5,5 +5,5 @@ module.exports = {
appDirectory: "app",
browserBuildDirectory: "public/build",
publicPath: "/build/",
serverBuildDirectory: "api/build"
serverBuildDirectory: "api/_build"
};

View File

@@ -1,7 +1,7 @@
{
"build": {
"env": {
"ENABLE_FILE_SYSTEM_API": "1"
}
"build": {
"env": {
"ENABLE_FILE_SYSTEM_API": "1"
}
}
}
}

View File

@@ -1,6 +1,6 @@
{
"name": "@vercel/build-utils",
"version": "2.12.3-canary.21",
"version": "2.12.3-canary.34",
"license": "MIT",
"main": "./dist/index.js",
"types": "./dist/index.d.js",
@@ -30,7 +30,7 @@
"@types/node-fetch": "^2.1.6",
"@types/semver": "6.0.0",
"@types/yazl": "^2.4.1",
"@vercel/frameworks": "0.5.1-canary.13",
"@vercel/frameworks": "0.5.1-canary.16",
"@vercel/ncc": "0.24.0",
"aggregate-error": "3.0.1",
"async-retry": "1.2.3",

View File

@@ -1,90 +1,281 @@
import fs from 'fs-extra';
import { join, dirname, relative } from 'path';
import { join, dirname, parse, relative } from 'path';
import glob from './fs/glob';
import { normalizePath } from './fs/normalize-path';
import { FILES_SYMBOL, getLambdaOptionsFromFunction, Lambda } from './lambda';
import { FILES_SYMBOL, Lambda } from './lambda';
import type FileBlob from './file-blob';
import type { BuilderFunctions, BuildOptions, Files } from './types';
import minimatch from 'minimatch';
import type { BuildOptions, Files } from './types';
import { debug, getIgnoreFilter } from '.';
// `.output` was already created by the Build Command, so we have
// to ensure its contents don't get bundled into the Lambda. Similarly,
// we don't want to bundle anything from `.vercel` either. Lastly,
// Builders/Runtimes didn't have `vercel.json` or `now.json`.
const ignoredPaths = ['.output', '.vercel', 'vercel.json', 'now.json'];
const shouldIgnorePath = (
file: string,
ignoreFilter: any,
ignoreFile: boolean
) => {
const isNative = ignoredPaths.some(item => {
return file.startsWith(item);
});
if (!ignoreFile) {
return isNative;
}
return isNative || ignoreFilter(file);
};
const getSourceFiles = async (workPath: string, ignoreFilter: any) => {
const list = await glob('**', {
cwd: workPath,
});
// We're not passing this as an `ignore` filter to the `glob` function above,
// so that we can re-use exactly the same `getIgnoreFilter` method that the
// Build Step uses (literally the same code). Note that this exclusion only applies
// when deploying. Locally, another exclusion is needed, which is handled
// further below in the `convertRuntimeToPlugin` function.
for (const file in list) {
if (shouldIgnorePath(file, ignoreFilter, true)) {
delete list[file];
}
}
return list;
};
/**
* Convert legacy Runtime to a Plugin.
* @param buildRuntime - a legacy build() function from a Runtime
* @param packageName - the name of the package, for example `vercel-plugin-python`
* @param ext - the file extension, for example `.py`
*/
export function convertRuntimeToPlugin(
buildRuntime: (options: BuildOptions) => Promise<{ output: Lambda }>,
packageName: string,
ext: string
) {
// This `build()` signature should match `plugin.build()` signature in `vercel build`.
return async function build({
vercelConfig,
workPath,
}: {
vercelConfig: { functions?: BuilderFunctions; regions?: string[] };
workPath: string;
}) {
const opts = { cwd: workPath };
const files = await glob('**', opts);
delete files['vercel.json']; // Builders/Runtimes didn't have vercel.json
const entrypoints = await glob(`api/**/*${ext}`, opts);
return async function build({ workPath }: { workPath: string }) {
// We also don't want to provide any files to Runtimes that were ignored
// through `.vercelignore` or `.nowignore`, because the Build Step does the same.
const ignoreFilter = await getIgnoreFilter(workPath);
// Retrieve the files that are currently available on the File System,
// before the Legacy Runtime has even started to build.
const sourceFilesPreBuild = await getSourceFiles(workPath, ignoreFilter);
// Instead of doing another `glob` to get all the matching source files,
// we'll filter the list of existing files down to only the ones
// that are matching the entrypoint pattern, so we're first creating
// a clean new list to begin.
const entrypoints = Object.assign({}, sourceFilesPreBuild);
const entrypointMatch = new RegExp(`^api/.*${ext}$`);
// Up next, we'll strip out the files from the list of entrypoints
// that aren't actually considered entrypoints.
for (const file in entrypoints) {
if (!entrypointMatch.test(file)) {
delete entrypoints[file];
}
}
const pages: { [key: string]: any } = {};
const { functions = {} } = vercelConfig;
const traceDir = join(workPath, '.output', 'runtime-traced-files');
const pluginName = packageName.replace('vercel-plugin-', '');
const traceDir = join(
workPath,
`.output`,
`inputs`,
// Legacy Runtimes can only provide API Routes, so that's
// why we can use this prefix for all of them. Here, we have to
// make sure to not use a cryptic hash name, because people
// need to be able to easily inspect the output.
`api-routes-${pluginName}`
);
await fs.ensureDir(traceDir);
for (const entrypoint of Object.keys(entrypoints)) {
const key =
Object.keys(functions).find(
src => src === entrypoint || minimatch(entrypoint, src)
) || '';
const config = functions[key] || {};
let newPathsRuntime: Set<string> = new Set();
let linkersRuntime: Array<Promise<void>> = [];
for (const entrypoint of Object.keys(entrypoints)) {
const { output } = await buildRuntime({
files,
files: sourceFilesPreBuild,
entrypoint,
workPath,
config: {
zeroConfig: true,
includeFiles: config.includeFiles,
excludeFiles: config.excludeFiles,
},
meta: {
avoidTopLevelInstall: true,
},
});
pages[entrypoint] = {
handler: output.handler,
runtime: output.runtime,
memory: output.memory,
maxDuration: output.maxDuration,
environment: output.environment,
allowQuery: output.allowQuery,
regions: output.regions,
};
// Legacy Runtimes tend to pollute the `workPath` with compiled results,
// because the `workPath` used to be a place where they could
// just put anything, but nowadays it's the working directory of the `vercel build`
// command, which is the place where the developer keeps their source files,
// so we don't want to pollute this space unnecessarily. That means we have to clean
// up files that were created by the build, which is done further below.
const sourceFilesAfterBuild = await getSourceFiles(
workPath,
ignoreFilter
);
// Further down, we will need the filename of the Lambda handler
// for placing it inside `server/pages/api`, but because Legacy Runtimes
// don't expose the filename directly, we have to construct it
// from the handler name, and then find the matching file further below,
// because we don't yet know its extension here.
const handler = output.handler;
const handlerMethod = handler.split('.').reverse()[0];
const handlerFileName = handler.replace(`.${handlerMethod}`, '');
// @ts-ignore This symbol is a private API
const lambdaFiles: Files = output[FILES_SYMBOL];
// When deploying, the `files` that are passed to the Legacy Runtimes already
// have certain files that are ignored stripped, but locally, that list of
// files isn't used by the Legacy Runtimes, so we need to apply the filters
// to the outputs that they are returning instead.
for (const file in lambdaFiles) {
if (shouldIgnorePath(file, ignoreFilter, false)) {
delete lambdaFiles[file];
}
}
const handlerFilePath = Object.keys(lambdaFiles).find(item => {
return parse(item).name === handlerFileName;
});
const handlerFileOrigin = lambdaFiles[handlerFilePath || ''].fsPath;
if (!handlerFileOrigin) {
throw new Error(
`Could not find a handler file. Please ensure that the list of \`files\` defined for the returned \`Lambda\` contains a file with the name ${handlerFileName} (+ any extension).`
);
}
const entry = join(workPath, '.output', 'server', 'pages', entrypoint);
// We never want to link here, only copy, because the launcher
// file often has the same name for every entrypoint, which means that
// every build for every entrypoint overwrites the launcher of the previous
// one, so linking would end with a broken reference.
await fs.ensureDir(dirname(entry));
await linkOrCopy(files[entrypoint].fsPath, entry);
await fs.copy(handlerFileOrigin, entry);
const newFilesEntrypoint: Array<string> = [];
const newDirectoriesEntrypoint: Array<string> = [];
const preBuildFiles = Object.values(sourceFilesPreBuild).map(file => {
return file.fsPath;
});
// Generate a list of directories and files that weren't present
// before the entrypoint was processed by the Legacy Runtime, so
// that we can perform a cleanup later. We need to divide into files
// and directories because only cleaning up files might leave empty
// directories, and listing directories separately also speeds up the
// build because we can just delete them, which wipes all of their nested
// paths, instead of iterating through all files that should be deleted.
for (const file in sourceFilesAfterBuild) {
if (!sourceFilesPreBuild[file]) {
const path = sourceFilesAfterBuild[file].fsPath;
const dirPath = dirname(path);
// If none of the files that were present before the entrypoint
// was processed are contained within the directory we're looking
// at right now, then we know it's a newly added directory
// and it can therefore be removed later on.
const isNewDir = !preBuildFiles.some(filePath => {
return dirname(filePath).startsWith(dirPath);
});
// Check out the list of tracked directories that were
// newly added and see if one of them contains the path
// we're looking at.
const hasParentDir = newDirectoriesEntrypoint.some(dir => {
return path.startsWith(dir);
});
// If we have already tracked a directory that was newly
// added that sits above the file or directory that we're
// looking at, we don't need to add more entries to the list
// because when the parent will get removed in the future,
// all of its children (and therefore the path we're looking at)
// will automatically get removed anyways.
if (hasParentDir) {
continue;
}
if (isNewDir) {
newDirectoriesEntrypoint.push(dirPath);
} else {
newFilesEntrypoint.push(path);
}
}
}
const tracedFiles: {
absolutePath: string;
relativePath: string;
}[] = [];
Object.entries(lambdaFiles).forEach(async ([relPath, file]) => {
const newPath = join(traceDir, relPath);
tracedFiles.push({ absolutePath: newPath, relativePath: relPath });
if (file.fsPath) {
await linkOrCopy(file.fsPath, newPath);
} else if (file.type === 'FileBlob') {
const { data, mode } = file as FileBlob;
await fs.writeFile(newPath, data, { mode });
} else {
throw new Error(`Unknown file type: ${file.type}`);
const linkers = Object.entries(lambdaFiles).map(
async ([relPath, file]) => {
const newPath = join(traceDir, relPath);
// The handler was already moved into position above.
if (relPath === handlerFilePath) {
return;
}
tracedFiles.push({ absolutePath: newPath, relativePath: relPath });
const { fsPath, type } = file;
if (fsPath) {
await fs.ensureDir(dirname(newPath));
const isNewFile = newFilesEntrypoint.includes(fsPath);
const isInsideNewDirectory = newDirectoriesEntrypoint.some(
dirPath => {
return fsPath.startsWith(dirPath);
}
);
// With this, we're making sure that files in the `workPath` that existed
// before the Legacy Runtime was invoked (source files) are linked from
// `.output` instead of copying there (the latter only happens if linking fails),
// which is the fastest solution. However, files that are created fresh
// by the Legacy Runtimes are always copied, because their link destinations
// are likely to be overwritten every time an entrypoint is processed by
// the Legacy Runtime. This is likely to overwrite the destination on subsequent
// runs, but that's also how `workPath` used to work originally, without
// the File System API (meaning that there was one `workPath` for all entrypoints).
if (isNewFile || isInsideNewDirectory) {
debug(`Copying from ${fsPath} to ${newPath}`);
await fs.copy(fsPath, newPath);
} else {
await linkOrCopy(fsPath, newPath);
}
} else if (type === 'FileBlob') {
const { data, mode } = file as FileBlob;
await fs.writeFile(newPath, data, { mode });
} else {
throw new Error(`Unknown file type: ${type}`);
}
}
});
);
linkersRuntime = linkersRuntime.concat(linkers);
const nft = join(
workPath,
@@ -93,19 +284,64 @@ export function convertRuntimeToPlugin(
'pages',
`${entrypoint}.nft.json`
);
const json = JSON.stringify({
version: 1,
files: tracedFiles.map(f => ({
input: normalizePath(relative(nft, f.absolutePath)),
output: normalizePath(f.relativePath),
files: tracedFiles.map(file => ({
input: normalizePath(relative(dirname(nft), file.absolutePath)),
output: normalizePath(file.relativePath),
})),
});
await fs.ensureDir(dirname(nft));
await fs.writeFile(nft, json);
// Extend the list of directories and files that were created by the
// Legacy Runtime with the list of directories and files that were
// created for the entrypoint that was just processed above.
newPathsRuntime = new Set([
...newPathsRuntime,
...newFilesEntrypoint,
...newDirectoriesEntrypoint,
]);
const apiRouteHandler = `${parse(entry).name}.${handlerMethod}`;
// Add an entry that will later on be added to the `functions-manifest.json`
// file that is placed inside of the `.output` directory.
pages[entrypoint] = {
handler: apiRouteHandler,
runtime: output.runtime,
memory: output.memory,
maxDuration: output.maxDuration,
environment: output.environment,
allowQuery: output.allowQuery,
};
}
await updateFunctionsManifest({ vercelConfig, workPath, pages });
// Instead of waiting for all of the linking to be done for every
// entrypoint before processing the next one, we immediately handle all
// of them one after the other, while then waiting for the linking
// to finish right here, before we clean up newly created files below.
await Promise.all(linkersRuntime);
// A list of all the files that were created by the Legacy Runtime,
// which we'd like to remove from the File System.
const toRemove = Array.from(newPathsRuntime).map(path => {
debug(`Removing ${path} as part of cleanup`);
return fs.remove(path);
});
// Once all the entrypoints have been processed, we'd like to
// remove all the files from `workPath` that originally weren't present
// before the Legacy Runtime began running, because the `workPath`
// is nowadays the directory in which the user keeps their source code, since
// we're no longer running separate parallel builds for every Legacy Runtime.
await Promise.all(toRemove);
// Add any Serverless Functions that were exposed by the Legacy Runtime
// to the `functions-manifest.json` file provided in `.output`.
await updateFunctionsManifest({ workPath, pages });
};
}
@@ -133,15 +369,12 @@ async function readJson(filePath: string): Promise<{ [key: string]: any }> {
/**
* If `.output/functions-manifest.json` exists, append to the pages
* property. Otherwise write a new file. This will also read `vercel.json`
* and apply relevant `functions` property config.
* property. Otherwise write a new file.
*/
export async function updateFunctionsManifest({
vercelConfig,
workPath,
pages,
}: {
vercelConfig: { functions?: BuilderFunctions; regions?: string[] };
workPath: string;
pages: { [key: string]: any };
}) {
@@ -156,48 +389,88 @@ export async function updateFunctionsManifest({
if (!functionsManifest.pages) functionsManifest.pages = {};
for (const [pageKey, pageConfig] of Object.entries(pages)) {
const fnConfig = await getLambdaOptionsFromFunction({
sourceFile: pageKey,
config: vercelConfig,
});
functionsManifest.pages[pageKey] = {
...pageConfig,
memory: fnConfig.memory || pageConfig.memory,
maxDuration: fnConfig.maxDuration || pageConfig.maxDuration,
regions: vercelConfig.regions || pageConfig.regions,
};
functionsManifest.pages[pageKey] = { ...pageConfig };
}
await fs.writeFile(functionsManifestPath, JSON.stringify(functionsManifest));
}
/**
* Will append routes to the `routes-manifest.json` file.
* If the file does not exist, it'll be created.
* Append routes to the `routes-manifest.json` file.
* If the file does not exist, it will be created.
*/
export async function updateRoutesManifest({
workPath,
redirects,
rewrites,
headers,
dynamicRoutes,
staticRoutes,
}: {
workPath: string;
redirects?: {
source: string;
destination: string;
statusCode: number;
regex: string;
}[];
rewrites?: {
source: string;
destination: string;
regex: string;
}[];
headers?: {
source: string;
headers: {
key: string;
value: string;
}[];
regex: string;
}[];
dynamicRoutes?: {
page: string;
regex: string;
namedRegex?: string;
routeKeys?: { [named: string]: string };
}[];
staticRoutes?: {
page: string;
regex: string;
namedRegex?: string;
routeKeys?: { [named: string]: string };
}[];
}) {
const routesManifestPath = join(workPath, '.output', 'routes-manifest.json');
const routesManifest = await readJson(routesManifestPath);
if (!routesManifest.version) routesManifest.version = 1;
if (!routesManifest.version) routesManifest.version = 3;
if (routesManifest.pages404 === undefined) routesManifest.pages404 = true;
if (redirects) {
if (!routesManifest.redirects) routesManifest.redirects = [];
routesManifest.redirects.push(...redirects);
}
if (rewrites) {
if (!routesManifest.rewrites) routesManifest.rewrites = [];
routesManifest.rewrites.push(...rewrites);
}
if (headers) {
if (!routesManifest.headers) routesManifest.headers = [];
routesManifest.headers.push(...headers);
}
if (dynamicRoutes) {
if (!routesManifest.dynamicRoutes) routesManifest.dynamicRoutes = [];
routesManifest.dynamicRoutes.push(...dynamicRoutes);
}
if (staticRoutes) {
if (!routesManifest.staticRoutes) routesManifest.staticRoutes = [];
routesManifest.staticRoutes.push(...staticRoutes);
}
await fs.writeFile(routesManifestPath, JSON.stringify(routesManifest));
}

View File

@@ -96,6 +96,7 @@ export async function detectBuilders(
redirectRoutes: Route[] | null;
rewriteRoutes: Route[] | null;
errorRoutes: Route[] | null;
limitedRoutes: LimitedRoutes | null;
}> {
const errors: ErrorResponse[] = [];
const warnings: ErrorResponse[] = [];
@@ -114,6 +115,7 @@ export async function detectBuilders(
redirectRoutes: null,
rewriteRoutes: null,
errorRoutes: null,
limitedRoutes: null,
};
}
@@ -179,6 +181,7 @@ export async function detectBuilders(
redirectRoutes: null,
rewriteRoutes: null,
errorRoutes: null,
limitedRoutes: null,
};
}
@@ -257,6 +260,7 @@ export async function detectBuilders(
defaultRoutes: null,
rewriteRoutes: null,
errorRoutes: null,
limitedRoutes: null,
};
}
@@ -299,6 +303,7 @@ export async function detectBuilders(
defaultRoutes: null,
rewriteRoutes: null,
errorRoutes: null,
limitedRoutes: null,
};
}
@@ -326,6 +331,7 @@ export async function detectBuilders(
}
const routesResult = getRouteResult(
pkg,
apiRoutes,
dynamicRoutes,
usedOutputDirectory,
@@ -342,6 +348,7 @@ export async function detectBuilders(
defaultRoutes: routesResult.defaultRoutes,
rewriteRoutes: routesResult.rewriteRoutes,
errorRoutes: routesResult.errorRoutes,
limitedRoutes: routesResult.limitedRoutes,
};
}
@@ -932,7 +939,14 @@ function createRouteFromPath(
return { route, isDynamic };
}
interface LimitedRoutes {
defaultRoutes: Route[];
redirectRoutes: Route[];
rewriteRoutes: Route[];
}
function getRouteResult(
pkg: PackageJson | undefined | null,
apiRoutes: Source[],
dynamicRoutes: Source[],
outputDirectory: string,
@@ -944,11 +958,18 @@ function getRouteResult(
redirectRoutes: Route[];
rewriteRoutes: Route[];
errorRoutes: Route[];
limitedRoutes: LimitedRoutes;
} {
const deps = Object.assign({}, pkg?.dependencies, pkg?.devDependencies);
const defaultRoutes: Route[] = [];
const redirectRoutes: Route[] = [];
const rewriteRoutes: Route[] = [];
const errorRoutes: Route[] = [];
const limitedRoutes: LimitedRoutes = {
defaultRoutes: [],
redirectRoutes: [],
rewriteRoutes: [],
};
const framework = frontendBuilder?.config?.framework || '';
const isNextjs =
framework === 'nextjs' || isOfficialRuntime('next', frontendBuilder?.use);
@@ -956,14 +977,43 @@ function getRouteResult(
if (apiRoutes && apiRoutes.length > 0) {
if (options.featHandleMiss) {
// Exclude extension names if the corresponding plugin is not found in package.json
// detectBuilders({ignoreRoutesForBuilders: ['@vercel/python']})
// return a copy of routes.
// We should exclude errorRoutes and
const extSet = detectApiExtensions(apiBuilders);
const withTag = options.tag ? `@${options.tag}` : '';
const extSetLimited = detectApiExtensions(
apiBuilders.filter(b => {
if (
b.use === `@vercel/python${withTag}` &&
!('vercel-plugin-python' in deps)
) {
return false;
}
if (
b.use === `@vercel/go${withTag}` &&
!('vercel-plugin-go' in deps)
) {
return false;
}
if (
b.use === `@vercel/ruby${withTag}` &&
!('vercel-plugin-ruby' in deps)
) {
return false;
}
return true;
})
);
if (extSet.size > 0) {
const exts = Array.from(extSet)
const extGroup = `(?:\\.(?:${Array.from(extSet)
.map(ext => ext.slice(1))
.join('|');
const extGroup = `(?:\\.(?:${exts}))`;
.join('|')}))`;
const extGroupLimited = `(?:\\.(?:${Array.from(extSetLimited)
.map(ext => ext.slice(1))
.join('|')}))`;
if (options.cleanUrls) {
redirectRoutes.push({
@@ -979,6 +1029,20 @@ function getRouteResult(
},
status: 308,
});
limitedRoutes.redirectRoutes.push({
src: `^/(api(?:.+)?)/index${extGroupLimited}?/?$`,
headers: { Location: options.trailingSlash ? '/$1/' : '/$1' },
status: 308,
});
limitedRoutes.redirectRoutes.push({
src: `^/api/(.+)${extGroupLimited}/?$`,
headers: {
Location: options.trailingSlash ? '/api/$1/' : '/api/$1',
},
status: 308,
});
} else {
defaultRoutes.push({ handle: 'miss' });
defaultRoutes.push({
@@ -986,10 +1050,18 @@ function getRouteResult(
dest: '/api/$1',
check: true,
});
limitedRoutes.defaultRoutes.push({ handle: 'miss' });
limitedRoutes.defaultRoutes.push({
src: `^/api/(.+)${extGroupLimited}$`,
dest: '/api/$1',
check: true,
});
}
}
rewriteRoutes.push(...dynamicRoutes);
limitedRoutes.rewriteRoutes.push(...dynamicRoutes);
if (typeof ignoreRuntimes === 'undefined') {
// This route is only necessary to hide the directory listing
@@ -1040,6 +1112,7 @@ function getRouteResult(
redirectRoutes,
rewriteRoutes,
errorRoutes,
limitedRoutes,
};
}

View File

@@ -0,0 +1,84 @@
import path from 'path';
import fs from 'fs-extra';
import ignore from 'ignore';
interface CodedError extends Error {
code: string;
}
function isCodedError(error: unknown): error is CodedError {
return (
error !== null &&
error !== undefined &&
(error as CodedError).code !== undefined
);
}
function clearRelative(s: string) {
return s.replace(/(\n|^)\.\//g, '$1');
}
export default async function (
downloadPath: string,
rootDirectory?: string | undefined
) {
const readFile = async (p: string) => {
try {
return await fs.readFile(p, 'utf8');
} catch (error: any) {
if (
error.code === 'ENOENT' ||
(error instanceof Error && error.message.includes('ENOENT'))
) {
return undefined;
}
throw error;
}
};
const vercelIgnorePath = path.join(
downloadPath,
rootDirectory || '',
'.vercelignore'
);
const nowIgnorePath = path.join(
downloadPath,
rootDirectory || '',
'.nowignore'
);
const ignoreContents = [];
try {
ignoreContents.push(
...(
await Promise.all([readFile(vercelIgnorePath), readFile(nowIgnorePath)])
).filter(Boolean)
);
} catch (error) {
if (isCodedError(error) && error.code === 'ENOTDIR') {
console.log(`Warning: Cannot read ignore file from ${vercelIgnorePath}`);
} else {
throw error;
}
}
if (ignoreContents.length === 2) {
throw new Error(
'Cannot use both a `.vercelignore` and `.nowignore` file. Please delete the `.nowignore` file.'
);
}
if (ignoreContents.length === 0) {
return () => false;
}
const ignoreFilter: any = ignore().add(clearRelative(ignoreContents[0]!));
return function (p: string) {
// we should not ignore now.json and vercel.json even if the ignore file asks us to;
// we depend on these files for building the app with sourceless
if (p === 'now.json' || p === 'vercel.json') return false;
return ignoreFilter.test(p).ignored;
};
}

View File

@@ -1,3 +1,4 @@
import { createHash } from 'crypto';
import FileBlob from './file-blob';
import FileFsRef from './file-fs-ref';
import FileRef from './file-ref';
@@ -33,6 +34,7 @@ import { NowBuildError } from './errors';
import streamToBuffer from './fs/stream-to-buffer';
import shouldServe from './should-serve';
import debug from './debug';
import getIgnoreFilter from './get-ignore-filter';
export {
FileBlob,
@@ -70,6 +72,7 @@ export {
isSymbolicLink,
getLambdaOptionsFromFunction,
scanParentDirs,
getIgnoreFilter,
};
export {
@@ -132,3 +135,11 @@ export const getPlatformEnv = (name: string): string | undefined => {
}
return n;
};
/**
* Helper function for generating file or directories names in `.output/inputs`
* for dependencies of files provided to the File System API.
*/
export const getInputHash = (source: Buffer | string): string => {
return createHash('sha1').update(source).digest('hex');
};
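// Hypothetical usage sketch, not part of this diff: the hash yields a stable,
// inspectable directory name for a traced dependency, for example:
//   const inputDir = join(workPath, '.output', 'inputs', getInputHash(file.fsPath));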

View File

@@ -58,6 +58,7 @@ export interface Meta {
filesRemoved?: string[];
env?: Env;
buildEnv?: Env;
avoidTopLevelInstall?: boolean;
}
export interface AnalyzeOptions {

View File

@@ -0,0 +1 @@
# users.rb

View File

@@ -1,9 +1,9 @@
{
"functions": {
"api/users/post.py": {
"api/users.rb": {
"memory": 3008
},
"api/not-matching-anything.py": {
"api/doesnt-exist.rb": {
"memory": 768
}
}

View File

@@ -0,0 +1 @@
# [id].py

View File

@@ -0,0 +1 @@
# project/[aid]/[bid]/index.py

View File

@@ -0,0 +1,7 @@
{
"functions": {
"api/users/post.py": {
"memory": 3008
}
}
}

View File

@@ -2385,13 +2385,10 @@ it('Test `detectRoutes` with `featHandleMiss=true`', async () => {
{
const files = ['api/user.go', 'api/team.js', 'api/package.json'];
const { defaultRoutes, rewriteRoutes, errorRoutes } = await detectBuilders(
files,
null,
{
const { defaultRoutes, rewriteRoutes, errorRoutes, limitedRoutes } =
await detectBuilders(files, null, {
featHandleMiss,
}
);
});
expect(defaultRoutes).toStrictEqual([
{ handle: 'miss' },
{
@@ -2414,6 +2411,22 @@ it('Test `detectRoutes` with `featHandleMiss=true`', async () => {
},
]);
// Limited routes should have js but not go since the go plugin is not installed
expect(limitedRoutes).toStrictEqual({
redirectRoutes: [],
rewriteRoutes: [],
defaultRoutes: [
{
handle: 'miss',
},
{
src: '^/api/(.+)(?:\\.(?:js))$',
dest: '/api/$1',
check: true,
},
],
});
const pattern = new RegExp(errorRoutes![0].src!);
[
@@ -2816,8 +2829,13 @@ it('Test `detectRoutes` with `featHandleMiss=true`, `cleanUrls=true`', async ()
{
const files = ['api/user.go', 'api/team.js', 'api/package.json'];
const { defaultRoutes, redirectRoutes, rewriteRoutes, errorRoutes } =
await detectBuilders(files, null, options);
const {
defaultRoutes,
redirectRoutes,
rewriteRoutes,
errorRoutes,
limitedRoutes,
} = await detectBuilders(files, null, options);
testHeaders(redirectRoutes);
expect(defaultRoutes).toStrictEqual([]);
expect(rewriteRoutes).toStrictEqual([
@@ -2834,6 +2852,28 @@ it('Test `detectRoutes` with `featHandleMiss=true`, `cleanUrls=true`', async ()
},
]);
// Limited routes should have js but not go since the go plugin is not installed
expect(limitedRoutes).toStrictEqual({
redirectRoutes: [
{
src: '^/(api(?:.+)?)/index(?:\\.(?:js))?/?$',
headers: {
Location: '/$1',
},
status: 308,
},
{
src: '^/api/(.+)(?:\\.(?:js))/?$',
headers: {
Location: '/api/$1',
},
status: 308,
},
],
rewriteRoutes: [],
defaultRoutes: [],
});
// expected redirect should match inputs
const getLocation = createReplaceLocation(redirectRoutes);
@@ -3077,7 +3117,7 @@ it('Test `detectRoutes` with `featHandleMiss=true`, `cleanUrls=true`, `trailingS
{
const files = ['api/user.go', 'api/team.js', 'api/package.json'];
const { defaultRoutes, redirectRoutes, rewriteRoutes } =
const { defaultRoutes, redirectRoutes, rewriteRoutes, limitedRoutes } =
await detectBuilders(files, null, options);
testHeaders(redirectRoutes);
expect(defaultRoutes).toStrictEqual([]);
@@ -3088,6 +3128,28 @@ it('Test `detectRoutes` with `featHandleMiss=true`, `cleanUrls=true`, `trailingS
},
]);
// Limited routes should have js but not go since the go plugin is not installed
expect(limitedRoutes).toStrictEqual({
redirectRoutes: [
{
src: '^/(api(?:.+)?)/index(?:\\.(?:js))?/?$',
headers: {
Location: '/$1/',
},
status: 308,
},
{
src: '^/api/(.+)(?:\\.(?:js))/?$',
headers: {
Location: '/api/$1/',
},
status: 308,
},
],
rewriteRoutes: [],
defaultRoutes: [],
});
// expected redirect should match inputs
const getLocation = createReplaceLocation(redirectRoutes);

View File

@@ -1,6 +1,6 @@
import { join } from 'path';
import fs from 'fs-extra';
import { BuildOptions, createLambda } from '../src';
import { BuildOptions, createLambda, FileFsRef } from '../src';
import { convertRuntimeToPlugin } from '../src/convert-runtime-to-plugin';
async function fsToJson(dir: string, output: Record<string, any> = {}) {
@@ -18,24 +18,43 @@ async function fsToJson(dir: string, output: Record<string, any> = {}) {
return output;
}
const workPath = join(__dirname, 'walk', 'python-api');
const invalidFuncWorkpath = join(
__dirname,
'convert-runtime',
'invalid-functions'
);
const pythonApiWorkpath = join(__dirname, 'convert-runtime', 'python-api');
describe('convert-runtime-to-plugin', () => {
afterEach(async () => {
await fs.remove(join(workPath, '.output'));
await fs.remove(join(invalidFuncWorkpath, '.output'));
await fs.remove(join(pythonApiWorkpath, '.output'));
});
it('should create correct filesystem for python', async () => {
const ext = '.py';
const workPath = pythonApiWorkpath;
const handlerName = 'vc__handler__python';
const handlerFileName = handlerName + ext;
const lambdaOptions = {
handler: 'index.handler',
handler: `${handlerName}.vc_handler`,
runtime: 'python3.9',
memory: 512,
maxDuration: 5,
environment: {},
regions: ['sfo1'],
};
const buildRuntime = async (opts: BuildOptions) => {
const handlerPath = join(workPath, handlerFileName);
// This is the usual time at which a Legacy Runtime writes its Lambda launcher.
await fs.writeFile(handlerPath, '# handler');
opts.files[handlerFileName] = new FileFsRef({
fsPath: handlerPath,
});
const lambda = await createLambda({
files: opts.files,
...lambdaOptions,
@@ -44,25 +63,30 @@ describe('convert-runtime-to-plugin', () => {
};
const lambdaFiles = await fsToJson(workPath);
const vercelConfig = JSON.parse(lambdaFiles['vercel.json']);
delete lambdaFiles['vercel.json'];
const build = await convertRuntimeToPlugin(buildRuntime, '.py');
const packageName = 'vercel-plugin-python';
const build = await convertRuntimeToPlugin(buildRuntime, packageName, ext);
await build({ vercelConfig, workPath });
await build({ workPath });
const output = await fsToJson(join(workPath, '.output'));
delete lambdaFiles['vercel.json'];
delete lambdaFiles['vc__handler__python.py'];
expect(output).toMatchObject({
'functions-manifest.json': expect.stringContaining('{'),
'runtime-traced-files': lambdaFiles,
inputs: {
'api-routes-python': lambdaFiles,
},
server: {
pages: {
api: {
'index.py': expect.stringContaining('index'),
'index.py': expect.stringContaining('handler'),
'index.py.nft.json': expect.stringContaining('{'),
users: {
'get.py': expect.stringContaining('get'),
'get.py': expect.stringContaining('handler'),
'get.py.nft.json': expect.stringContaining('{'),
'post.py': expect.stringContaining('post'),
'post.py': expect.stringContaining('handler'),
'post.py.nft.json': expect.stringContaining('{'),
},
},
@@ -74,9 +98,13 @@ describe('convert-runtime-to-plugin', () => {
expect(funcManifest).toMatchObject({
version: 1,
pages: {
'api/index.py': lambdaOptions,
'api/users/get.py': lambdaOptions,
'api/users/post.py': { ...lambdaOptions, memory: 3008 },
'api/index.py': { ...lambdaOptions, handler: 'index.vc_handler' },
'api/users/get.py': { ...lambdaOptions, handler: 'get.vc_handler' },
'api/users/post.py': {
...lambdaOptions,
handler: 'post.vc_handler',
memory: 512,
},
},
});
@@ -85,27 +113,35 @@ describe('convert-runtime-to-plugin', () => {
version: 1,
files: [
{
input: '../../../../runtime-traced-files/api/index.py',
input: `../../../inputs/api-routes-python/api/db/[id].py`,
output: 'api/db/[id].py',
},
{
input: `../../../inputs/api-routes-python/api/index.py`,
output: 'api/index.py',
},
{
input: '../../../../runtime-traced-files/api/users/get.py',
input: `../../../inputs/api-routes-python/api/project/[aid]/[bid]/index.py`,
output: 'api/project/[aid]/[bid]/index.py',
},
{
input: `../../../inputs/api-routes-python/api/users/get.py`,
output: 'api/users/get.py',
},
{
input: '../../../../runtime-traced-files/api/users/post.py',
input: `../../../inputs/api-routes-python/api/users/post.py`,
output: 'api/users/post.py',
},
{
input: '../../../../runtime-traced-files/file.txt',
input: `../../../inputs/api-routes-python/file.txt`,
output: 'file.txt',
},
{
input: '../../../../runtime-traced-files/util/date.py',
input: `../../../inputs/api-routes-python/util/date.py`,
output: 'util/date.py',
},
{
input: '../../../../runtime-traced-files/util/math.py',
input: `../../../inputs/api-routes-python/util/math.py`,
output: 'util/math.py',
},
],
@@ -118,27 +154,35 @@ describe('convert-runtime-to-plugin', () => {
version: 1,
files: [
{
input: '../../../../../runtime-traced-files/api/index.py',
input: `../../../../inputs/api-routes-python/api/db/[id].py`,
output: 'api/db/[id].py',
},
{
input: `../../../../inputs/api-routes-python/api/index.py`,
output: 'api/index.py',
},
{
input: '../../../../../runtime-traced-files/api/users/get.py',
input: `../../../../inputs/api-routes-python/api/project/[aid]/[bid]/index.py`,
output: 'api/project/[aid]/[bid]/index.py',
},
{
input: `../../../../inputs/api-routes-python/api/users/get.py`,
output: 'api/users/get.py',
},
{
input: '../../../../../runtime-traced-files/api/users/post.py',
input: `../../../../inputs/api-routes-python/api/users/post.py`,
output: 'api/users/post.py',
},
{
input: '../../../../../runtime-traced-files/file.txt',
input: `../../../../inputs/api-routes-python/file.txt`,
output: 'file.txt',
},
{
input: '../../../../../runtime-traced-files/util/date.py',
input: `../../../../inputs/api-routes-python/util/date.py`,
output: 'util/date.py',
},
{
input: '../../../../../runtime-traced-files/util/math.py',
input: `../../../../inputs/api-routes-python/util/math.py`,
output: 'util/math.py',
},
],
@@ -151,27 +195,35 @@ describe('convert-runtime-to-plugin', () => {
version: 1,
files: [
{
input: '../../../../../runtime-traced-files/api/index.py',
input: `../../../../inputs/api-routes-python/api/db/[id].py`,
output: 'api/db/[id].py',
},
{
input: `../../../../inputs/api-routes-python/api/index.py`,
output: 'api/index.py',
},
{
input: '../../../../../runtime-traced-files/api/users/get.py',
input: `../../../../inputs/api-routes-python/api/project/[aid]/[bid]/index.py`,
output: 'api/project/[aid]/[bid]/index.py',
},
{
input: `../../../../inputs/api-routes-python/api/users/get.py`,
output: 'api/users/get.py',
},
{
input: '../../../../../runtime-traced-files/api/users/post.py',
input: `../../../../inputs/api-routes-python/api/users/post.py`,
output: 'api/users/post.py',
},
{
input: '../../../../../runtime-traced-files/file.txt',
input: `../../../../inputs/api-routes-python/file.txt`,
output: 'file.txt',
},
{
input: '../../../../../runtime-traced-files/util/date.py',
input: `../../../../inputs/api-routes-python/util/date.py`,
output: 'util/date.py',
},
{
input: '../../../../../runtime-traced-files/util/math.py',
input: `../../../../inputs/api-routes-python/util/math.py`,
output: 'util/math.py',
},
],

View File

@@ -1,6 +1,6 @@
{
"name": "vercel",
"version": "23.1.3-canary.39",
"version": "23.1.3-canary.56",
"preferGlobal": true,
"license": "Apache-2.0",
"description": "The command-line interface for Vercel",
@@ -43,14 +43,14 @@
"node": ">= 12"
},
"dependencies": {
"@vercel/build-utils": "2.12.3-canary.21",
"@vercel/build-utils": "2.12.3-canary.34",
"@vercel/go": "1.2.4-canary.4",
"@vercel/node": "1.12.2-canary.7",
"@vercel/python": "2.1.2-canary.0",
"@vercel/ruby": "1.2.8-canary.4",
"@vercel/python": "2.1.2-canary.1",
"@vercel/ruby": "1.2.8-canary.6",
"update-notifier": "4.1.0",
"vercel-plugin-middleware": "0.0.0-canary.7",
"vercel-plugin-node": "1.12.2-canary.12"
"vercel-plugin-middleware": "0.0.0-canary.10",
"vercel-plugin-node": "1.12.2-canary.26"
},
"devDependencies": {
"@next/env": "11.1.2",
@@ -90,7 +90,7 @@
"@types/update-notifier": "5.1.0",
"@types/which": "1.3.2",
"@types/write-json-file": "2.2.1",
"@vercel/frameworks": "0.5.1-canary.13",
"@vercel/frameworks": "0.5.1-canary.16",
"@vercel/ncc": "0.24.0",
"@vercel/nft": "0.17.0",
"@zeit/fun": "0.11.2",

View File

@@ -5,6 +5,7 @@ import {
GlobOptions,
scanParentDirs,
spawnAsync,
glob as buildUtilsGlob,
} from '@vercel/build-utils';
import { nodeFileTrace } from '@vercel/nft';
import Sema from 'async-sema';
@@ -14,7 +15,7 @@ import { assert } from 'console';
import { createHash } from 'crypto';
import fs from 'fs-extra';
import ogGlob from 'glob';
import { isAbsolute, join, parse, relative, resolve } from 'path';
import { dirname, isAbsolute, join, parse, relative, resolve } from 'path';
import pluralize from 'pluralize';
import Client from '../util/client';
import { VercelConfig } from '../util/dev/types';
@@ -136,9 +137,11 @@ export default async function main(client: Client) {
});
// Set process.env with loaded environment variables
await processEnv(loadedEnvFiles);
processEnv(loadedEnvFiles);
const spawnOpts = {
const spawnOpts: {
env: Record<string, string | undefined>;
} = {
env: { ...combinedEnv, VERCEL: '1' },
};
@@ -284,6 +287,21 @@ export default async function main(client: Client) {
// Clean the output directory
fs.removeSync(join(cwd, OUTPUT_DIR));
if (framework && process.env.VERCEL_URL && 'envPrefix' in framework) {
for (const key of Object.keys(process.env)) {
if (key.startsWith('VERCEL_')) {
const newKey = `${framework.envPrefix}${key}`;
// Set `process.env` and `spawnOpts.env` to make sure the variables are
// available to the `build` step and the CLI Plugins.
process.env[newKey] = process.env[newKey] || process.env[key];
spawnOpts.env[newKey] = process.env[newKey];
}
}
}
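// For illustration, assuming `framework.envPrefix === 'NEXT_PUBLIC_'` (an assumption for
// this sketch, not read from this diff), the loop above produces e.g.:
//   VERCEL_URL=my-site.vercel.app  ->  NEXT_PUBLIC_VERCEL_URL=my-site.vercel.app
//   VERCEL_ENV=preview             ->  NEXT_PUBLIC_VERCEL_ENV=preview
// Any `NEXT_PUBLIC_*` value that is already set wins, because of the `||` fallback.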
// Required for Next.js to produce the correct `.nft.json` files.
spawnOpts.env.NEXT_PRIVATE_OUTPUT_TRACE_ROOT = baseDir;
// Yarn v2 PnP mode may be activated, so force
// "node-modules" linker style
const env = {
@@ -315,22 +333,47 @@ export default async function main(client: Client) {
cwd,
});
}
// Don't trust framework detection here, because the project might be switching to Next.js on a branch
const isNextJs = fs.existsSync(join(cwd, '.next'));
if (!fs.existsSync(join(cwd, OUTPUT_DIR))) {
let outputDir = join(OUTPUT_DIR, 'static');
let distDir = await framework.getFsOutputDir(cwd);
if (isNextJs) {
outputDir = OUTPUT_DIR;
let dotNextDir: string | null = null;
// If a custom `outputDirectory` was set, we'll need to verify
// if it's `.next` output, or just static output.
const userOutputDirectory = project.settings.outputDirectory;
if (typeof userOutputDirectory === 'string') {
if (fs.existsSync(join(cwd, userOutputDirectory, 'BUILD_ID'))) {
dotNextDir = join(cwd, userOutputDirectory);
client.output.debug(
`Considering ${param(userOutputDirectory)} as ${param('.next')} output.`
);
}
} else if (fs.existsSync(join(cwd, '.next'))) {
dotNextDir = join(cwd, '.next');
client.output.debug(`Found ${param('.next')} directory.`);
}
const copyStamp = stamp();
// We cannot rely on the `framework` alone, as it might be a static export,
// and the current build might use a different project that's not in the settings.
const isNextOutput = Boolean(dotNextDir);
const nextExport = await getNextExportStatus(dotNextDir);
const outputDir =
isNextOutput && !nextExport ? OUTPUT_DIR : join(OUTPUT_DIR, 'static');
const distDir =
(nextExport?.exportDetail.outDirectory
? relative(cwd, nextExport.exportDetail.outDirectory)
: false) ||
dotNextDir ||
userOutputDirectory ||
(await framework.getFsOutputDir(cwd));
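// Resolution order for `distDir` above (values illustrative): the `next export` out
// directory from `export-detail.json` (e.g. `out`), then a detected `.next` directory,
// then the user-configured `outputDirectory`, and finally `framework.getFsOutputDir(cwd)`.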
await fs.ensureDir(join(cwd, outputDir));
const relativeDistDir = relative(cwd, distDir);
const copyStamp = stamp();
client.output.spinner(
`Copying files from ${param(distDir)} to ${param(outputDir)}`
);
const files = await glob(join(relativeDistDir, '**'), {
const files = await glob(join(relative(cwd, distDir), '**'), {
ignore: [
'node_modules/**',
'.vercel/**',
@@ -378,6 +421,7 @@ export default async function main(client: Client) {
`Generating build manifest: ${param(buildManifestPath)}`
);
const buildManifest = {
version: 1,
cache: framework.cachePattern ? [framework.cachePattern] : [],
};
await fs.writeJSON(buildManifestPath, buildManifest, { spaces: 2 });
@@ -405,7 +449,53 @@ export default async function main(client: Client) {
}
// Special Next.js processing.
if (isNextJs) {
if (nextExport) {
client.output.debug('Found `next export` output.');
const htmlFiles = await buildUtilsGlob(
'**/*.html',
join(cwd, OUTPUT_DIR, 'static')
);
if (nextExport.exportDetail.success !== true) {
client.output.error(
`Export of Next.js app failed. Please check your build logs.`
);
process.exit(1);
}
await fs.mkdirp(join(cwd, OUTPUT_DIR, 'server', 'pages'));
await fs.mkdirp(join(cwd, OUTPUT_DIR, 'static'));
await Promise.all(
Object.keys(htmlFiles).map(async fileName => {
await sema.acquire();
const input = join(cwd, OUTPUT_DIR, 'static', fileName);
const target = join(cwd, OUTPUT_DIR, 'server', 'pages', fileName);
await fs.mkdirp(dirname(target));
await fs.promises.rename(input, target).finally(() => {
sema.release();
});
})
);
for (const file of [
'BUILD_ID',
'images-manifest.json',
'routes-manifest.json',
'build-manifest.json',
]) {
const input = join(nextExport.dotNextDir, file);
if (fs.existsSync(input)) {
// Do not use `smartCopy`, since we want to overwrite if they already exist.
await fs.copyFile(input, join(OUTPUT_DIR, file));
}
}
} else if (isNextOutput) {
// The contents of `.output/static` should be placed inside of `.output/static/_next/static`
const tempStatic = '___static';
await fs.rename(
@@ -456,10 +546,12 @@ export default async function main(client: Client) {
// `public`, then `static`). We can't read both at the same time, because that would mean we'd
// read `public` for old Next.js versions that don't support it, which might be breaking (and
// we don't want to make `vercel build` specific to framework versions).
const nextSrcDirectory = dirname(distDir);
const publicFiles = await glob('public/**', {
nodir: true,
dot: true,
cwd,
cwd: nextSrcDirectory,
absolute: true,
});
if (publicFiles.length > 0) {
@@ -468,7 +560,11 @@ export default async function main(client: Client) {
smartCopy(
client,
f,
f.replace('public', join(OUTPUT_DIR, 'static'))
join(
OUTPUT_DIR,
'static',
relative(join(dirname(distDir), 'public'), f)
)
)
)
);
@@ -476,7 +572,7 @@ export default async function main(client: Client) {
const staticFiles = await glob('static/**', {
nodir: true,
dot: true,
cwd,
cwd: nextSrcDirectory,
absolute: true,
});
await Promise.all(
@@ -484,7 +580,12 @@ export default async function main(client: Client) {
smartCopy(
client,
f,
f.replace('static', join(OUTPUT_DIR, 'static', 'static'))
join(
OUTPUT_DIR,
'static',
'static',
relative(join(dirname(distDir), 'static'), f)
)
)
)
);
@@ -503,6 +604,7 @@ export default async function main(client: Client) {
const nftFiles = await glob(join(OUTPUT_DIR, '**', '*.nft.json'), {
nodir: true,
dot: true,
ignore: ['cache/**'],
cwd,
absolute: true,
});
@@ -539,6 +641,7 @@ export default async function main(client: Client) {
baseDir,
outputDir: OUTPUT_DIR,
nftFileName: f.replace(ext, '.js.nft.json'),
distDir,
nft: {
version: 1,
files: Array.from(fileList).map(fileListEntry =>
@@ -556,6 +659,7 @@ export default async function main(client: Client) {
outputDir: OUTPUT_DIR,
nftFileName: f,
nft: json,
distDir,
});
}
}
@@ -564,22 +668,33 @@ export default async function main(client: Client) {
OUTPUT_DIR,
'required-server-files.json'
);
const requiredServerFilesJson = await fs.readJSON(
requiredServerFilesPath
);
await fs.writeJSON(requiredServerFilesPath, {
...requiredServerFilesJson,
appDir: '.',
files: requiredServerFilesJson.files.map((i: string) => {
const absolutePath = join(cwd, i.replace('.next', '.output'));
const output = relative(baseDir, absolutePath);
return {
input: i.replace('.next', '.output'),
output,
};
}),
});
if (fs.existsSync(requiredServerFilesPath)) {
client.output.debug(`Resolve ${param('required-server-files.json')}.`);
const requiredServerFilesJson = await fs.readJSON(
requiredServerFilesPath
);
await fs.writeJSON(requiredServerFilesPath, {
...requiredServerFilesJson,
appDir: '.',
files: requiredServerFilesJson.files.map((i: string) => {
const originalPath = join(requiredServerFilesJson.appDir, i);
const relPath = join(OUTPUT_DIR, relative(distDir, originalPath));
const absolutePath = join(cwd, relPath);
const output = relative(baseDir, absolutePath);
return relPath === output
? relPath
: {
input: relPath,
output,
};
}),
});
}
}
}
@@ -758,24 +873,37 @@ async function resolveNftToOutput({
baseDir,
outputDir,
nftFileName,
distDir,
nft,
}: {
client: Client;
baseDir: string;
outputDir: string;
nftFileName: string;
distDir: string;
nft: NftFile;
}) {
client.output.debug(`Processing and resolving ${nftFileName}`);
await fs.ensureDir(join(outputDir, 'inputs'));
const newFilesList: NftFile['files'] = [];
// If `distDir` is a subdirectory, then the input has to be resolved to where the `.output` directory will be.
const relNftFileName = relative(outputDir, nftFileName);
const origNftFilename = join(distDir, relNftFileName);
if (relNftFileName.startsWith('cache/')) {
// No need to process the `cache/` directory.
// Paths in it might also not be relative to `cache` itself.
return;
}
for (let fileEntity of nft.files) {
const relativeInput: string =
const relativeInput =
typeof fileEntity === 'string' ? fileEntity : fileEntity.input;
const fullInput = resolve(join(parse(nftFileName).dir, relativeInput));
const fullInput = resolve(join(parse(origNftFilename).dir, relativeInput));
// If the resolved path is NOT inside `distDir`, we move it into the `.output/inputs` directory.
if (!fullInput.includes(outputDir)) {
if (!fullInput.includes(distDir)) {
const { ext } = parse(fullInput);
const raw = await fs.readFile(fullInput);
const newFilePath = join(outputDir, 'inputs', hash(raw) + ext);
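// For example (paths illustrative): an NFT entry pointing at `node_modules/lodash/lodash.js`
// resolves outside `distDir`, so its contents are copied to `.output/inputs/<hash><ext>`
// and the rewritten `.nft.json` entry points at that copied file instead.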
@@ -801,3 +929,53 @@ async function resolveNftToOutput({
files: newFilesList,
});
}
/**
 * The `export-detail.json` and `export-marker.json` files only exist when `next export` was used.
*/
async function getNextExportStatus(dotNextDir: string | null) {
if (!dotNextDir) {
return null;
}
const exportDetail: {
success: boolean;
outDirectory: string;
} | null = await fs
.readJson(join(dotNextDir, 'export-detail.json'))
.catch(error => {
if (error.code === 'ENOENT') {
return null;
}
throw error;
});
if (!exportDetail) {
return null;
}
const exportMarker: {
version: 1;
exportTrailingSlash: boolean;
hasExportPathMap: boolean;
} | null = await fs
.readJSON(join(dotNextDir, 'export-marker.json'))
.catch(error => {
if (error.code === 'ENOENT') {
return null;
}
throw error;
});
return {
dotNextDir,
exportDetail,
exportMarker: {
trailingSlash: exportMarker?.hasExportPathMap
? exportMarker.exportTrailingSlash
: false,
},
};
}
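
A usage sketch for the helper above; it assumes the `.next` directory was already located as shown earlier in this file:

async function logExportStatus(dotNextDir: string | null) {
  const nextExport = await getNextExportStatus(dotNextDir);
  if (!nextExport) return; // not a `next export` build
  // `outDirectory` is where `next export` wrote its static output; `trailingSlash`
  // mirrors `exportTrailingSlash` only when an export path map exists.
  console.log(
    nextExport.exportDetail.outDirectory,
    nextExport.exportMarker.trailingSlash
  );
}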

View File

@@ -968,7 +968,7 @@ export default class DevServer {
socket.destroy();
return;
}
const target = `http://localhost:${this.devProcessPort}`;
const target = `http://127.0.0.1:${this.devProcessPort}`;
this.output.debug(`Detected "upgrade" event, proxying to ${target}`);
this.proxy.ws(req, socket, head, { target });
});
@@ -1663,7 +1663,7 @@ export default class DevServer {
if (!match) {
// If the dev command is started, then proxy to it
if (this.devProcessPort) {
const upstream = `http://localhost:${this.devProcessPort}`;
const upstream = `http://127.0.0.1:${this.devProcessPort}`;
debug(`Proxying to frontend dev server: ${upstream}`);
// Add the Vercel platform proxy request headers
@@ -1810,7 +1810,7 @@ export default class DevServer {
return proxyPass(
req,
res,
`http://localhost:${port}`,
`http://127.0.0.1:${port}`,
this,
requestId,
false
@@ -1847,7 +1847,7 @@ export default class DevServer {
return proxyPass(
req,
res,
`http://localhost:${this.devProcessPort}`,
`http://127.0.0.1:${this.devProcessPort}`,
this,
requestId,
false

View File

@@ -8,6 +8,7 @@ export type ProjectLinkAndSettings = ProjectLink & {
buildCommand: Project['buildCommand'];
devCommand: Project['devCommand'];
outputDirectory: Project['outputDirectory'];
directoryListing: Project['directoryListing'];
rootDirectory: Project['rootDirectory'];
framework: Project['framework'];
};
@@ -29,6 +30,7 @@ export async function writeProjectSettings(
settings: {
buildCommand: project.buildCommand,
devCommand: project.devCommand,
outputDirectory: project.outputDirectory,
directoryListing: project.directoryListing,
rootDirectory: project.rootDirectory,
framework: project.framework,

View File

@@ -1,6 +1,6 @@
{
"name": "@vercel/client",
"version": "10.2.3-canary.22",
"version": "10.2.3-canary.35",
"main": "dist/index.js",
"typings": "dist/index.d.ts",
"homepage": "https://vercel.com",
@@ -40,7 +40,7 @@
]
},
"dependencies": {
"@vercel/build-utils": "2.12.3-canary.21",
"@vercel/build-utils": "2.12.3-canary.34",
"@zeit/fetch": "5.2.0",
"async-retry": "1.2.3",
"async-sema": "3.0.0",

View File

@@ -0,0 +1,6 @@
<svg viewBox="0 0 800 800" fill="none" xmlns="http://www.w3.org/2000/svg">
<path d="M700 0H100C44.772 0 0 44.772 0 100v600c0 55.228 44.772 100 100 100h600c55.228 0 100-44.772 100-100V100C800 44.772 755.228 0 700 0Z" fill="#212121"/>
<path fill-rule="evenodd" clip-rule="evenodd" d="M587.947 527.768c4.254 54.65 4.254 80.268 4.254 108.232H465.756c0-6.091.109-11.663.219-17.313.342-17.564.699-35.88-2.147-72.868-3.761-54.152-27.08-66.185-69.957-66.185H195v-98.525h204.889c54.16 0 81.241-16.476 81.241-60.098 0-38.357-27.081-61.601-81.241-61.601H195V163h227.456C545.069 163 606 220.912 606 313.42c0 69.193-42.877 114.319-100.799 121.84 48.895 9.777 77.48 37.605 82.746 92.508Z" fill="#fff"/>
<path d="M195 636v-73.447h133.697c22.332 0 27.181 16.563 27.181 26.441V636H195Z" fill="#fff"/>
<path d="M194.5 636v.5h161.878v-47.506c0-5.006-1.226-11.734-5.315-17.224-4.108-5.515-11.059-9.717-22.366-9.717H194.5V636Z" stroke="#fff" stroke-opacity=".8"/>
</svg>

After: 958 B

View File

@@ -1,25 +1,6 @@
<svg width="800" height="800" viewBox="0 0 800 800" fill="none" xmlns="http://www.w3.org/2000/svg">
<rect width="800" height="800" fill="#212121"/>
<g filter="url(#filter0_dd_126_53)">
<path fill-rule="evenodd" clip-rule="evenodd" d="M587.947 527.768C592.201 582.418 592.201 608.036 592.201 636H465.756C465.756 629.909 465.865 624.337 465.975 618.687C466.317 601.123 466.674 582.807 463.828 545.819C460.067 491.667 436.748 479.634 393.871 479.634H355.883H195V381.109H399.889C454.049 381.109 481.13 364.633 481.13 321.011C481.13 282.654 454.049 259.41 399.889 259.41H195V163H422.456C545.069 163 606 220.912 606 313.42C606 382.613 563.123 427.739 505.201 435.26C554.096 445.037 582.681 472.865 587.947 527.768Z" fill="#E8F2FF"/>
<path d="M195 636V562.553H328.697C351.029 562.553 355.878 579.116 355.878 588.994V636H195Z" fill="#E8F2FF"/>
</g>
<defs>
<filter id="filter0_dd_126_53" x="131" y="99" width="539" height="601" filterUnits="userSpaceOnUse" color-interpolation-filters="sRGB">
<feFlood flood-opacity="0" result="BackgroundImageFix"/>
<feColorMatrix in="SourceAlpha" type="matrix" values="0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 127 0" result="hardAlpha"/>
<feOffset/>
<feGaussianBlur stdDeviation="28"/>
<feComposite in2="hardAlpha" operator="out"/>
<feColorMatrix type="matrix" values="0 0 0 0 0.223529 0 0 0 0 0.572549 0 0 0 0 1 0 0 0 1 0"/>
<feBlend mode="normal" in2="BackgroundImageFix" result="effect1_dropShadow_126_53"/>
<feColorMatrix in="SourceAlpha" type="matrix" values="0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 127 0" result="hardAlpha"/>
<feOffset/>
<feGaussianBlur stdDeviation="32"/>
<feComposite in2="hardAlpha" operator="out"/>
<feColorMatrix type="matrix" values="0 0 0 0 0.223529 0 0 0 0 0.572549 0 0 0 0 1 0 0 0 0.9 0"/>
<feBlend mode="normal" in2="effect1_dropShadow_126_53" result="effect2_dropShadow_126_53"/>
<feBlend mode="normal" in="SourceGraphic" in2="effect2_dropShadow_126_53" result="shape"/>
</filter>
</defs>
<svg viewBox="0 0 800 800" fill="none" xmlns="http://www.w3.org/2000/svg">
<path d="M700 0H100C44.772 0 0 44.772 0 100v600c0 55.228 44.772 100 100 100h600c55.228 0 100-44.772 100-100V100C800 44.772 755.228 0 700 0Z" fill="#212121"/>
<path fill-rule="evenodd" clip-rule="evenodd" d="M587.947 527.768c4.254 54.65 4.254 80.268 4.254 108.232H465.756c0-6.091.109-11.663.219-17.313.342-17.564.699-35.88-2.147-72.868-3.761-54.152-27.08-66.185-69.957-66.185H195v-98.525h204.889c54.16 0 81.241-16.476 81.241-60.098 0-38.357-27.081-61.601-81.241-61.601H195V163h227.456C545.069 163 606 220.912 606 313.42c0 69.193-42.877 114.319-100.799 121.84 48.895 9.777 77.48 37.605 82.746 92.508Z" fill="#fff"/>
<path d="M195 636v-73.447h133.697c22.332 0 27.181 16.563 27.181 26.441V636H195Z" fill="#fff"/>
<path d="M194.5 636v.5h161.878v-47.506c0-5.006-1.226-11.734-5.315-17.224-4.108-5.515-11.059-9.717-22.366-9.717H194.5V636Z" stroke="#fff" stroke-opacity=".8"/>
</svg>

Before: 1.9 KiB | After: 958 B

View File

@@ -1,6 +1,6 @@
{
"name": "@vercel/frameworks",
"version": "0.5.1-canary.13",
"version": "0.5.1-canary.16",
"main": "./dist/frameworks.js",
"types": "./dist/frameworks.d.ts",
"files": [

View File

@@ -195,7 +195,7 @@ export const frameworks = [
name: 'Remix',
slug: 'remix',
demo: 'https://remix.examples.vercel.com',
logo: 'https://raw.githubusercontent.com/vercel/vercel/main/packages/frameworks/logos/remix.svg',
logo: 'https://raw.githubusercontent.com/vercel/vercel/main/packages/frameworks/logos/remix-no-shadow.svg',
tagline: 'Build Better Websites',
description: 'A new Remix app — the result of running `npx create-remix`.',
website: 'https://remix.run',
@@ -251,8 +251,8 @@ export const frameworks = [
],
defaultHeaders: [
{
source: '^/build/(.*)$',
regex: '^/build/(.*)$',
source: '/build/(.*)',
regex: '/build/(.*)',
headers: [
{ key: 'cache-control', value: 'public, max-age=31536000, immutable' },
],

View File

@@ -1,6 +1,6 @@
{
"name": "vercel-plugin-middleware",
"version": "0.0.0-canary.7",
"version": "0.0.0-canary.10",
"license": "MIT",
"main": "./dist/index",
"homepage": "",
@@ -30,6 +30,7 @@
"@types/node-fetch": "^2",
"@types/ua-parser-js": "0.7.36",
"@types/uuid": "8.3.1",
"@vercel/build-utils": "2.12.3-canary.34",
"@vercel/ncc": "0.24.0",
"cookie": "0.4.1",
"formdata-node": "4.3.1",

View File

@@ -5,6 +5,7 @@ import { promises as fsp } from 'fs';
import { IncomingMessage, ServerResponse } from 'http';
import libGlob from 'glob';
import Proxy from 'http-proxy';
import { updateFunctionsManifest } from '@vercel/build-utils';
import { run } from './websandbox';
import type { FetchEventResult } from './websandbox/types';
@@ -73,26 +74,20 @@ export async function build({ workPath }: { workPath: string }) {
await fsp.unlink(entriesPath);
}
// Write middleware manifest
const middlewareManifest = {
version: 1,
sortedMiddleware: ['/'],
middleware: {
'/': {
env: [],
files: ['server/pages/_middleware.js'],
name: 'pages/_middleware',
page: '/',
regexp: '^/.*$',
},
},
const fileName = basename(middlewareFile);
const pages: { [key: string]: any } = {};
pages[fileName] = {
runtime: 'web',
env: [],
files: ['server/pages/_middleware.js'],
name: 'pages/_middleware',
page: '/',
regexp: '^/.*$',
sortingIndex: 1,
};
const middlewareManifestData = JSON.stringify(middlewareManifest, null, 2);
const middlewareManifestPath = join(
workPath,
'.output/server/middleware-manifest.json'
);
await fsp.writeFile(middlewareManifestPath, middlewareManifestData);
await updateFunctionsManifest({ workPath, pages });
}
const stringifyQuery = (req: IncomingMessage, query: ParsedUrlQuery) => {

View File

@@ -2,8 +2,8 @@
exports[`build() should build simple middleware 1`] = `
Object {
"middleware": Object {
"/": Object {
"pages": Object {
"_middleware.js": Object {
"env": Array [],
"files": Array [
"server/pages/_middleware.js",
@@ -11,11 +11,10 @@ Object {
"name": "pages/_middleware",
"page": "/",
"regexp": "^/.*$",
"runtime": "web",
"sortingIndex": 1,
},
},
"sortedMiddleware": Array [
"/",
],
"version": 1,
}
`;

View File

@@ -22,7 +22,7 @@ describe('build()', () => {
const middlewareManifest = JSON.parse(
await fsp.readFile(
join(fixture, '.output/server/middleware-manifest.json'),
join(fixture, '.output/functions-manifest.json'),
'utf8'
)
);

View File

@@ -1,7 +1,7 @@
{
"private": false,
"name": "vercel-plugin-go",
"version": "1.0.0-canary.6",
"version": "1.0.0-canary.22",
"main": "dist/index.js",
"license": "MIT",
"files": [
@@ -17,7 +17,7 @@
"prepublishOnly": "tsc"
},
"dependencies": {
"@vercel/build-utils": "2.12.3-canary.21",
"@vercel/build-utils": "2.12.3-canary.34",
"@vercel/go": "1.2.4-canary.4"
},
"devDependencies": {

View File

@@ -1,6 +1,6 @@
import { convertRuntimeToPlugin } from '@vercel/build-utils';
import * as go from '@vercel/go';
export const build = convertRuntimeToPlugin(go.build, '.go');
export const build = convertRuntimeToPlugin(go.build, 'vercel-plugin-go', '.go');
export const startDevServer = go.startDevServer;

View File

@@ -1,6 +1,6 @@
{
"name": "vercel-plugin-node",
"version": "1.12.2-canary.12",
"version": "1.12.2-canary.26",
"license": "MIT",
"main": "./dist/index",
"homepage": "https://vercel.com/docs/runtimes#official-runtimes/node-js",
@@ -34,12 +34,12 @@
"@types/node-fetch": "2",
"@types/test-listen": "1.1.0",
"@types/yazl": "2.4.2",
"@vercel/build-utils": "2.12.3-canary.21",
"@vercel/build-utils": "2.12.3-canary.34",
"@vercel/fun": "1.0.3",
"@vercel/ncc": "0.24.0",
"@vercel/nft": "0.14.0",
"@vercel/node-bridge": "2.1.1-canary.2",
"@vercel/static-config": "0.0.1-canary.0",
"@vercel/static-config": "0.0.1-canary.1",
"abort-controller": "3.0.0",
"content-type": "1.0.4",
"cookie": "0.4.0",

View File

@@ -40,6 +40,7 @@ import {
walkParentDirs,
normalizePath,
runPackageJsonScript,
getInputHash,
} from '@vercel/build-utils';
import { FromSchema } from 'json-schema-to-ts';
import { getConfig, BaseFunctionConfigSchema } from '@vercel/static-config';
@@ -47,8 +48,6 @@ import { AbortController } from 'abort-controller';
import { Register, register } from './typescript';
import { pageToRoute } from './router/page-to-route';
import { isDynamicRoute } from './router/is-dynamic';
import crypto from 'crypto';
import type { VercelConfig } from '@vercel/client';
export { shouldServe };
export {
@@ -380,13 +379,7 @@ function getAWSLambdaHandler(entrypoint: string, config: FunctionConfig) {
}
// TODO NATE: turn this into a `@vercel/plugin-utils` helper function?
export async function build({
vercelConfig,
workPath,
}: {
vercelConfig: VercelConfig;
workPath: string;
}) {
export async function build({ workPath }: { workPath: string }) {
const project = new Project();
const entrypoints = await glob('api/**/*.[jt]s', workPath);
const installedPaths = new Set<string>();
@@ -408,14 +401,13 @@ export async function build({
getConfig(project, absEntrypoint, FunctionConfigSchema) || {};
// No config exported means "node", but if there is a config
// and "runtime" is defined, but it is not "node" then don't
// and "use" is defined, but it is not "node" then don't
// compile this file.
if (config.runtime && config.runtime !== 'node') {
if (config.use && config.use !== 'node') {
continue;
}
await buildEntrypoint({
vercelConfig,
workPath,
entrypoint,
config,
@@ -425,23 +417,18 @@ export async function build({
}
export async function buildEntrypoint({
vercelConfig,
workPath,
entrypoint,
config,
installedPaths,
}: {
vercelConfig: VercelConfig;
workPath: string;
entrypoint: string;
config: FunctionConfig;
installedPaths?: Set<string>;
}) {
// Unique hash that will be used as directory name for `.output`.
const entrypointHash = crypto
.createHash('sha256')
.update(entrypoint)
.digest('hex');
const entrypointHash = 'api-routes-node-' + getInputHash(entrypoint);
const outputDirPath = join(workPath, '.output');
const { dir, name } = parsePath(entrypoint);
@@ -561,7 +548,7 @@ export async function buildEntrypoint({
runtime: nodeVersion.runtime,
},
};
await updateFunctionsManifest({ vercelConfig, workPath, pages });
await updateFunctionsManifest({ workPath, pages });
// Update the `routes-manifest.json` file with the wildcard route
// when the entrypoint is dynamic (i.e. `/api/[id].ts`).

View File

@@ -143,16 +143,7 @@ function withFixture<T>(
await runNpmInstall(fixture);
}
let vercelConfig = {};
try {
vercelConfig = JSON.parse(
await fsp.readFile(path.join(fixture, 'vercel.json'), 'utf8')
);
} catch (e) {
// Consume error
}
await build({ vercelConfig, workPath: fixture });
await build({ workPath: fixture });
try {
return await t({ fixture, fetch });

View File

@@ -1,7 +1,7 @@
{
"private": false,
"name": "vercel-plugin-python",
"version": "1.0.0-canary.7",
"version": "1.0.0-canary.23",
"main": "dist/index.js",
"license": "MIT",
"files": [
@@ -17,8 +17,8 @@
"prepublishOnly": "tsc"
},
"dependencies": {
"@vercel/build-utils": "2.12.3-canary.21",
"@vercel/python": "2.1.2-canary.0"
"@vercel/build-utils": "2.12.3-canary.34",
"@vercel/python": "2.1.2-canary.1"
},
"devDependencies": {
"@types/node": "*",

View File

@@ -1,6 +1,6 @@
import { convertRuntimeToPlugin } from '@vercel/build-utils';
import * as python from '@vercel/python';
export const build = convertRuntimeToPlugin(python.build, '.py');
export const build = convertRuntimeToPlugin(python.build, 'vercel-plugin-python', '.py');
//export const startDevServer = python.startDevServer;

View File

@@ -1,7 +1,7 @@
{
"private": false,
"name": "vercel-plugin-ruby",
"version": "1.0.0-canary.5",
"version": "1.0.0-canary.21",
"main": "dist/index.js",
"license": "MIT",
"files": [
@@ -17,8 +17,8 @@
"prepublishOnly": "tsc"
},
"dependencies": {
"@vercel/build-utils": "2.12.3-canary.21",
"@vercel/ruby": "1.2.8-canary.4"
"@vercel/build-utils": "2.12.3-canary.34",
"@vercel/ruby": "1.2.8-canary.6"
},
"devDependencies": {
"@types/node": "*",

View File

@@ -1,6 +1,6 @@
import { convertRuntimeToPlugin } from '@vercel/build-utils';
import * as ruby from '@vercel/ruby';
export const build = convertRuntimeToPlugin(ruby.build, '.rb');
export const build = convertRuntimeToPlugin(ruby.build, 'vercel-plugin-ruby', '.rb');
//export const startDevServer = ruby.startDevServer;

View File

@@ -1,6 +1,6 @@
{
"name": "@vercel/python",
"version": "2.1.2-canary.0",
"version": "2.1.2-canary.1",
"main": "./dist/index.js",
"license": "MIT",
"homepage": "https://vercel.com/docs/runtimes#official-runtimes/python",

View File

@@ -1,3 +1,4 @@
import { relative, basename } from 'path';
import execa from 'execa';
import { Meta, debug } from '@vercel/build-utils';
@@ -135,6 +136,18 @@ export async function installRequirementsFile({
meta,
args = [],
}: InstallRequirementsFileArg) {
const fileAtRoot = relative(workPath, filePath) === basename(filePath);
// If the `requirements.txt` file is located in the Root Directory of the project and
// the new File System API is used (`avoidTopLevelInstall`), the Install Command
// will have already installed its dependencies, so we don't need to do it again.
if (meta.avoidTopLevelInstall && fileAtRoot) {
debug(
`Skipping requirements file installation, already installed by Install Command`
);
return;
}
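// For example, with `workPath = '/proj'`: `relative('/proj', '/proj/requirements.txt')`
// equals `basename(...)`, so the file is at the root and installation is skipped above,
// while `/proj/api/requirements.txt` resolves to `api/requirements.txt` and still installs below.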
if (
meta.isDev &&
(await areRequirementsInstalled(pythonPath, filePath, workPath))

View File

@@ -1,4 +1,4 @@
import { join, dirname } from 'path';
import { join, dirname, relative } from 'path';
import execa from 'execa';
import {
ensureDir,
@@ -85,10 +85,12 @@ export async function build({
}: BuildOptions) {
await download(files, workPath, meta);
const entrypointFsDirname = join(workPath, dirname(entrypoint));
const gemfileName = 'Gemfile';
const gemfilePath = await walkParentDirs({
base: workPath,
start: entrypointFsDirname,
filename: 'Gemfile',
filename: gemfileName,
});
const gemfileContents = gemfilePath
? await readFile(gemfilePath, 'utf8')
@@ -130,15 +132,24 @@ export async function build({
'did not find a vendor directory but found a Gemfile, bundling gems...'
);
// try installing. this won't work if native extensions are required.
// if that's the case, gems should be vendored locally before deploying.
try {
await bundleInstall(bundlerPath, bundleDir, gemfilePath);
} catch (err) {
debug(
'unable to build gems from Gemfile. vendor the gems locally with "bundle install --deployment" and retry.'
);
throw err;
const fileAtRoot = relative(workPath, gemfilePath) === gemfileName;
// If the `Gemfile` is located in the Root Directory of the project and
// the new File System API is used (`avoidTopLevelInstall`), the Install Command
// will have already installed its dependencies, so we don't need to do it again.
if (meta.avoidTopLevelInstall && fileAtRoot) {
debug('Skipping `bundle install` — already handled by Install Command');
} else {
// try installing. this won't work if native extensions are required.
// if that's the case, gems should be vendored locally before deploying.
try {
await bundleInstall(bundlerPath, bundleDir, gemfilePath);
} catch (err) {
debug(
'unable to build gems from Gemfile. vendor the gems locally with "bundle install --deployment" and retry.'
);
throw err;
}
}
}
} else {

View File

@@ -66,6 +66,23 @@ export async function installBundler(meta: Meta, gemfileContents: string) {
gemfileContents
);
// If the new File System API is used (`avoidTopLevelInstall`), the Install Command
// will have already installed the dependencies, so we don't need to do it again.
if (meta.avoidTopLevelInstall) {
debug(
`Skipping bundler installation, already installed by Install Command`
);
return {
gemHome,
rubyPath,
gemPath,
vendorPath,
runtime,
bundlerPath: join(gemHome, 'bin', 'bundler'),
};
}
debug('installing bundler...');
await execa(gemPath, ['install', 'bundler', '--no-document'], {
stdio: 'pipe',

View File

@@ -1,7 +1,7 @@
{
"name": "@vercel/ruby",
"author": "Nathan Cahill <nathan@nathancahill.com>",
"version": "1.2.8-canary.4",
"version": "1.2.8-canary.6",
"license": "MIT",
"main": "./dist/index",
"homepage": "https://vercel.com/docs/runtimes#official-runtimes/ruby",

View File

@@ -0,0 +1,2 @@
---
BUNDLE_PATH: "vendor/bundle"

View File

@@ -2,6 +2,6 @@
source "https://rubygems.org"
ruby "~> 2.5.0"
ruby "~> 2.7.0"
gem "cowsay", "~> 0.3.0"

View File

@@ -0,0 +1,16 @@
GEM
remote: https://rubygems.org/
specs:
cowsay (0.3.0)
PLATFORMS
x86_64-darwin-21
DEPENDENCIES
cowsay (~> 0.3.0)
RUBY VERSION
ruby 2.7.5p203
BUNDLED WITH
2.2.22

View File

@@ -1,6 +1,6 @@
{
"version": 2,
"builds": [{ "src": "index.rb", "use": "@vercel/ruby" }],
"build": { "env": { "RUBY_VERSION": "2.5.x" } },
"build": { "env": { "RUBY_VERSION": "2.7.x" } },
"probes": [{ "path": "/", "mustContain": "gem:RANDOMNESS_PLACEHOLDER" }]
}

View File

@@ -14,19 +14,17 @@ Gem::Specification.new do |s|
s.executables = ["cowsay".freeze]
s.files = ["bin/cowsay".freeze]
s.homepage = "https://github.com/moneydesktop/cowsay".freeze
s.rubygems_version = "3.0.3".freeze
s.rubygems_version = "3.2.22".freeze
s.summary = "ASCII art avatars emote your messages".freeze
s.installed_by_version = "3.0.3" if s.respond_to? :installed_by_version
s.installed_by_version = "3.2.22" if s.respond_to? :installed_by_version
if s.respond_to? :specification_version then
s.specification_version = 4
end
if Gem::Version.new(Gem::VERSION) >= Gem::Version.new('1.2.0') then
s.add_development_dependency(%q<rake>.freeze, [">= 0"])
else
s.add_dependency(%q<rake>.freeze, [">= 0"])
end
if s.respond_to? :add_runtime_dependency then
s.add_development_dependency(%q<rake>.freeze, [">= 0"])
else
s.add_dependency(%q<rake>.freeze, [">= 0"])
end

View File

@@ -0,0 +1,2 @@
---
BUNDLE_PATH: "vendor/bundle"

View File

@@ -1,7 +1,7 @@
{
"version": 2,
"builds": [{ "src": "project/index.rb", "use": "@vercel/ruby" }],
"build": { "env": { "RUBY_VERSION": "2.5.x" } },
"build": { "env": { "RUBY_VERSION": "2.7.x" } },
"probes": [
{ "path": "/project/", "mustContain": "gem:RANDOMNESS_PLACEHOLDER" }
]

View File

@@ -0,0 +1,2 @@
---
BUNDLE_PATH: "vendor/bundle"

Some files were not shown because too many files have changed in this diff.