Compare commits

...

38 Commits

Author SHA1 Message Date
Steven
32afd67d29 Publish Stable
- @vercel/build-utils@5.2.0
 - vercel@27.3.7
 - @vercel/client@12.1.10
 - @vercel/edge@0.0.3
 - @vercel/frameworks@1.1.3
 - @vercel/fs-detectors@2.0.5
 - @vercel/go@2.0.15
 - @vercel/hydrogen@0.0.12
 - @vercel/next@3.1.15
 - @vercel/node@2.5.6
 - @vercel/python@3.1.7
 - @vercel/redwood@1.0.16
 - @vercel/remix@1.0.17
 - @vercel/routing-utils@2.0.2
 - @vercel/ruby@1.3.23
 - @vercel/static-build@1.0.16
 - @vercel/static-config@2.0.3
2022-08-04 15:11:10 -04:00
Nathan Rajlich
7523e39f18 [tests] Remove leftover debugging .pipe() call in vc build tests (#8317) 2022-08-04 17:51:48 +00:00
Craig Andrews
99f2f2f1ba [build-utils] Add flag to indicate that a custom runtime supports lambda wrappers (#8324)
### 📋 Checklist

<!--
  Please keep your PR as a Draft until the checklist is complete
-->

#### Tests

- [ ] The code changed/added as part of this PR has been covered with tests
- [ ] All tests pass locally with `yarn test-unit`

#### Code Review

- [ ] This PR has a concise title and thorough description useful to a reviewer
- [ ] Issue from task tracker has a link to this PR
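
For context, a minimal usage sketch of the new flag; apart from `supportsWrapper` (added in this PR), the handler, runtime, and helper shown here are hypothetical.

```ts
import { glob, Lambda } from '@vercel/build-utils';

// Hypothetical build step for a custom runtime that opts in to lambda wrappers.
async function createOutput(workPath: string): Promise<Lambda> {
  const files = await glob('**', workPath); // collect the runtime's output files
  return new Lambda({
    files,
    handler: 'index.handler', // hypothetical entrypoint
    runtime: 'provided.al2', // hypothetical lambda runtime identifier
    supportsWrapper: true, // new flag: this runtime supports lambda wrappers
  });
}
```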
2022-08-04 17:23:55 +00:00
Steven
63830d38ce [cli] Fix vc secret rm (#8320)
Fixes a regression from the refactor in https://github.com/vercel/vercel/pull/8039 that was causing the following error:

```
Error! client.prompt is not a function
```
2022-08-04 13:01:01 -04:00
JJ Kasper
f3428dd212 [next] Remove old middleware test (#8323)
Remove old middleware test
2022-08-04 11:57:43 -05:00
Steven
5eb8b16cbd Publish Stable
- @vercel/build-utils@5.1.1
 - vercel@27.3.6
 - @vercel/client@12.1.9
 - @vercel/edge@0.0.2
 - @vercel/frameworks@1.1.2
 - @vercel/fs-detectors@2.0.4
 - @vercel/go@2.0.14
 - @vercel/hydrogen@0.0.11
 - @vercel/next@3.1.14
 - @vercel/node@2.5.5
 - @vercel/python@3.1.6
 - @vercel/redwood@1.0.15
 - @vercel/remix@1.0.16
 - @vercel/routing-utils@2.0.1
 - @vercel/ruby@1.3.22
 - @vercel/static-build@1.0.15
 - @vercel/static-config@2.0.2
2022-08-04 11:39:58 -04:00
JJ Kasper
226bf02be2 [next] Remove middleware regexp modifying (#8321)
x-ref: [slack thread](https://vercel.slack.com/archives/C03SF65BYSG/p1659626639087909)
2022-08-04 11:39:21 -04:00
Steven
8505872f55 [tests] Update package.json scripts (#8318)
This PR consolidates all the `test` scripts to be the same and removes the `prepublishOnly` script since we always run `build` before publishing to npm.
2022-08-04 11:02:56 -04:00
Steven
7db6436797 Publish Stable
- @vercel/build-utils@5.1.0
 - vercel@27.3.5
 - @vercel/client@12.1.8
 - @vercel/go@2.0.13
 - @vercel/hydrogen@0.0.10
 - @vercel/next@3.1.13
 - @vercel/node@2.5.4
 - @vercel/python@3.1.5
 - @vercel/redwood@1.0.14
 - @vercel/remix@1.0.15
 - @vercel/ruby@1.3.21
 - @vercel/static-build@1.0.14
2022-08-04 08:37:04 -04:00
Chris Barber
e2d76e9c92 [cli] recreate symlinked files instead of copying (#8270)
Instead of copying symlinked files during a build, recreate the symlink.
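
The core of the approach looks roughly like the following sketch (helper name hypothetical, standard Node.js `fs` APIs):

```ts
import { promises as fs } from 'fs';

// Hypothetical helper: recreate a symlink at `dest` rather than copying
// the file it points to.
async function recreateSymlink(src: string, dest: string): Promise<void> {
  const target = await fs.readlink(src); // preserve the original link target
  await fs.symlink(target, dest);
}
```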

### 📋 Checklist

#### Tests

- [x] The code changed/added as part of this PR has been covered with tests
- [x] All tests pass locally with `yarn test-unit`

#### Code Review

- [ ] This PR has a concise title and thorough description useful to a reviewer
- [ ] Issue from task tracker has a link to this PR
2022-08-04 04:07:33 +00:00
Nathan Rajlich
337cb21d67 [cli] Fix flaky error in builds.json tests (#8316)
Follow-up to #8305.

The `expect().toOutput()` call was frequently timing out before the command
wrote the error to the terminal, causing the test to fail. So now the test
waits for the command to return its exit code before running assertions on
the printed output.
2022-08-03 20:38:45 -07:00
JJ Kasper
6bfff3e9eb [next] Remove unnecessary duplicate i18n route (#8313)
* Remove unnecessary duplicate i18n route

* update
2022-08-03 21:02:19 -05:00
Sean Massa
ac5b259c11 [go] refactor away use of downloaded result (#8291)
Review Notes: Turn off diff whitespace.

### Refactor

This PR refactors away the use of the result of `download` for a couple of reasons:
- Keeping `files` and `downloadedFiles` in sync with the file system (like when we rename a file that starts with a bracket) is easy to forget to do, causing bugs
- Nate says that `files` is something we've wanted to move away from using anyway
- It simplifies the code in a few places
- It was getting in the way of other fixes that need to be made

We do still call `download`, but it should be a no-op most of the time.
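
In practice, that means the builder re-reads the work path from disk instead of holding on to the map returned by `download`; a rough sketch (helper name and types simplified, while `download`, `glob`, and `Files` are real `@vercel/build-utils` exports):

```ts
import { download, glob, Files } from '@vercel/build-utils';

// Hypothetical sketch: ensure sources exist on disk (usually a no-op),
// then read the current filesystem state rather than the returned file map.
async function prepareGoSources(files: Files, workPath: string): Promise<Files> {
  await download(files, workPath);
  return glob('**/*.go', workPath);
}
```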

As a consequence of these changes, this PR also addresses:
- the builder no longer leaves build artifacts around, in many cases
- the builder can compile files that start with brackets again; routes don't seem to allow such a file to respond to dynamic segments yet, though

### Next Steps

Upcoming PRs will resolve builder issues:

- bracket endpoints responding to dynamic segments
- exported function name conflict handling
- compilation targets should only apply to the source code build, not the analyze go utility

### Operating In-place

We also now have a cleanup step that clears out created files, created directories, and undoes file renames. This fixes an issue where multiple builds on the same directory would fail. It also cleans the user's project code when they are using `vc build`.
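
A minimal sketch of that bookkeeping (class and method names hypothetical): record every created file, created directory, and rename, then undo them in reverse order.

```ts
import { promises as fs } from 'fs';

// Hypothetical cleanup tracker for filesystem changes made during a build.
class BuildCleanup {
  private undos: Array<() => Promise<void>> = [];

  createdFile(path: string) {
    this.undos.push(() => fs.unlink(path));
  }
  createdDir(path: string) {
    this.undos.push(() => fs.rmdir(path));
  }
  renamed(from: string, to: string) {
    this.undos.push(() => fs.rename(to, from)); // restore the original name
  }
  async run(): Promise<void> {
    // Undo in reverse order so nested paths are removed before their parents.
    for (const undo of this.undos.reverse()) {
      await undo();
    }
  }
}
```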

Ideally, we'd probably copy all of the code to a separate location, then freely do filesystem operations there. It's not clear to me if this is preferred for large projects because it would have to happen once per endpoint: 100 Go files would cause 10,000 (100 * 100) file copies (or symlinks). 

It has to copy once per endpoint because we potentially need all of the files around in case any of them are imported. If we had nft-style tracing for Go, we could copy only what we needed.

This gets more complex in the next step where the exported function names will be renamed during compilation to fix the name conflict issue.
2022-08-04 01:11:35 +00:00
Steven
bfc553db11 [tests] Add support for probes.json (#8279)
Previously, our test fixtures used a `probes` prop in `vercel.json` that was removed right before the fixture was deployed.

This PR supports a separate `probes.json` file with the same content, keeping the test fixture input separate from the test probes.

This allows us to test real "zero config" deployments without a `vercel.json` file.
2022-08-03 20:15:50 -04:00
Nathan Rajlich
2b101d4692 [cli] Remove legacy config file migration logic (#8199)
Removes the legacy config file migration logic from back in the days when Zeit CLI supported multiple "providers". This was from a _very_ long time ago, and we can assume that anyone who was going to migrate has done so by now.
2022-08-03 23:48:17 +00:00
Matthew Stanciu
3316f38cb4 [cli] Strip scheme from vc inspect argument (#8307)
Right now, `vc inspect` fails to find a deployment if you include `http://` before it, although it works with no scheme and with `https://`.

Since the API appears to expect no scheme anyway, and to avoid confusion, this PR strips any included scheme from the `deploymentIdOrHost` argument.
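
The stripping itself is small; a sketch of the idea (function name hypothetical):

```ts
// Drop an optional `http://` or `https://` prefix so only the deployment
// id or host is passed to the API.
function stripScheme(deploymentIdOrHost: string): string {
  return deploymentIdOrHost.replace(/^https?:\/\//i, '');
}

stripScheme('https://my-app.vercel.app'); // => 'my-app.vercel.app'
```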

### 📋 Checklist

<!--
  Please keep your PR as a Draft until the checklist is complete
-->

#### Tests

- [x] The code changed/added as part of this PR has been covered with tests
- [x] All tests pass locally with `yarn test-unit`

#### Code Review

- [x] This PR has a concise title and thorough description useful to a reviewer
- [x] Issue from task tracker has a link to this PR
2022-08-03 22:48:44 +00:00
Nathan Rajlich
7837387127 [cli] Print error from Builder in vc build (#8305)
Ensures that errors that are serialized into `builds.json` are also printed to the terminal when running the `vc build` command.

Co-authored-by: Steven <steven@ceriously.com>
2022-08-03 17:32:50 -04:00
Thomas Knickman
f478200dd3 [static-build] set TURBO_CI_VENDOR_ENV_KEY environment variable (#8306)
Set `TURBO_CI_VENDOR_ENV_KEY` to support https://github.com/vercel/turborepo/pull/1622
2022-08-03 17:31:18 -04:00
Matthew Stanciu
c29de8206a [cli] Minor vc env pull diff formatting changes (#8303)
#8170 added a new message at the end of `vc env pull` which shows a delta of what was added, modified, and removed. Some people shared feedback that the yellow chalk color and `~` prefix used to indicate modified variables were confusing. This PR instead keeps the prefix as `+` with a green color, but adds a `(Modified)` suffix at the end of every modified variable.

<img width="638" alt="Screen Shot 2022-08-03 at 10 18 28 AM" src="https://user-images.githubusercontent.com/14811170/182670327-5a3df6db-d84d-40a1-956b-9cf159501759.png">

### 📋 Checklist

<!--
  Please keep your PR as a Draft until the checklist is complete
-->

#### Tests

- [x] The code changed/added as part of this PR has been covered with tests
- [x] All tests pass locally with `yarn test-unit`

#### Code Review

- [ ] This PR has a concise title and thorough description useful to a reviewer
- [ ] Issue from task tracker has a link to this PR
2022-08-03 20:04:03 +00:00
JJ Kasper
a2df3b5463 [next] Update data route handling for i18n and static routes (#8304)
* Ensure dynamic data route handles missing default locale path

* Ensure static data routes are still handled
2022-08-03 13:40:00 -05:00
Steven
73446e544a [python] Fix error message for discontinued Python 3.6 (#8300)
This fixes the error message shown when a discontinued version of Python (for example, Python 3.6) is detected.

https://vercel.com/changelog/python-3-6-is-being-deprecated
2022-08-03 13:09:03 -04:00
JJ Kasper
21ff4a58c3 [next] Ensure we resolve _next/data dynamic routes correctly with i18n (#8297)
* Ensure we resolve _next/data dynamic routes correctly with i18n

* remove test version
2022-08-03 09:03:15 -05:00
JJ Kasper
2b9eb02b8c [next] Fix _next/data resolving priority for dynamic routes (#8278)
* Fix _next/data resolving priority for dynamic routes

* Apply suggestions from code review

* Ensure we match middleware for _next/data without header

* fix nested middleware case

* Update data routes generating

* Add version lock for non-nested middleware

* use path.posix
2022-08-02 17:41:06 -05:00
JJ Kasper
4ef4722460 [next] Fix priority for notFound preview routes (#7902)
Fix priority for notFound preview routes
2022-08-02 17:09:09 -05:00
Sean Massa
be5308b137 [dev] log middleware errors in vc dev (#8267)
Middleware server setup wasn't logging errors the same way that dev server setup was. This meant that middleware instantiation errors (like an invalid config) would cause requests to 500 without any error being logged to the console.

This PR updates the invalid config error, makes sure errors in this area are logged out, and adds a test for this behavior.

**It may be appropriate to fail the deploy (and crash `vc dev`) in this case instead, though. What do others think?**

---

During `vc dev` with middleware that has an invalid `config.matcher` value...

Before: You see a 500 response in the browser and no output in the terminal.

After: You see a 500 response in the browser and this output in the terminal:

```
Error! Middleware's `config.matcher` values must start with "/". Received: not-a-valid-matcher
```
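
A minimal sketch of the validation behind that message (helper name hypothetical; the error text is taken from the output above):

```ts
// Hypothetical check for middleware `config.matcher` entries.
function assertValidMatchers(matchers: string[]): void {
  for (const matcher of matchers) {
    if (!matcher.startsWith('/')) {
      throw new Error(
        `Middleware's \`config.matcher\` values must start with "/". Received: ${matcher}`
      );
    }
  }
}
```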

---

Related Issue: https://github.com/vercel/edge-functions/issues/220
2022-08-02 20:01:42 +00:00
Steven
08a83a94f8 [docs] Link to Build Output API docs (#8292)
* [docs] Link to Build Output API docs

Co-authored-by: Sean Massa <EndangeredMassa@gmail.com>
2022-08-02 12:35:36 -04:00
Steven
543ffdfe5c Publish Stable
- @vercel/build-utils@5.0.8
 - vercel@27.3.4
 - @vercel/client@12.1.7
 - @vercel/fs-detectors@2.0.3
 - @vercel/go@2.0.12
 - @vercel/hydrogen@0.0.9
 - @vercel/next@3.1.12
 - @vercel/node@2.5.3
 - @vercel/python@3.1.4
 - @vercel/redwood@1.0.13
 - @vercel/remix@1.0.14
 - @vercel/ruby@1.3.20
 - @vercel/static-build@1.0.13
2022-08-02 09:29:15 -04:00
Steven
c11527e904 [build-utils] Fix symlink on download (#8288)
Some builders, such as `@vercel/next`, return both the symlinked directory and the resolved file.

When `vc build` iterates over the files to recreate them in `.vercel/output`, it fails with `EEXIST: file already exists` when creating the symlink because it first creates the file `node_modules/<symlink>/package.json` and then attempts to create the symlink `node_modules/<symlink>`.

This happened to work before `vc build` because yazl would accept the symlinked directory instead of the `package.json` file, so this PR matches that behavior.
2022-08-01 23:35:24 +00:00
Sean Massa
d296064386 [go] remove meta.isDev references and restore partial tests (#8287)
Remove `meta.isDev` checks inside the `build` function because it is never set there; `startDevServer` is used instead.

Also restored the Go tests in a partial form. Will fix the Go builder issues and make sure those features are tested completely in follow-up PRs.
2022-08-01 19:52:34 +00:00
Lee Robinson
400a6c42bd [examples] Update Docusaurus 2 template for stable (#8286) 2022-08-01 18:48:24 +00:00
Seiya Nuta
71b3ded398 [fs-detectors] Exclude the middleware builder if it's a Next.js app (#8239) 2022-08-01 08:18:29 -04:00
Sean Massa
fc3fa61b59 update edge-runtime to allow instanceof to work with primitives (#8242)
* update edge-runtime to allow `instanceof` to work with primitives

* finish upgrading edge-runtime

* update to latest

* fix merge

* remove dev only from test
2022-07-29 16:11:37 -05:00
Sean Massa
1f98c4fee7 Publish Stable
- vercel@27.3.3
2022-07-29 14:20:13 -05:00
Matthew Stanciu
1cb5a91727 [cli] Fix env delta message (#8271)
Co-authored-by: Sean Massa <EndangeredMassa@gmail.com>
Co-authored-by: kodiakhq[bot] <49736102+kodiakhq[bot]@users.noreply.github.com>
2022-07-29 14:18:55 -05:00
Sean Massa
e8c7db59cf Publish Stable
- @vercel/build-utils@5.0.7
 - vercel@27.3.2
 - @vercel/client@12.1.6
 - @vercel/go@2.0.11
 - @vercel/hydrogen@0.0.8
 - @vercel/next@3.1.11
 - @vercel/node@2.5.2
 - @vercel/python@3.1.3
 - @vercel/redwood@1.0.12
 - @vercel/remix@1.0.13
 - @vercel/ruby@1.3.19
 - @vercel/static-build@1.0.12
2022-07-29 13:45:48 -05:00
Steven
57b230e25f [all] Revert ncc back to 0.24.0 (#8276) 2022-07-29 13:44:09 -05:00
Steven
ab3fb25790 [tests] Run dev e2e against same cli version (#8274)
The dev integration tests compare `vc dev` with a real deployment to make sure the results are the same. This PR ensures the deployment uses the same version of Vercel CLI as the local `vc dev` instance.

Co-authored-by: Sean Massa <EndangeredMassa@gmail.com>
2022-07-29 13:19:36 -04:00
Steven
88d98f7497 [cli] Fix vc build with legacy @now/static (#8273)
There was a special case for `@vercel/static`, but `vc build` was missing one for the legacy `@now/static`, which should work the same way.
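
Conceptually, the fix extends the existing special case to cover both package names; a hedged sketch (helper name hypothetical):

```ts
// Treat the legacy `@now/static` the same as `@vercel/static`.
function isStaticBuilder(use: string): boolean {
  return use === '@vercel/static' || use === '@now/static';
}
```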
2022-07-29 11:40:09 -04:00
340 changed files with 7963 additions and 12405 deletions

View File

@@ -1,7 +1,9 @@
# Runtime Developer Reference
The following page is a reference for how to create a Runtime by implementing
the Runtime API interface.
the Runtime API interface. It's a way to add support for a new programming language to Vercel.
> Note: If you're the author of a web framework, please use the [Build Output API](https://vercel.com/docs/build-output-api/v3) instead to make your framework compatible with Vercel.
A Runtime is an npm module that implements the following interface:
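
For orientation, a minimal sketch of what such a module exports; the shape and types here are simplified assumptions based on the public Runtime API, not the full interface.

```ts
import type { BuildOptions } from '@vercel/build-utils';
import { Lambda } from '@vercel/build-utils';

// Simplified sketch of a Runtime module (not the complete interface).
export const version = 3;

export async function build(options: BuildOptions): Promise<{ output: Lambda }> {
  // Compile `options.entrypoint` from `options.workPath` into a Lambda here.
  throw new Error('not implemented');
}
```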

View File

@@ -15,5 +15,5 @@ _Live Example: https://docusaurus-2-template.vercel.app_
To get started with Docusaurus on Vercel, you can use the [Docusaurus CLI](https://v2.docusaurus.io/docs/installation#scaffold-project-website) to initialize the project:
```shell
$ npx @docusaurus/init@next init my-website classic
npx create-docusaurus@latest my-website classic
```

View File

@@ -0,0 +1,3 @@
module.exports = {
presets: [require.resolve('@docusaurus/core/lib/babel/preset')],
};

View File

@@ -1,10 +1,11 @@
---
id: hola
title: Hola
author: Gao Wei
author_title: Docusaurus Core Team
author_url: https://github.com/wgao19
author_image_url: https://avatars1.githubusercontent.com/u/2055384?v=4
slug: first-blog-post
title: First Blog Post
authors:
name: Gao Wei
title: Docusaurus Core Team
url: https://github.com/wgao19
image_url: https://github.com/wgao19.png
tags: [hola, docusaurus]
---

View File

@@ -1,17 +0,0 @@
---
id: hello-world
title: Hello
author: Endilie Yacop Sucipto
author_title: Maintainer of Docusaurus
author_url: https://github.com/endiliey
author_image_url: https://avatars1.githubusercontent.com/u/17883920?s=460&v=4
tags: [hello, docusaurus]
---
Welcome to this blog. This blog is created with [**Docusaurus 2 alpha**](https://v2.docusaurus.io/).
<!--truncate-->
This is a test post.
A whole bunch of other information.

View File

@@ -0,0 +1,44 @@
---
slug: long-blog-post
title: Long Blog Post
authors: endi
tags: [hello, docusaurus]
---
This is the summary of a very long blog post,
Use a `<!--` `truncate` `-->` comment to limit blog post size in the list view.
<!--truncate-->
Lorem ipsum dolor sit amet, consectetur adipiscing elit. Pellentesque elementum dignissim ultricies. Fusce rhoncus ipsum tempor eros aliquam consequat. Lorem ipsum dolor sit amet
Lorem ipsum dolor sit amet, consectetur adipiscing elit. Pellentesque elementum dignissim ultricies. Fusce rhoncus ipsum tempor eros aliquam consequat. Lorem ipsum dolor sit amet
Lorem ipsum dolor sit amet, consectetur adipiscing elit. Pellentesque elementum dignissim ultricies. Fusce rhoncus ipsum tempor eros aliquam consequat. Lorem ipsum dolor sit amet
Lorem ipsum dolor sit amet, consectetur adipiscing elit. Pellentesque elementum dignissim ultricies. Fusce rhoncus ipsum tempor eros aliquam consequat. Lorem ipsum dolor sit amet
Lorem ipsum dolor sit amet, consectetur adipiscing elit. Pellentesque elementum dignissim ultricies. Fusce rhoncus ipsum tempor eros aliquam consequat. Lorem ipsum dolor sit amet
Lorem ipsum dolor sit amet, consectetur adipiscing elit. Pellentesque elementum dignissim ultricies. Fusce rhoncus ipsum tempor eros aliquam consequat. Lorem ipsum dolor sit amet
Lorem ipsum dolor sit amet, consectetur adipiscing elit. Pellentesque elementum dignissim ultricies. Fusce rhoncus ipsum tempor eros aliquam consequat. Lorem ipsum dolor sit amet
Lorem ipsum dolor sit amet, consectetur adipiscing elit. Pellentesque elementum dignissim ultricies. Fusce rhoncus ipsum tempor eros aliquam consequat. Lorem ipsum dolor sit amet
Lorem ipsum dolor sit amet, consectetur adipiscing elit. Pellentesque elementum dignissim ultricies. Fusce rhoncus ipsum tempor eros aliquam consequat. Lorem ipsum dolor sit amet
Lorem ipsum dolor sit amet, consectetur adipiscing elit. Pellentesque elementum dignissim ultricies. Fusce rhoncus ipsum tempor eros aliquam consequat. Lorem ipsum dolor sit amet
Lorem ipsum dolor sit amet, consectetur adipiscing elit. Pellentesque elementum dignissim ultricies. Fusce rhoncus ipsum tempor eros aliquam consequat. Lorem ipsum dolor sit amet
Lorem ipsum dolor sit amet, consectetur adipiscing elit. Pellentesque elementum dignissim ultricies. Fusce rhoncus ipsum tempor eros aliquam consequat. Lorem ipsum dolor sit amet
Lorem ipsum dolor sit amet, consectetur adipiscing elit. Pellentesque elementum dignissim ultricies. Fusce rhoncus ipsum tempor eros aliquam consequat. Lorem ipsum dolor sit amet
Lorem ipsum dolor sit amet, consectetur adipiscing elit. Pellentesque elementum dignissim ultricies. Fusce rhoncus ipsum tempor eros aliquam consequat. Lorem ipsum dolor sit amet
Lorem ipsum dolor sit amet, consectetur adipiscing elit. Pellentesque elementum dignissim ultricies. Fusce rhoncus ipsum tempor eros aliquam consequat. Lorem ipsum dolor sit amet
Lorem ipsum dolor sit amet, consectetur adipiscing elit. Pellentesque elementum dignissim ultricies. Fusce rhoncus ipsum tempor eros aliquam consequat. Lorem ipsum dolor sit amet

View File

@@ -1,13 +0,0 @@
---
id: welcome
title: Welcome
author: Yangshun Tay
author_title: Front End Engineer @ Facebook
author_url: https://github.com/yangshun
author_image_url: https://avatars0.githubusercontent.com/u/1315101?s=400&v=4
tags: [facebook, hello, docusaurus]
---
Blog features are powered by the blog plugin. Simply add files to the `blog` directory. It supports tags as well!
Delete the whole directory if you don't want the blog features. As simple as that!

View File

@@ -0,0 +1,20 @@
---
slug: mdx-blog-post
title: MDX Blog Post
authors: [slorber]
tags: [docusaurus]
---
Blog posts support [Docusaurus Markdown features](https://docusaurus.io/docs/markdown-features), such as [MDX](https://mdxjs.com/).
:::tip
Use the power of React to create interactive blog posts.
```js
<button onClick={() => alert('button clicked!')}>Click me!</button>
```
<button onClick={() => alert('button clicked!')}>Click me!</button>
:::

Binary file not shown. (After: 94 KiB)

View File

@@ -0,0 +1,25 @@
---
slug: welcome
title: Welcome
authors: [slorber, yangshun]
tags: [facebook, hello, docusaurus]
---
[Docusaurus blogging features](https://docusaurus.io/docs/blog) are powered by the [blog plugin](https://docusaurus.io/docs/api/plugins/@docusaurus/plugin-content-blog).
Simply add Markdown files (or folders) to the `blog` directory.
Regular blog authors can be added to `authors.yml`.
The blog post date can be extracted from filenames, such as:
- `2019-05-30-welcome.md`
- `2019-05-30-welcome/index.md`
A blog post folder can be convenient to co-locate blog post images:
![Docusaurus Plushie](./docusaurus-plushie-banner.jpeg)
The blog supports tags as well!
**And if you don't want a blog**: just delete this directory, and use `blog: false` in your Docusaurus config.

View File

@@ -0,0 +1,17 @@
endi:
name: Endilie Yacop Sucipto
title: Maintainer of Docusaurus
url: https://github.com/endiliey
image_url: https://github.com/endiliey.png
yangshun:
name: Yangshun Tay
title: Front End Engineer @ Facebook
url: https://github.com/yangshun
image_url: https://github.com/yangshun.png
slorber:
name: Sébastien Lorber
title: Docusaurus maintainer
url: https://sebastienlorber.com
image_url: https://github.com/slorber.png

View File

@@ -1,202 +0,0 @@
---
id: doc1
title: Style Guide
sidebar_label: Style Guide
---
You can write content using [GitHub-flavored Markdown syntax](https://github.github.com/gfm/).
## Markdown Syntax
To serve as an example page when styling markdown based Docusaurus sites.
## Headers
# H1 - Create the best documentation
## H2 - Create the best documentation
### H3 - Create the best documentation
#### H4 - Create the best documentation
##### H5 - Create the best documentation
###### H6 - Create the best documentation
---
## Emphasis
Emphasis, aka italics, with _asterisks_ or _underscores_.
Strong emphasis, aka bold, with **asterisks** or **underscores**.
Combined emphasis with **asterisks and _underscores_**.
Strikethrough uses two tildes. ~~Scratch this.~~
---
## Lists
1. First ordered list item
1. Another item ⋅⋅\* Unordered sub-list.
1. Actual numbers don't matter, just that it's a number ⋅⋅1. Ordered sub-list
1. And another item.
⋅⋅⋅You can have properly indented paragraphs within list items. Notice the blank line above, and the leading spaces (at least one, but we'll use three here to also align the raw Markdown).
⋅⋅⋅To have a line break without a paragraph, you will need to use two trailing spaces.⋅⋅ ⋅⋅⋅Note that this line is separate, but within the same paragraph.⋅⋅ ⋅⋅⋅(This is contrary to the typical GFM line break behaviour, where trailing spaces are not required.)
- Unordered list can use asterisks
* Or minuses
- Or pluses
---
## Links
[I'm an inline-style link](https://www.google.com)
[I'm an inline-style link with title](https://www.google.com "Google's Homepage")
[I'm a reference-style link][arbitrary case-insensitive reference text]
[I'm a relative reference to a repository file](../blob/master/LICENSE)
[You can use numbers for reference-style link definitions][1]
Or leave it empty and use the [link text itself].
URLs and URLs in angle brackets will automatically get turned into links. http://www.example.com or <http://www.example.com> and sometimes example.com (but not on Github, for example).
Some text to show that the reference links can follow later.
[arbitrary case-insensitive reference text]: https://www.mozilla.org
[1]: http://slashdot.org
[link text itself]: http://www.reddit.com
---
## Images
Here's our logo (hover to see the title text):
Inline-style: ![alt text](https://github.com/adam-p/markdown-here/raw/master/src/common/images/icon48.png 'Logo Title Text 1')
Reference-style: ![alt text][logo]
[logo]: https://github.com/adam-p/markdown-here/raw/master/src/common/images/icon48.png 'Logo Title Text 2'
---
## Code
```javascript
var s = 'JavaScript syntax highlighting';
alert(s);
```
```python
s = "Python syntax highlighting"
print(s)
```
```
No language indicated, so no syntax highlighting.
But let's throw in a <b>tag</b>.
```
```js {2}
function highlightMe() {
console.log('This line can be highlighted!');
}
```
---
## Tables
Colons can be used to align columns.
| Tables | Are | Cool |
| ------------- | :-----------: | -----: |
| col 3 is | right-aligned | \$1600 |
| col 2 is | centered | \$12 |
| zebra stripes | are neat | \$1 |
There must be at least 3 dashes separating each header cell. The outer pipes (|) are optional, and you don't need to make the raw Markdown line up prettily. You can also use inline Markdown.
| Markdown | Less | Pretty |
| -------- | --------- | ---------- |
| _Still_ | `renders` | **nicely** |
| 1 | 2 | 3 |
---
## Blockquotes
> Blockquotes are very handy in email to emulate reply text. This line is part of the same quote.
Quote break.
> This is a very long line that will still be quoted properly when it wraps. Oh boy let's keep writing to make sure this is long enough to actually wrap for everyone. Oh, you can _put_ **Markdown** into a blockquote.
---
## Inline HTML
<dl>
<dt>Definition list</dt>
<dd>Is something people use sometimes.</dd>
<dt>Markdown in HTML</dt>
<dd>Does *not* work **very** well. Use HTML <em>tags</em>.</dd>
</dl>
---
## Line Breaks
Here's a line for us to start with.
This line is separated from the one above by two newlines, so it will be a _separate paragraph_.
This line is also a separate paragraph, but... This line is only separated by a single newline, so it's a separate line in the _same paragraph_.
---
## Admonitions
:::note
This is a note
:::
:::tip
This is a tip
:::
:::important
This is important
:::
:::caution
This is a caution
:::
:::warning
This is a warning
:::

View File

@@ -1,6 +0,0 @@
---
id: doc2
title: Document Number 2
---
This is a link to [another document.](doc3.md) This is a link to an [external page.](http://www.example.com)

View File

@@ -1,14 +0,0 @@
---
id: doc3
title: This is Document Number 3
---
Lorem ipsum dolor sit amet, consectetur adipiscing elit. In ac euismod odio, eu consequat dui. Nullam molestie consectetur risus id imperdiet. Proin sodales ornare turpis, non mollis massa ultricies id. Nam at nibh scelerisque, feugiat ante non, dapibus tortor. Vivamus volutpat diam quis tellus elementum bibendum. Praesent semper gravida velit quis aliquam. Etiam in cursus neque. Nam lectus ligula, malesuada et mauris a, bibendum faucibus mi. Phasellus ut interdum felis. Phasellus in odio pulvinar, porttitor urna eget, fringilla lectus. Aliquam sollicitudin est eros. Mauris consectetur quam vitae mauris interdum hendrerit. Lorem ipsum dolor sit amet, consectetur adipiscing elit.
Duis et egestas libero, imperdiet faucibus ipsum. Sed posuere eget urna vel feugiat. Vivamus a arcu sagittis, fermentum urna dapibus, congue lectus. Fusce vulputate porttitor nisl, ac cursus elit volutpat vitae. Nullam vitae ipsum egestas, convallis quam non, porta nibh. Morbi gravida erat nec neque bibendum, eu pellentesque velit posuere. Fusce aliquam erat eu massa eleifend tristique.
Sed consequat sollicitudin ipsum eget tempus. Integer a aliquet velit. In justo nibh, pellentesque non suscipit eget, gravida vel lacus. Donec odio ante, malesuada in massa quis, pharetra tristique ligula. Donec eros est, tristique eget finibus quis, semper non nisl. Vivamus et elit nec enim ornare placerat. Sed posuere odio a elit cursus sagittis.
Phasellus feugiat purus eu tortor ultrices finibus. Ut libero nibh, lobortis et libero nec, dapibus posuere eros. Sed sagittis euismod justo at consectetur. Nulla finibus libero placerat, cursus sapien at, eleifend ligula. Vivamus elit nisl, hendrerit ac nibh eu, ultrices tempus dui. Nam tellus neque, commodo non rhoncus eu, gravida in risus. Nullam id iaculis tortor.
Nullam at odio in sem varius tempor sit amet vel lorem. Etiam eu hendrerit nisl. Fusce nibh mauris, vulputate sit amet ex vitae, congue rhoncus nisl. Sed eget tellus purus. Nullam tempus commodo erat ut tristique. Cras accumsan massa sit amet justo consequat eleifend. Integer scelerisque vitae tellus id consectetur.

View File

@@ -0,0 +1,47 @@
---
sidebar_position: 1
---
# Tutorial Intro
Let's discover **Docusaurus in less than 5 minutes**.
## Getting Started
Get started by **creating a new site**.
Or **try Docusaurus immediately** with **[docusaurus.new](https://docusaurus.new)**.
### What you'll need
- [Node.js](https://nodejs.org/en/download/) version 16.14 or above:
- When installing Node.js, you are recommended to check all checkboxes related to dependencies.
## Generate a new site
Generate a new Docusaurus site using the **classic template**.
The classic template will automatically be added to your project after you run the command:
```bash
npm init docusaurus@latest my-website classic
```
You can type this command into Command Prompt, Powershell, Terminal, or any other integrated terminal of your code editor.
The command also installs all necessary dependencies you need to run Docusaurus.
## Start your site
Run the development server:
```bash
cd my-website
npm run start
```
The `cd` command changes the directory you're working with. In order to work with your newly created Docusaurus site, you'll need to navigate the terminal there.
The `npm run start` command builds your website locally and serves it through a development server, ready for you to view at http://localhost:3000/.
Open `docs/intro.md` (this page) and edit some lines: the site **reloads automatically** and displays your changes.

View File

@@ -1,17 +0,0 @@
---
id: mdx
title: Powered by MDX
---
You can write JSX and use React components within your Markdown thanks to [MDX](https://mdxjs.com/).
export const Highlight = ({children, color}) => ( <span style={{
backgroundColor: color,
borderRadius: '2px',
color: '#fff',
padding: '0.2rem',
}}> {children} </span> );
<Highlight color="#25c2a0">Docusaurus green</Highlight> and <Highlight color="#1877F2">Facebook blue</Highlight> are my favorite colors.
I can write **Markdown** alongside my _JSX_!

View File

@@ -0,0 +1,8 @@
{
"label": "Tutorial - Basics",
"position": 2,
"link": {
"type": "generated-index",
"description": "5 minutes to learn the most important Docusaurus concepts."
}
}

View File

@@ -0,0 +1,21 @@
---
sidebar_position: 6
---
# Congratulations!
You have just learned the **basics of Docusaurus** and made some changes to the **initial template**.
Docusaurus has **much more to offer**!
Have **5 more minutes**? Take a look at **[versioning](../tutorial-extras/manage-docs-versions.md)** and **[i18n](../tutorial-extras/translate-your-site.md)**.
Anything **unclear** or **buggy** in this tutorial? [Please report it!](https://github.com/facebook/docusaurus/discussions/4610)
## What's next?
- Read the [official documentation](https://docusaurus.io/).
- Add a custom [Design and Layout](https://docusaurus.io/docs/styling-layout)
- Add a [search bar](https://docusaurus.io/docs/search)
- Find inspirations in the [Docusaurus showcase](https://docusaurus.io/showcase)
- Get involved in the [Docusaurus Community](https://docusaurus.io/community/support)

View File

@@ -0,0 +1,34 @@
---
sidebar_position: 3
---
# Create a Blog Post
Docusaurus creates a **page for each blog post**, but also a **blog index page**, a **tag system**, an **RSS** feed...
## Create your first Post
Create a file at `blog/2021-02-28-greetings.md`:
```md title="blog/2021-02-28-greetings.md"
---
slug: greetings
title: Greetings!
authors:
- name: Joel Marcey
title: Co-creator of Docusaurus 1
url: https://github.com/JoelMarcey
image_url: https://github.com/JoelMarcey.png
- name: Sébastien Lorber
title: Docusaurus maintainer
url: https://sebastienlorber.com
image_url: https://github.com/slorber.png
tags: [greetings]
---
Congratulations, you have made your first post!
Feel free to play around and edit this post as much you like.
```
A new blog post is now available at [http://localhost:3000/blog/greetings](http://localhost:3000/blog/greetings).

View File

@@ -0,0 +1,55 @@
---
sidebar_position: 2
---
# Create a Document
Documents are **groups of pages** connected through:
- a **sidebar**
- **previous/next navigation**
- **versioning**
## Create your first Doc
Create a Markdown file at `docs/hello.md`:
```md title="docs/hello.md"
# Hello
This is my **first Docusaurus document**!
```
A new document is now available at [http://localhost:3000/docs/hello](http://localhost:3000/docs/hello).
## Configure the Sidebar
Docusaurus automatically **creates a sidebar** from the `docs` folder.
Add metadata to customize the sidebar label and position:
```md title="docs/hello.md" {1-4}
---
sidebar_label: 'Hi!'
sidebar_position: 3
---
# Hello
This is my **first Docusaurus document**!
```
It is also possible to create your sidebar explicitly in `sidebars.js`:
```js title="sidebars.js"
module.exports = {
tutorialSidebar: [
{
type: 'category',
label: 'Tutorial',
// highlight-next-line
items: ['hello'],
},
],
};
```

View File

@@ -0,0 +1,43 @@
---
sidebar_position: 1
---
# Create a Page
Add **Markdown or React** files to `src/pages` to create a **standalone page**:
- `src/pages/index.js` → `localhost:3000/`
- `src/pages/foo.md` → `localhost:3000/foo`
- `src/pages/foo/bar.js` → `localhost:3000/foo/bar`
## Create your first React Page
Create a file at `src/pages/my-react-page.js`:
```jsx title="src/pages/my-react-page.js"
import React from 'react';
import Layout from '@theme/Layout';
export default function MyReactPage() {
return (
<Layout>
<h1>My React page</h1>
<p>This is a React page</p>
</Layout>
);
}
```
A new page is now available at [http://localhost:3000/my-react-page](http://localhost:3000/my-react-page).
## Create your first Markdown Page
Create a file at `src/pages/my-markdown-page.md`:
```mdx title="src/pages/my-markdown-page.md"
# My Markdown page
This is a Markdown page
```
A new page is now available at [http://localhost:3000/my-markdown-page](http://localhost:3000/my-markdown-page).

View File

@@ -0,0 +1,31 @@
---
sidebar_position: 5
---
# Deploy your site
Docusaurus is a **static-site-generator** (also called **[Jamstack](https://jamstack.org/)**).
It builds your site as simple **static HTML, JavaScript and CSS files**.
## Build your site
Build your site **for production**:
```bash
npm run build
```
The static files are generated in the `build` folder.
## Deploy your site
Test your production build locally:
```bash
npm run serve
```
The `build` folder is now served at [http://localhost:3000/](http://localhost:3000/).
You can now deploy the `build` folder **almost anywhere** easily, **for free** or very small cost (read the **[Deployment Guide](https://docusaurus.io/docs/deployment)**).

View File

@@ -0,0 +1,146 @@
---
sidebar_position: 4
---
# Markdown Features
Docusaurus supports **[Markdown](https://daringfireball.net/projects/markdown/syntax)** and a few **additional features**.
## Front Matter
Markdown documents have metadata at the top called [Front Matter](https://jekyllrb.com/docs/front-matter/):
```text title="my-doc.md"
// highlight-start
---
id: my-doc-id
title: My document title
description: My document description
slug: /my-custom-url
---
// highlight-end
## Markdown heading
Markdown text with [links](./hello.md)
```
## Links
Regular Markdown links are supported, using url paths or relative file paths.
```md
Let's see how to [Create a page](/create-a-page).
```
```md
Let's see how to [Create a page](./create-a-page.md).
```
**Result:** Let's see how to [Create a page](./create-a-page.md).
## Images
Regular Markdown images are supported.
You can use absolute paths to reference images in the static directory (`static/img/docusaurus.png`):
```md
![Docusaurus logo](/img/docusaurus.png)
```
![Docusaurus logo](/img/docusaurus.png)
You can reference images relative to the current file as well, as shown in [the extra guides](../tutorial-extras/manage-docs-versions.md).
## Code Blocks
Markdown code blocks are supported with Syntax highlighting.
```jsx title="src/components/HelloDocusaurus.js"
function HelloDocusaurus() {
return (
<h1>Hello, Docusaurus!</h1>
)
}
```
```jsx title="src/components/HelloDocusaurus.js"
function HelloDocusaurus() {
return <h1>Hello, Docusaurus!</h1>;
}
```
## Admonitions
Docusaurus has a special syntax to create admonitions and callouts:
:::tip My tip
Use this awesome feature option
:::
:::danger Take care
This action is dangerous
:::
:::tip My tip
Use this awesome feature option
:::
:::danger Take care
This action is dangerous
:::
## MDX and React Components
[MDX](https://mdxjs.com/) can make your documentation more **interactive** and allows using any **React components inside Markdown**:
```jsx
export const Highlight = ({children, color}) => (
<span
style={{
backgroundColor: color,
borderRadius: '20px',
color: '#fff',
padding: '10px',
cursor: 'pointer',
}}
onClick={() => {
alert(`You clicked the color ${color} with label ${children}`)
}}>
{children}
</span>
);
This is <Highlight color="#25c2a0">Docusaurus green</Highlight> !
This is <Highlight color="#1877F2">Facebook blue</Highlight> !
```
export const Highlight = ({children, color}) => (
<span
style={{
backgroundColor: color,
borderRadius: '20px',
color: '#fff',
padding: '10px',
cursor: 'pointer',
}}
onClick={() => {
alert(`You clicked the color ${color} with label ${children}`);
}}>
{children}
</span>
);
This is <Highlight color="#25c2a0">Docusaurus green</Highlight> !
This is <Highlight color="#1877F2">Facebook blue</Highlight> !

View File

@@ -0,0 +1,7 @@
{
"label": "Tutorial - Extras",
"position": 3,
"link": {
"type": "generated-index"
}
}

Binary file not shown. (After: 25 KiB)

Binary file not shown. (After: 27 KiB)

View File

@@ -0,0 +1,55 @@
---
sidebar_position: 1
---
# Manage Docs Versions
Docusaurus can manage multiple versions of your docs.
## Create a docs version
Release a version 1.0 of your project:
```bash
npm run docusaurus docs:version 1.0
```
The `docs` folder is copied into `versioned_docs/version-1.0` and `versions.json` is created.
Your docs now have 2 versions:
- `1.0` at `http://localhost:3000/docs/` for the version 1.0 docs
- `current` at `http://localhost:3000/docs/next/` for the **upcoming, unreleased docs**
## Add a Version Dropdown
To navigate seamlessly across versions, add a version dropdown.
Modify the `docusaurus.config.js` file:
```js title="docusaurus.config.js"
module.exports = {
themeConfig: {
navbar: {
items: [
// highlight-start
{
type: 'docsVersionDropdown',
},
// highlight-end
],
},
},
};
```
The docs version dropdown appears in your navbar:
![Docs Version Dropdown](./img/docsVersionDropdown.png)
## Update an existing version
It is possible to edit versioned docs in their respective folder:
- `versioned_docs/version-1.0/hello.md` updates `http://localhost:3000/docs/hello`
- `docs/hello.md` updates `http://localhost:3000/docs/next/hello`

View File

@@ -0,0 +1,88 @@
---
sidebar_position: 2
---
# Translate your site
Let's translate `docs/intro.md` to French.
## Configure i18n
Modify `docusaurus.config.js` to add support for the `fr` locale:
```js title="docusaurus.config.js"
module.exports = {
i18n: {
defaultLocale: 'en',
locales: ['en', 'fr'],
},
};
```
## Translate a doc
Copy the `docs/intro.md` file to the `i18n/fr` folder:
```bash
mkdir -p i18n/fr/docusaurus-plugin-content-docs/current/
cp docs/intro.md i18n/fr/docusaurus-plugin-content-docs/current/intro.md
```
Translate `i18n/fr/docusaurus-plugin-content-docs/current/intro.md` in French.
## Start your localized site
Start your site on the French locale:
```bash
npm run start -- --locale fr
```
Your localized site is accessible at [http://localhost:3000/fr/](http://localhost:3000/fr/) and the `Getting Started` page is translated.
:::caution
In development, you can only use one locale at a same time.
:::
## Add a Locale Dropdown
To navigate seamlessly across languages, add a locale dropdown.
Modify the `docusaurus.config.js` file:
```js title="docusaurus.config.js"
module.exports = {
themeConfig: {
navbar: {
items: [
// highlight-start
{
type: 'localeDropdown',
},
// highlight-end
],
},
},
};
```
The locale dropdown now appears in your navbar:
![Locale Dropdown](./img/localeDropdown.png)
## Build your localized site
Build your site for a specific locale:
```bash
npm run build -- --locale fr
```
Or build your site to include all the locales at once:
```bash
npm run build
```

View File

@@ -1,103 +1,132 @@
module.exports = {
// @ts-check
// Note: type annotations allow type checking and IDEs autocompletion
const lightCodeTheme = require('prism-react-renderer/themes/github');
const darkCodeTheme = require('prism-react-renderer/themes/dracula');
/** @type {import('@docusaurus/types').Config} */
const config = {
title: 'My Site',
tagline: 'The tagline of my site',
tagline: 'Dinosaurs are cool',
url: 'https://your-docusaurus-test-site.com',
baseUrl: '/',
onBrokenLinks: 'throw',
onBrokenMarkdownLinks: 'warn',
favicon: 'img/favicon.ico',
// GitHub pages deployment config.
// If you aren't using GitHub pages, you don't need these.
organizationName: 'facebook', // Usually your GitHub org/user name.
projectName: 'docusaurus', // Usually your repo name.
themeConfig: {
navbar: {
title: 'My Site',
logo: {
alt: 'My Site Logo',
src: 'img/logo.svg',
},
links: [
{
to: 'docs/doc1',
activeBasePath: 'docs',
label: 'Docs',
position: 'left',
},
{to: 'blog', label: 'Blog', position: 'left'},
{
href: 'https://github.com/facebook/docusaurus',
label: 'GitHub',
position: 'right',
},
],
},
footer: {
style: 'dark',
links: [
{
title: 'Docs',
items: [
{
label: 'Style Guide',
to: 'docs/doc1',
},
{
label: 'Second Doc',
to: 'docs/doc2',
},
],
},
{
title: 'Community',
items: [
{
label: 'Stack Overflow',
href: 'https://stackoverflow.com/questions/tagged/docusaurus',
},
{
label: 'Discord',
href: 'https://discordapp.com/invite/docusaurus',
},
{
label: 'Twitter',
href: 'https://twitter.com/docusaurus',
},
],
},
{
title: 'More',
items: [
{
label: 'Blog',
to: 'blog',
},
{
label: 'GitHub',
href: 'https://github.com/facebook/docusaurus',
},
],
},
],
copyright: `Copyright © ${new Date().getFullYear()} My Project, Inc. Built with Docusaurus.`,
},
// Even if you don't use internalization, you can use this field to set useful
// metadata like html lang. For example, if your site is Chinese, you may want
// to replace "en" with "zh-Hans".
i18n: {
defaultLocale: 'en',
locales: ['en'],
},
presets: [
[
'@docusaurus/preset-classic',
{
'classic',
/** @type {import('@docusaurus/preset-classic').Options} */
({
docs: {
sidebarPath: require.resolve('./sidebars.js'),
// Please change this to your repo.
// Remove this to remove the "edit this page" links.
editUrl:
'https://github.com/facebook/docusaurus/edit/master/website/',
'https://github.com/facebook/docusaurus/tree/main/packages/create-docusaurus/templates/shared/',
},
blog: {
showReadingTime: true,
// Please change this to your repo.
// Remove this to remove the "edit this page" links.
editUrl:
'https://github.com/facebook/docusaurus/edit/master/website/blog/',
'https://github.com/facebook/docusaurus/tree/main/packages/create-docusaurus/templates/shared/',
},
theme: {
customCss: require.resolve('./src/css/custom.css'),
},
},
}),
],
],
themeConfig:
/** @type {import('@docusaurus/preset-classic').ThemeConfig} */
({
navbar: {
title: 'My Site',
logo: {
alt: 'My Site Logo',
src: 'img/logo.svg',
},
items: [
{
type: 'doc',
docId: 'intro',
position: 'left',
label: 'Tutorial',
},
{to: '/blog', label: 'Blog', position: 'left'},
{
href: 'https://github.com/facebook/docusaurus',
label: 'GitHub',
position: 'right',
},
],
},
footer: {
style: 'dark',
links: [
{
title: 'Docs',
items: [
{
label: 'Tutorial',
to: '/docs/intro',
},
],
},
{
title: 'Community',
items: [
{
label: 'Stack Overflow',
href: 'https://stackoverflow.com/questions/tagged/docusaurus',
},
{
label: 'Discord',
href: 'https://discordapp.com/invite/docusaurus',
},
{
label: 'Twitter',
href: 'https://twitter.com/docusaurus',
},
],
},
{
title: 'More',
items: [
{
label: 'Blog',
to: '/blog',
},
{
label: 'GitHub',
href: 'https://github.com/facebook/docusaurus',
},
],
},
],
copyright: `Copyright © ${new Date().getFullYear()} My Project, Inc. Built with Docusaurus.`,
},
prism: {
theme: lightCodeTheme,
darkTheme: darkCodeTheme,
},
}),
};
module.exports = config;

View File

@@ -1,23 +1,31 @@
{
"name": "docusaurus-2",
"version": "0.0.0",
"private": true,
"scripts": {
"docusaurus": "docusaurus",
"start": "docusaurus start",
"build": "docusaurus build",
"swizzle": "docusaurus swizzle",
"deploy": "docusaurus deploy"
"deploy": "docusaurus deploy",
"clear": "docusaurus clear",
"serve": "docusaurus serve",
"write-translations": "docusaurus write-translations",
"write-heading-ids": "docusaurus write-heading-ids"
},
"dependencies": {
"@docusaurus/core": "^2.0.0-alpha.54",
"@docusaurus/preset-classic": "^2.0.0-alpha.54",
"classnames": "^2.2.6",
"react": "^16.8.4",
"react-dom": "^16.8.4"
"@docusaurus/core": "2.0.1",
"@docusaurus/preset-classic": "2.0.1",
"@mdx-js/react": "^1.6.22",
"clsx": "^1.2.1",
"prism-react-renderer": "^1.3.5",
"react": "^17.0.2",
"react-dom": "^17.0.2"
},
"devDependencies": {
"@docusaurus/module-type-aliases": "2.0.1"
},
"browserslist": {
"production": [
">0.2%",
">0.5%",
"not dead",
"not op_mini all"
],
@@ -26,5 +34,8 @@
"last 1 firefox version",
"last 1 safari version"
]
},
"engines": {
"node": ">=16.14"
}
}

View File

@@ -1,6 +1,31 @@
module.exports = {
someSidebar: {
Docusaurus: ['doc1', 'doc2', 'doc3'],
Features: ['mdx'],
},
/**
* Creating a sidebar enables you to:
- create an ordered group of docs
- render a sidebar for each doc of that group
- provide next/previous navigation
The sidebars can be generated from the filesystem, or explicitly defined here.
Create as many sidebars as you want.
*/
// @ts-check
/** @type {import('@docusaurus/plugin-content-docs').SidebarsConfig} */
const sidebars = {
// By default, Docusaurus generates a sidebar from the docs folder structure
tutorialSidebar: [{type: 'autogenerated', dirName: '.'}],
// But you can create a sidebar manually
/*
tutorialSidebar: [
{
type: 'category',
label: 'Tutorial',
items: ['hello'],
},
],
*/
};
module.exports = sidebars;

View File

@@ -0,0 +1,64 @@
import React from 'react';
import clsx from 'clsx';
import styles from './styles.module.css';
const FeatureList = [
{
title: 'Easy to Use',
Svg: require('@site/static/img/undraw_docusaurus_mountain.svg').default,
description: (
<>
Docusaurus was designed from the ground up to be easily installed and
used to get your website up and running quickly.
</>
),
},
{
title: 'Focus on What Matters',
Svg: require('@site/static/img/undraw_docusaurus_tree.svg').default,
description: (
<>
Docusaurus lets you focus on your docs, and we&apos;ll do the chores. Go
ahead and move your docs into the <code>docs</code> directory.
</>
),
},
{
title: 'Powered by React',
Svg: require('@site/static/img/undraw_docusaurus_react.svg').default,
description: (
<>
Extend or customize your website layout by reusing React. Docusaurus can
be extended while reusing the same header and footer.
</>
),
},
];
function Feature({Svg, title, description}) {
return (
<div className={clsx('col col--4')}>
<div className="text--center">
<Svg className={styles.featureSvg} role="img" />
</div>
<div className="text--center padding-horiz--md">
<h3>{title}</h3>
<p>{description}</p>
</div>
</div>
);
}
export default function HomepageFeatures() {
return (
<section className={styles.features}>
<div className="container">
<div className="row">
{FeatureList.map((props, idx) => (
<Feature key={idx} {...props} />
))}
</div>
</div>
</section>
);
}

View File

@@ -0,0 +1,11 @@
.features {
display: flex;
align-items: center;
padding: 2rem 0;
width: 100%;
}
.featureSvg {
height: 200px;
width: 200px;
}

View File

@@ -1,4 +1,3 @@
/* stylelint-disable docusaurus/copyright-header */
/**
* Any CSS included here will be global. The classic template
* bundles Infima by default. Infima is a CSS framework designed to
@@ -7,19 +6,25 @@
/* You can override the default Infima variables here. */
:root {
--ifm-color-primary: #25c2a0;
--ifm-color-primary-dark: rgb(33, 175, 144);
--ifm-color-primary-darker: rgb(31, 165, 136);
--ifm-color-primary-darkest: rgb(26, 136, 112);
--ifm-color-primary-light: rgb(70, 203, 174);
--ifm-color-primary-lighter: rgb(102, 212, 189);
--ifm-color-primary-lightest: rgb(146, 224, 208);
--ifm-color-primary: #2e8555;
--ifm-color-primary-dark: #29784c;
--ifm-color-primary-darker: #277148;
--ifm-color-primary-darkest: #205d3b;
--ifm-color-primary-light: #33925d;
--ifm-color-primary-lighter: #359962;
--ifm-color-primary-lightest: #3cad6e;
--ifm-code-font-size: 95%;
--docusaurus-highlighted-code-line-bg: rgba(0, 0, 0, 0.1);
}
.docusaurus-highlight-code-line {
background-color: rgb(72, 77, 91);
display: block;
margin: 0 calc(-1 * var(--ifm-pre-padding));
padding: 0 var(--ifm-pre-padding);
/* For readability concerns, you should choose a lighter palette in dark mode. */
[data-theme='dark'] {
--ifm-color-primary: #25c2a0;
--ifm-color-primary-dark: #21af90;
--ifm-color-primary-darker: #1fa588;
--ifm-color-primary-darkest: #1a8870;
--ifm-color-primary-light: #29d5b0;
--ifm-color-primary-lighter: #32d8b4;
--ifm-color-primary-lightest: #4fddbf;
--docusaurus-highlighted-code-line-bg: rgba(0, 0, 0, 0.3);
}

View File

@@ -1,97 +1,41 @@
import React from 'react';
import classnames from 'classnames';
import Layout from '@theme/Layout';
import clsx from 'clsx';
import Link from '@docusaurus/Link';
import useDocusaurusContext from '@docusaurus/useDocusaurusContext';
import useBaseUrl from '@docusaurus/useBaseUrl';
import styles from './styles.module.css';
import Layout from '@theme/Layout';
import HomepageFeatures from '@site/src/components/HomepageFeatures';
const features = [
{
title: <>Easy to Use</>,
imageUrl: 'img/undraw_docusaurus_mountain.svg',
description: (
<>
Docusaurus was designed from the ground up to be easily installed and
used to get your website up and running quickly.
</>
),
},
{
title: <>Focus on What Matters</>,
imageUrl: 'img/undraw_docusaurus_tree.svg',
description: (
<>
Docusaurus lets you focus on your docs, and we&apos;ll do the chores. Go
ahead and move your docs into the <code>docs</code> directory.
</>
),
},
{
title: <>Powered by React</>,
imageUrl: 'img/undraw_docusaurus_react.svg',
description: (
<>
Extend or customize your website layout by reusing React. Docusaurus can
be extended while reusing the same header and footer.
</>
),
},
];
import styles from './index.module.css';
function Feature({imageUrl, title, description}) {
const imgUrl = useBaseUrl(imageUrl);
function HomepageHeader() {
const {siteConfig} = useDocusaurusContext();
return (
<div className={classnames('col col--4', styles.feature)}>
{imgUrl && (
<div className="text--center">
<img className={styles.featureImage} src={imgUrl} alt={title} />
<header className={clsx('hero hero--primary', styles.heroBanner)}>
<div className="container">
<h1 className="hero__title">{siteConfig.title}</h1>
<p className="hero__subtitle">{siteConfig.tagline}</p>
<div className={styles.buttons}>
<Link
className="button button--secondary button--lg"
to="/docs/intro">
Docusaurus Tutorial - 5min
</Link>
</div>
)}
<h3>{title}</h3>
<p>{description}</p>
</div>
</div>
</header>
);
}
function Home() {
const context = useDocusaurusContext();
const {siteConfig = {}} = context;
export default function Home() {
const {siteConfig} = useDocusaurusContext();
return (
<Layout
title={`Hello from ${siteConfig.title}`}
description="Description will go into a meta tag in <head />">
<header className={classnames('hero hero--primary', styles.heroBanner)}>
<div className="container">
<h1 className="hero__title">{siteConfig.title}</h1>
<p className="hero__subtitle">{siteConfig.tagline}</p>
<div className={styles.buttons}>
<Link
className={classnames(
'button button--outline button--secondary button--lg',
styles.getStarted,
)}
to={useBaseUrl('docs/doc1')}>
Get Started
</Link>
</div>
</div>
</header>
<HomepageHeader />
<main>
{features && features.length && (
<section className={styles.features}>
<div className="container">
<div className="row">
{features.map((props, idx) => (
<Feature key={idx} {...props} />
))}
</div>
</div>
</section>
)}
<HomepageFeatures />
</main>
</Layout>
);
}
export default Home;

View File

@@ -1,4 +1,3 @@
/* stylelint-disable docusaurus/copyright-header */
/**
* CSS files with the .module.css suffix will be treated as CSS modules
* and scoped locally.
@@ -11,7 +10,7 @@
overflow: hidden;
}
@media screen and (max-width: 966px) {
@media screen and (max-width: 996px) {
.heroBanner {
padding: 2rem;
}
@@ -22,15 +21,3 @@
align-items: center;
justify-content: center;
}
.features {
display: flex;
align-items: center;
padding: 2rem 0;
width: 100%;
}
.featureImage {
height: 200px;
width: 200px;
}

View File

@@ -0,0 +1,7 @@
---
title: Markdown page example
---
# Markdown page example
You don't need React to write simple standalone pages.

View File

Binary file not shown. (After: 5.0 KiB)

Binary file not shown. (Before: 766 B, After: 3.5 KiB)

View File

@@ -1,4 +1,5 @@
<svg xmlns="http://www.w3.org/2000/svg" width="1088" height="687.962" viewBox="0 0 1088 687.962">
<title>Easy to Use</title>
<g id="Group_12" data-name="Group 12" transform="translate(-57 -56)">
<g id="Group_11" data-name="Group 11" transform="translate(57 56)">
<path id="Path_83" data-name="Path 83" d="M1017.81,560.461c-5.27,45.15-16.22,81.4-31.25,110.31-20,38.52-54.21,54.04-84.77,70.28a193.275,193.275,0,0,1-27.46,11.94c-55.61,19.3-117.85,14.18-166.74,3.99a657.282,657.282,0,0,0-104.09-13.16q-14.97-.675-29.97-.67c-15.42.02-293.07,5.29-360.67-131.57-16.69-33.76-28.13-75-32.24-125.27-11.63-142.12,52.29-235.46,134.74-296.47,155.97-115.41,369.76-110.57,523.43,7.88C941.15,276.621,1036.99,396.031,1017.81,560.461Z" transform="translate(-56 -106.019)" fill="#3f3d56"/>

(Before: 31 KiB, After: 31 KiB)

View File

@@ -1,4 +1,5 @@
<svg xmlns="http://www.w3.org/2000/svg" width="1041.277" height="554.141" viewBox="0 0 1041.277 554.141">
<title>Powered by React</title>
<g id="Group_24" data-name="Group 24" transform="translate(-440 -263)">
<g id="Group_23" data-name="Group 23" transform="translate(439.989 262.965)">
<path id="Path_299" data-name="Path 299" d="M1040.82,611.12q-1.74,3.75-3.47,7.4-2.7,5.67-5.33,11.12c-.78,1.61-1.56,3.19-2.32,4.77-8.6,17.57-16.63,33.11-23.45,45.89A73.21,73.21,0,0,1,942.44,719l-151.65,1.65h-1.6l-13,.14-11.12.12-34.1.37h-1.38l-17.36.19h-.53l-107,1.16-95.51,1-11.11.12-69,.75H429l-44.75.48h-.48l-141.5,1.53-42.33.46a87.991,87.991,0,0,1-10.79-.54h0c-1.22-.14-2.44-.3-3.65-.49a87.38,87.38,0,0,1-51.29-27.54C116,678.37,102.75,655,93.85,629.64q-1.93-5.49-3.6-11.12C59.44,514.37,97,380,164.6,290.08q4.25-5.64,8.64-11l.07-.08c20.79-25.52,44.1-46.84,68.93-62,44-26.91,92.75-34.49,140.7-11.9,40.57,19.12,78.45,28.11,115.17,30.55,3.71.24,7.42.42,11.11.53,84.23,2.65,163.17-27.7,255.87-47.29,3.69-.78,7.39-1.55,11.12-2.28,66.13-13.16,139.49-20.1,226.73-5.51a189.089,189.089,0,0,1,26.76,6.4q5.77,1.86,11.12,4c41.64,16.94,64.35,48.24,74,87.46q1.37,5.46,2.37,11.11C1134.3,384.41,1084.19,518.23,1040.82,611.12Z" transform="translate(-79.34 -172.91)" fill="#f2f2f2"/>

(Before: 35 KiB, After: 35 KiB)

File diff suppressed because one or more lines are too long

(Before: 12 KiB, After: 12 KiB)

File diff suppressed because it is too large.

View File

@@ -1,6 +1,6 @@
{
"name": "@vercel/build-utils",
"version": "5.0.6",
"version": "5.2.0",
"license": "MIT",
"main": "./dist/index.js",
"types": "./dist/index.d.js",
@@ -14,8 +14,7 @@
"build": "node build",
"test": "jest --env node --verbose --runInBand --bail",
"test-unit": "yarn test test/unit.*test.*",
"test-integration-once": "yarn test test/integration.test.ts",
"prepublishOnly": "node build"
"test-integration-once": "yarn test test/integration.test.ts"
},
"devDependencies": {
"@iarna/toml": "2.2.3",
@@ -31,7 +30,7 @@
"@types/node-fetch": "^2.1.6",
"@types/semver": "6.0.0",
"@types/yazl": "2.4.2",
"@vercel/ncc": "0.34.0",
"@vercel/ncc": "0.24.0",
"aggregate-error": "3.0.1",
"async-retry": "1.2.3",
"async-sema": "2.1.4",

View File

@@ -27,9 +27,7 @@ async function prepareSymlinkTarget(
}
if (file.type === 'FileRef' || file.type === 'FileBlob') {
const targetPathBufferPromise = await streamToBuffer(
await file.toStreamAsync()
);
const targetPathBufferPromise = streamToBuffer(await file.toStreamAsync());
const [targetPathBuffer] = await Promise.all([
targetPathBufferPromise,
mkdirPromise,
@@ -42,9 +40,15 @@ async function prepareSymlinkTarget(
);
}
async function downloadFile(file: File, fsPath: string): Promise<FileFsRef> {
export async function downloadFile(
file: File,
fsPath: string
): Promise<FileFsRef> {
const { mode } = file;
// If the source is a symlink, try to create it instead of copying the file.
// Note: creating symlinks on Windows requires admin privileges or symlinks
// enabled in the group policy. We may want to improve the error message.
if (isSymbolicLink(mode)) {
const target = await prepareSymlinkTarget(file, fsPath);
@@ -92,12 +96,28 @@ export default async function download(
await removeFile(basePath, name);
return;
}
// If a file didn't change, do not re-download it.
if (Array.isArray(filesChanged) && !filesChanged.includes(name)) {
return;
}
// Some builders resolve symlinks and return both
// a file, node_modules/<symlink>/package.json, and
// node_modules/<symlink>, a symlink.
// Removing the file matches how the yazl lambda zip
// behaves so we can use download() with `vercel build`.
const parts = name.split('/');
for (let i = 1; i < parts.length; i++) {
const dir = parts.slice(0, i).join('/');
const parent = files[dir];
if (parent && isSymbolicLink(parent.mode)) {
console.warn(
`Warning: file "${name}" is within a symlinked directory "${dir}" and will be ignored`
);
return;
}
}
const file = files[name];
const fsPath = path.join(basePath, name);
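
The hunk above skips files whose parent directory entry is itself a symlink, detected by calling `isSymbolicLink()` on the entry's `mode`. A minimal sketch of such a mode check, assuming POSIX file-type bits (this is an illustration, not the build-utils implementation):

```
// Sketch only: compare the file-type bits of a POSIX mode against S_IFLNK.
const S_IFMT = 0o170000; // mask for the file-type bits
const S_IFLNK = 0o120000; // symbolic link

function isSymlinkMode(mode: number): boolean {
  return (mode & S_IFMT) === S_IFLNK;
}

// 41453 === 0o120755 (the `linkdir` mode in the unit test below) -> true
// 33188 === 0o100644 (a regular file)                            -> false
console.log(isSymlinkMode(41453), isSymlinkMode(33188));
```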

View File

@@ -4,7 +4,11 @@ import FileRef from './file-ref';
import { Lambda, createLambda, getLambdaOptionsFromFunction } from './lambda';
import { NodejsLambda } from './nodejs-lambda';
import { Prerender } from './prerender';
import download, { DownloadedFiles, isSymbolicLink } from './fs/download';
import download, {
downloadFile,
DownloadedFiles,
isSymbolicLink,
} from './fs/download';
import getWriteableDirectory from './fs/get-writable-directory';
import glob, { GlobOptions } from './fs/glob';
import rename from './fs/rename';
@@ -46,6 +50,7 @@ export {
createLambda,
Prerender,
download,
downloadFile,
DownloadedFiles,
getWriteableDirectory,
glob,

View File

@@ -22,6 +22,7 @@ export interface LambdaOptionsBase {
allowQuery?: string[];
regions?: string[];
supportsMultiPayloads?: boolean;
supportsWrapper?: boolean;
}
export interface LambdaOptionsWithFiles extends LambdaOptionsBase {
@@ -58,6 +59,7 @@ export class Lambda {
*/
zipBuffer?: Buffer;
supportsMultiPayloads?: boolean;
supportsWrapper?: boolean;
constructor(opts: LambdaOptions) {
const {
@@ -69,6 +71,7 @@ export class Lambda {
allowQuery,
regions,
supportsMultiPayloads,
supportsWrapper,
} = opts;
if ('files' in opts) {
assert(typeof opts.files === 'object', '"files" must be an object');
@@ -103,6 +106,13 @@ export class Lambda {
);
}
if (supportsWrapper !== undefined) {
assert(
typeof supportsWrapper === 'boolean',
'"supportsWrapper" is not a boolean'
);
}
if (regions !== undefined) {
assert(Array.isArray(regions), '"regions" is not an Array');
assert(
@@ -121,6 +131,7 @@ export class Lambda {
this.regions = regions;
this.zipBuffer = 'zipBuffer' in opts ? opts.zipBuffer : undefined;
this.supportsMultiPayloads = supportsMultiPayloads;
this.supportsWrapper = supportsWrapper;
}
async createZip(): Promise<Buffer> {
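
The new `supportsWrapper` option flows through the `Lambda` constructor exactly like the existing `supportsMultiPayloads` flag. A hedged sketch of how a custom runtime might set it, using the `Lambda` and `glob` exports shown in the index diff above (the handler name, runtime identifier, and output directory are illustrative, not taken from this changeset):

```
import { Lambda, glob } from '@vercel/build-utils';

// Illustrative only: a custom runtime marking its Lambda output as
// wrapper-capable via the flag added in this change.
async function buildOutput(outputDir: string): Promise<Lambda> {
  return new Lambda({
    files: await glob('**', outputDir), // hypothetical build output directory
    handler: 'bootstrap',               // hypothetical handler file name
    runtime: 'provided.al2',            // hypothetical runtime identifier
    environment: {},
    supportsWrapper: true,
  });
}
```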

View File

@@ -170,6 +170,53 @@ it('should create zip files with symlinks properly', async () => {
assert(aStat.isFile());
});
it('should download symlinks even with incorrect file', async () => {
if (process.platform === 'win32') {
console.log('Skipping test on windows');
return;
}
const files = {
'dir/file.txt': new FileBlob({
mode: 33188,
contentType: undefined,
data: 'file text',
}),
linkdir: new FileBlob({
mode: 41453,
contentType: undefined,
data: 'dir',
}),
'linkdir/file.txt': new FileBlob({
mode: 33188,
contentType: undefined,
data: 'this file should be discarded',
}),
};
const outDir = path.join(__dirname, 'symlinks-out');
await fs.remove(outDir);
await fs.mkdirp(outDir);
await download(files, outDir);
const [dir, file, linkdir] = await Promise.all([
fs.lstat(path.join(outDir, 'dir')),
fs.lstat(path.join(outDir, 'dir/file.txt')),
fs.lstat(path.join(outDir, 'linkdir')),
]);
expect(dir.isFile()).toBe(false);
expect(dir.isSymbolicLink()).toBe(false);
expect(file.isFile()).toBe(true);
expect(file.isSymbolicLink()).toBe(false);
expect(linkdir.isSymbolicLink()).toBe(true);
expect(warningMessages).toEqual([
'Warning: file "linkdir/file.txt" is within a symlinked directory "linkdir" and will be ignored',
]);
});
it('should only match supported node versions, otherwise throw an error', async () => {
expect(await getSupportedNodeVersion('12.x', false)).toHaveProperty(
'major',

View File

@@ -1,6 +1,6 @@
{
"name": "vercel",
"version": "27.3.1",
"version": "27.3.7",
"preferGlobal": true,
"license": "Apache-2.0",
"description": "The command-line interface for Vercel",
@@ -16,7 +16,6 @@
"test-unit": "yarn test test/unit/",
"test-integration-cli": "rimraf test/fixtures/integration && ava test/integration.js --serial --fail-fast --verbose",
"test-integration-dev": "yarn test test/dev/",
"prepublishOnly": "yarn build",
"coverage": "codecov",
"build": "ts-node ./scripts/build.ts",
"dev": "ts-node ./src/index.ts"
@@ -42,16 +41,16 @@
"node": ">= 14"
},
"dependencies": {
"@vercel/build-utils": "5.0.6",
"@vercel/go": "2.0.10",
"@vercel/hydrogen": "0.0.7",
"@vercel/next": "3.1.10",
"@vercel/node": "2.5.1",
"@vercel/python": "3.1.2",
"@vercel/redwood": "1.0.11",
"@vercel/remix": "1.0.12",
"@vercel/ruby": "1.3.18",
"@vercel/static-build": "1.0.11",
"@vercel/build-utils": "5.2.0",
"@vercel/go": "2.0.15",
"@vercel/hydrogen": "0.0.12",
"@vercel/next": "3.1.15",
"@vercel/node": "2.5.6",
"@vercel/python": "3.1.7",
"@vercel/redwood": "1.0.16",
"@vercel/remix": "1.0.17",
"@vercel/ruby": "1.3.23",
"@vercel/static-build": "1.0.16",
"update-notifier": "5.1.0"
},
"devDependencies": {
@@ -97,11 +96,11 @@
"@types/which": "1.3.2",
"@types/write-json-file": "2.2.1",
"@types/yauzl-promise": "2.1.0",
"@vercel/client": "12.1.5",
"@vercel/frameworks": "1.1.1",
"@vercel/fs-detectors": "2.0.2",
"@vercel/client": "12.1.10",
"@vercel/frameworks": "1.1.3",
"@vercel/fs-detectors": "2.0.5",
"@vercel/fun": "1.0.4",
"@vercel/ncc": "0.34.0",
"@vercel/ncc": "0.24.0",
"@zeit/source-map-support": "0.6.2",
"ajv": "6.12.2",
"alpha-sort": "2.0.1",

View File

@@ -469,6 +469,8 @@ async function doBuild(
)
);
} catch (err: any) {
output.prettyError(err);
const writeConfigJsonPromise = fs.writeJSON(
join(outputDir, 'config.json'),
{ version: 3 },

View File

@@ -130,6 +130,12 @@ export default async function pull(
await outputFile(fullPath, contents, 'utf8');
if (deltaString) {
output.print('\n' + deltaString);
} else if (oldEnv && exists) {
output.log('No changes found.');
}
output.print(
`${prependEmoji(
`${exists ? 'Updated' : 'Created'} ${chalk.bold(
@@ -139,13 +145,6 @@ export default async function pull(
)}\n`
);
output.print('\n');
if (deltaString) {
output.print(deltaString);
} else if (oldEnv && exists) {
output.log('No changes found.');
}
return 0;
}

View File

@@ -15,6 +15,7 @@ import { Build } from '../types';
import title from 'title';
import { isErrnoException } from '../util/is-error';
import { isAPIError } from '../util/errors-ts';
import { URL } from 'url';
const help = () => {
console.log(`
@@ -66,7 +67,7 @@ export default async function main(client: Client) {
const { print, log, error } = client.output;
// extract the first parameter
const [, deploymentIdOrHost] = argv._;
let [, deploymentIdOrHost] = argv._;
if (argv._.length !== 2) {
error(`${getCommandName('inspect <url>')} expects exactly one argument`);
@@ -90,12 +91,16 @@ export default async function main(client: Client) {
throw err;
}
// resolve the deployment, since we might have been given an alias
const depFetchStart = Date.now();
try {
deploymentIdOrHost = new URL(deploymentIdOrHost).hostname;
} catch {}
client.output.spinner(
`Fetching deployment "${deploymentIdOrHost}" in ${chalk.bold(contextName)}`
);
// resolve the deployment, since we might have been given an alias
try {
deployment = await getDeployment(client, deploymentIdOrHost);
} catch (err: unknown) {
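
The behavior added above boils down to: if the argument parses as a URL, only its hostname is used for the deployment lookup; otherwise the argument passes through untouched. A small standalone illustration (not the CLI source):

```
// Standalone illustration of the scheme-stripping logic above.
function normalizeDeploymentArg(arg: string): string {
  try {
    return new URL(arg).hostname;
  } catch {
    return arg; // bare deployment id or host without a scheme
  }
}

// normalizeDeploymentArg('http://my-app.vercel.app') === 'my-app.vercel.app'
// normalizeDeploymentArg('my-app.vercel.app')        === 'my-app.vercel.app'
```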

View File

@@ -226,7 +226,8 @@ async function run({ output, contextName, currentTeam, client }) {
if (theSecret) {
const yes =
argv.yes || (await readConfirmation(output, theSecret, contextName));
argv.yes ||
(await readConfirmation(client, output, theSecret, contextName));
if (!yes) {
output.print(`Aborted. Secret not deleted.\n`);
return 0;
@@ -353,7 +354,7 @@ async function run({ output, contextName, currentTeam, client }) {
return 2;
}
async function readConfirmation(output, secret, contextName) {
async function readConfirmation(client, output, secret, contextName) {
const time = chalk.gray(`${ms(new Date() - new Date(secret.created))} ago`);
const tbl = table([[chalk.bold(secret.name), time]], {
align: ['r', 'l'],
@@ -367,5 +368,5 @@ async function readConfirmation(output, secret, contextName) {
);
output.print(` ${tbl}\n`);
return confirm(`${chalk.bold.red('Are you sure?')}`, false);
return confirm(client, `${chalk.bold.red('Are you sure?')}`, false);
}

View File

@@ -5,7 +5,7 @@ try {
// Test to see if cwd has been deleted before
// importing 3rd party packages that might need cwd.
process.cwd();
} catch (err) {
} catch (err: unknown) {
if (isError(err) && err.message.includes('uv_cwd')) {
console.error('Error! The current working directory does not exist.');
process.exit(1);
@@ -40,8 +40,8 @@ import getConfig from './util/get-config';
import * as configFiles from './util/config/files';
import getGlobalPathConfig from './util/config/global-path';
import {
getDefaultConfig,
getDefaultAuthConfig,
defaultAuthConfig,
defaultGlobalConfig,
} from './util/config/get-default';
import * as ERRORS from './util/errors-ts';
import { APIError } from './util/errors-ts';
@@ -50,7 +50,7 @@ import getUpdateCommand from './util/get-update-command';
import { metrics, shouldCollectMetrics } from './util/metrics';
import { getCommandName, getTitleName } from './util/pkg-name';
import doLoginPrompt from './util/login/prompt';
import { GlobalConfig } from './types';
import { AuthConfig, GlobalConfig } from './types';
import { VercelConfig } from '@vercel/client';
const isCanary = pkg.version.includes('canary');
@@ -208,160 +208,59 @@ const main = async () => {
VERCEL_DIR
)}" ${errorToString(err)}`
);
}
let migrated = false;
let configExists;
try {
configExists = existsSync(VERCEL_CONFIG_PATH);
} catch (err: unknown) {
console.error(
error(
`${
'An unexpected error occurred while trying to find the ' +
`config file "${hp(VERCEL_CONFIG_PATH)}" `
}${errorToString(err)}`
)
);
return 0;
}
let config: GlobalConfig | null = null;
if (configExists) {
try {
config = configFiles.readConfigFile();
} catch (err) {
console.error(
error(
`${
'An unexpected error occurred while trying to read the ' +
`config in "${hp(VERCEL_CONFIG_PATH)}" `
}${errorToString(err)}`
)
);
return 1;
}
// This is from when Vercel CLI supported
// multiple providers. In that case, we really
// need to migrate.
if (
// @ts-ignore
config.sh ||
// @ts-ignore
config.user ||
// @ts-ignore
typeof config.user === 'object' ||
typeof config.currentTeam === 'object'
) {
configExists = false;
}
}
if (!configExists) {
const results = await getDefaultConfig(config);
config = results.config;
migrated = results.migrated;
try {
configFiles.writeToConfigFile(config);
} catch (err: unknown) {
console.error(
error(
`${
'An unexpected error occurred while trying to write the ' +
`default config to "${hp(VERCEL_CONFIG_PATH)}" `
}${errorToString(err)}`
)
);
return 1;
}
}
let authConfigExists;
try {
authConfigExists = existsSync(VERCEL_AUTH_CONFIG_PATH);
} catch (err: unknown) {
console.error(
error(
`${
'An unexpected error occurred while trying to find the ' +
`auth file "${hp(VERCEL_AUTH_CONFIG_PATH)}" `
}${errorToString(err)}`
)
);
return 1;
}
let authConfig = null;
const subcommandsWithoutToken = [
'login',
'logout',
'help',
'init',
'update',
'build',
];
if (authConfigExists) {
try {
authConfig = configFiles.readAuthConfigFile();
} catch (err: unknown) {
console.error(
error(
`${
'An unexpected error occurred while trying to read the ' +
`auth config in "${hp(VERCEL_AUTH_CONFIG_PATH)}" `
}${errorToString(err)}`
)
);
return 1;
}
// This is from when Vercel CLI supported
// multiple providers. In that case, we really
// need to migrate.
// @ts-ignore
if (authConfig.credentials) {
authConfigExists = false;
}
} else {
const results = await getDefaultAuthConfig(authConfig);
authConfig = results.config;
migrated = results.migrated;
try {
configFiles.writeToAuthConfigFile(authConfig);
} catch (err: unknown) {
console.error(
error(
`${
'An unexpected error occurred while trying to write the ' +
`default config to "${hp(VERCEL_AUTH_CONFIG_PATH)}" `
}${errorToString(err)}`
)
let config: GlobalConfig;
try {
config = configFiles.readConfigFile();
} catch (err: unknown) {
if (isErrnoException(err) && err.code === 'ENOENT') {
config = defaultGlobalConfig;
try {
configFiles.writeToConfigFile(config);
} catch (err: unknown) {
output.error(
`An unexpected error occurred while trying to save the config to "${hp(
VERCEL_CONFIG_PATH
)}" ${errorToString(err)}`
);
return 1;
}
} else {
output.error(
`An unexpected error occurred while trying to read the config in "${hp(
VERCEL_CONFIG_PATH
)}" ${errorToString(err)}`
);
return 1;
}
}
// Let the user know we migrated the config
if (migrated) {
const directory = param(hp(VERCEL_DIR));
debug(
`The credentials and configuration within the ${directory} directory were upgraded`
);
let authConfig: AuthConfig;
try {
authConfig = configFiles.readAuthConfigFile();
} catch (err: unknown) {
if (isErrnoException(err) && err.code === 'ENOENT') {
authConfig = defaultAuthConfig;
try {
configFiles.writeToAuthConfigFile(authConfig);
} catch (err: unknown) {
output.error(
`An unexpected error occurred while trying to write the auth config to "${hp(
VERCEL_AUTH_CONFIG_PATH
)}" ${errorToString(err)}`
);
return 1;
}
} else {
output.error(
`An unexpected error occurred while trying to read the auth config in "${hp(
VERCEL_AUTH_CONFIG_PATH
)}" ${errorToString(err)}`
);
return 1;
}
}
if (typeof argv['--api'] === 'string') {
@@ -371,18 +270,12 @@ const main = async () => {
}
try {
// eslint-disable-next-line no-new
new URL(apiUrl);
} catch (err) {
} catch (err: unknown) {
output.error(`Please provide a valid URL instead of ${highlight(apiUrl)}.`);
return 1;
}
if (!config) {
output.error(`Vercel global config was not loaded.`);
return 1;
}
// Shared API `Client` instance for all sub-commands to utilize
client = new Client({
apiUrl,
@@ -430,6 +323,15 @@ const main = async () => {
client.argv.push('-h');
}
const subcommandsWithoutToken = [
'login',
'logout',
'help',
'init',
'update',
'build',
];
// Prompt for login if there is no current token
if (
(!authConfig || !authConfig.token) &&

View File

@@ -20,13 +20,15 @@ export interface JSONObject {
}
export interface AuthConfig {
_?: string;
'// Note'?: string;
'// Docs'?: string;
token?: string;
skipWrite?: boolean;
}
export interface GlobalConfig {
_?: string;
'// Note'?: string;
'// Docs'?: string;
currentTeam?: string;
includeScheme?: string;
collectMetrics?: boolean;
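
With the `'// Note'` and `'// Docs'` keys added to both interfaces, the on-disk config files can carry human-readable comments while remaining valid JSON. A hedged example of an object satisfying the updated `GlobalConfig` shape (values mirror the `defaultGlobalConfig` introduced further down; the import path is illustrative):

```
import { GlobalConfig } from './types'; // illustrative import path

export const config: GlobalConfig = {
  '// Note':
    'This is your Vercel config file. For more information see the global configuration documentation.',
  '// Docs':
    'https://vercel.com/docs/project-configuration#global-configuration/config-json',
  collectMetrics: true,
};
```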

View File

@@ -81,7 +81,7 @@ export async function resolveBuilders(
continue;
}
if (name === '@vercel/static') {
if (name === '@vercel/static' || name === '@now/static') {
// `@vercel/static` is a special-case built-in builder
builders.set(name, {
builder: staticBuilder,

View File

@@ -21,6 +21,7 @@ import {
PackageJson,
Prerender,
download,
downloadFile,
EdgeFunction,
BuildResultBuildOutput,
getLambdaOptionsFromFunction,
@@ -266,9 +267,7 @@ async function writeStaticFile(
const dest = join(outputDir, 'static', fsPath);
await fs.mkdirp(dirname(dest));
// TODO: handle (or skip) symlinks?
const stream = file.toStream();
await pipe(stream, fs.createWriteStream(dest, { mode: file.mode }));
await downloadFile(file, dest);
}
/**

View File

@@ -1,75 +1,15 @@
import { AuthConfig, GlobalConfig } from '../../types';
export const getDefaultConfig = async (existingCopy?: GlobalConfig | null) => {
let migrated = false;
const config: GlobalConfig = {
_: 'This is your Vercel config file. For more information see the global configuration documentation: https://vercel.com/docs/configuration#global',
collectMetrics: true,
};
if (existingCopy) {
const keep = [
'_',
'currentTeam',
'desktop',
'updateChannel',
'collectMetrics',
'api',
// This is deleted later in the code
];
try {
const existing = Object.assign({}, existingCopy);
// @ts-ignore
const sh = Object.assign({}, existing.sh || {});
Object.assign(config, existing, sh);
for (const key of Object.keys(config)) {
if (!keep.includes(key)) {
// @ts-ignore
delete config[key];
}
}
if (typeof config.currentTeam === 'object') {
// @ts-ignore
config.currentTeam = config.currentTeam.id;
}
// @ts-ignore
if (typeof config.user === 'object') {
// @ts-ignore
config.user = config.user.uid || config.user.id;
}
migrated = true;
} catch (err) {}
}
return { config, migrated };
export const defaultGlobalConfig: GlobalConfig = {
'// Note':
'This is your Vercel config file. For more information see the global configuration documentation.',
'// Docs':
'https://vercel.com/docs/project-configuration#global-configuration/config-json',
collectMetrics: true,
};
export const getDefaultAuthConfig = async (existing?: AuthConfig | null) => {
let migrated = false;
const config: AuthConfig = {
_: 'This is your Vercel credentials file. DO NOT SHARE! More: https://vercel.com/docs/configuration#global',
};
if (existing) {
try {
// @ts-ignore
const sh = existing.credentials.find(item => item.provider === 'sh');
if (sh) {
config.token = sh.token;
}
migrated = true;
} catch (err) {}
}
return { config, migrated };
export const defaultAuthConfig: AuthConfig = {
'// Note': 'This is your Vercel credentials file. DO NOT SHARE!',
'// Docs':
'https://vercel.com/docs/project-configuration#global-configuration/auth-json',
};

View File

@@ -1556,6 +1556,8 @@ export default class DevServer {
(err as any).link = 'https://vercel.link/command-not-found';
}
this.output.prettyError(err);
await this.sendError(
req,
res,

View File

@@ -63,18 +63,25 @@ export function buildDeltaString(
const { added, changed, removed } = findChanges(oldEnv, newEnv);
let deltaString = '';
deltaString += chalk.green(addDeltaSection('+', changed, true));
deltaString += chalk.green(addDeltaSection('+', added));
deltaString += chalk.yellow(addDeltaSection('~', changed));
deltaString += chalk.red(addDeltaSection('-', removed));
return deltaString ? chalk.gray('Changes:\n') + deltaString : deltaString;
return deltaString
? chalk.gray('Changes:\n') + deltaString + '\n'
: deltaString;
}
function addDeltaSection(prefix: string, arr: string[]): string {
function addDeltaSection(
prefix: string,
arr: string[],
changed: boolean = false
): string {
if (arr.length === 0) return '';
return (
arr
.sort()
.map(item => `${prefix} ${item}`)
.map(item => `${prefix} ${item}${changed ? ' (Updated)' : ''}`)
.join('\n') + '\n'
);
}
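
With this change, keys whose values changed are rendered in the green `+` section with an `(Updated)` suffix instead of the previous yellow `~` section, and the delta gains a trailing newline. A standalone sketch of the per-section formatting (chalk colors and the leading `Changes:` header omitted), producing the exact fragment the updated unit test below asserts:

```
// Mirrors the ordering above: changed values, then added keys, then removed.
const section = (prefix: string, keys: string[], updated = false): string =>
  keys.length === 0
    ? ''
    : keys
        .sort()
        .map(key => `${prefix} ${key}${updated ? ' (Updated)' : ''}`)
        .join('\n') + '\n';

const delta =
  section('+', ['SPECIAL_FLAG'], true) + // changed values
  section('+', ['NEW_VAR']) +            // added keys
  section('-', ['TEST']);                // removed keys

console.log(JSON.stringify(delta));
// "+ SPECIAL_FLAG (Updated)\n+ NEW_VAR\n- TEST\n"
```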

View File

@@ -1,4 +1,4 @@
import decamelize from 'decamelize';
import { snakeCase } from 'snake-case';
import { upper } from '../lib/upper';
export const config = {
@@ -14,8 +14,8 @@ export default async function edge(request, event) {
url: request.url,
method: request.method,
body: requestBody,
decamelized: decamelize('someCamelCaseThing'),
uppercase: upper('someThing'),
snakeCase: snakeCase('someCamelCaseThing'),
upperCase: upper('someThing'),
optionalChaining: request?.doesnotexist ?? 'fallback',
ENV_VAR_IN_EDGE: process.env.ENV_VAR_IN_EDGE,
})

View File

@@ -1,6 +1,6 @@
{
"private": true,
"dependencies": {
"decamelize": "6.0.0"
"snake-case": "3.0.4"
}
}

View File

@@ -2,7 +2,38 @@
# yarn lockfile v1
decamelize@6.0.0:
version "6.0.0"
resolved "https://registry.yarnpkg.com/decamelize/-/decamelize-6.0.0.tgz#8cad4d916fde5c41a264a43d0ecc56fe3d31749e"
integrity sha512-Fv96DCsdOgB6mdGl67MT5JaTNKRzrzill5OH5s8bjYJXVlcXyPYGyPsUkWyGV5p1TXI5esYIYMMeDJL0hEIwaA==
dot-case@^3.0.4:
version "3.0.4"
resolved "https://registry.yarnpkg.com/dot-case/-/dot-case-3.0.4.tgz#9b2b670d00a431667a8a75ba29cd1b98809ce751"
integrity sha512-Kv5nKlh6yRrdrGvxeJ2e5y2eRUpkUosIW4A2AS38zwSz27zu7ufDwQPi5Jhs3XAlGNetl3bmnGhQsMtkKJnj3w==
dependencies:
no-case "^3.0.4"
tslib "^2.0.3"
lower-case@^2.0.2:
version "2.0.2"
resolved "https://registry.yarnpkg.com/lower-case/-/lower-case-2.0.2.tgz#6fa237c63dbdc4a82ca0fd882e4722dc5e634e28"
integrity sha512-7fm3l3NAF9WfN6W3JOmf5drwpVqX78JtoGJ3A6W0a6ZnldM41w2fV5D490psKFTpMds8TJse/eHLFFsNHHjHgg==
dependencies:
tslib "^2.0.3"
no-case@^3.0.4:
version "3.0.4"
resolved "https://registry.yarnpkg.com/no-case/-/no-case-3.0.4.tgz#d361fd5c9800f558551a8369fc0dcd4662b6124d"
integrity sha512-fgAN3jGAh+RoxUGZHTSOLJIqUc2wmoBwGR4tbpNAKmmovFoWq0OdRkb0VkldReO2a2iBT/OEulG9XSUc10r3zg==
dependencies:
lower-case "^2.0.2"
tslib "^2.0.3"
snake-case@3.0.4:
version "3.0.4"
resolved "https://registry.yarnpkg.com/snake-case/-/snake-case-3.0.4.tgz#4f2bbd568e9935abdfd593f34c691dadb49c452c"
integrity sha512-LAOh4z89bGQvl9pFfNF8V146i7o7/CqFPbqzYgP+yYzDIDeS9HaNFtXABamRW+AQzEVODcvE79ljJ+8a9YSdMg==
dependencies:
dot-case "^3.0.4"
tslib "^2.0.3"
tslib@^2.0.3:
version "2.4.0"
resolved "https://registry.yarnpkg.com/tslib/-/tslib-2.4.0.tgz#7cecaa7f073ce680a05847aa77be941098f36dc3"
integrity sha512-d6xOpEDfsi2CZVlPQzGeux8XMwLT9hssAsaPYExaQMuYskwb+x1x7J371tWlbBdWHroy99KnVB6qIkUbs5X3UQ==

View File

@@ -5,6 +5,6 @@ import (
"net/http"
)
func Another(w http.ResponseWriter, r *http.Request) {
func HandlerAnother(w http.ResponseWriter, r *http.Request) {
fmt.Fprintf(w, "This is another page")
}

View File

@@ -0,0 +1,7 @@
export const config = {
matcher: 'not-a-valid-matcher',
};
export default function middleware(request, _event) {
return new Response(null);
}
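
This fixture exists to trigger the matcher validation error asserted in the `vercel dev` test below. For contrast, a hedged sketch of a config that should pass that validation, assuming matcher entries must be path patterns starting with `/` (the route shown is illustrative):

```
// Hypothetical valid counterpart to the fixture above.
export const config = {
  matcher: '/api/:path*', // path pattern starting with "/"
};

export default function middleware(_request: Request, _event: unknown) {
  return new Response(null);
}
```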

View File

@@ -43,8 +43,8 @@ test('[vercel dev] should support edge functions', async () => {
url: `http://localhost:${port}/api/edge-success`,
method: 'POST',
body: '{"hello":"world"}',
decamelized: 'some_camel_case_thing',
uppercase: 'SOMETHING',
snakeCase: 'some_camel_case_thing',
upperCase: 'SOMETHING',
optionalChaining: 'fallback',
ENV_VAR_IN_EDGE: '1',
});
@@ -227,7 +227,7 @@ test('[vercel dev] should handle syntax errors thrown in edge functions', async
expect(await res.text()).toMatch(
/<strong>500<\/strong>: INTERNAL_SERVER_ERROR/g
);
expect(stderr).toMatch(/Failed to instantiate edge runtime./g);
expect(stderr).toMatch(/Failed to compile user code for edge runtime./g);
expect(stderr).toMatch(/Unexpected end of file/g);
expect(stderr).toMatch(
/Failed to complete request to \/api\/edge-error-syntax: Error: socket hang up/g
@@ -307,6 +307,35 @@ test('[vercel dev] should handle missing handler errors thrown in edge functions
}
});
test('[vercel dev] should handle invalid middleware config', async () => {
const dir = fixture('middleware-matchers-invalid');
const { dev, port, readyResolver } = await testFixture(dir);
try {
await readyResolver;
let res = await fetch(`http://localhost:${port}/api/whatever`, {
method: 'GET',
headers: {
Accept:
'text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8',
},
});
validateResponseHeaders(res);
const { stderr } = await dev.kill('SIGTERM');
expect(await res.text()).toMatch(
/<strong>500<\/strong>: INTERNAL_SERVER_ERROR/g
);
expect(stderr).toMatch(
/Middleware's `config.matcher` .+ Received: not-a-valid-matcher/g
);
} finally {
await dev.kill('SIGTERM');
}
});
test('[vercel dev] should support request body', async () => {
const dir = fixture('node-request-body');
const { dev, port, readyResolver } = await testFixture(dir);

View File

@@ -347,6 +347,12 @@ function testFixtureStdio(
? ['--scope', process.env.VERCEL_TEAM_ID]
: []),
'deploy',
...(process.env.VERCEL_CLI_VERSION
? [
'--build-env',
`VERCEL_CLI_VERSION=${process.env.VERCEL_CLI_VERSION}`,
]
: []),
'--public',
'--debug',
],
@@ -427,7 +433,7 @@ function testFixtureStdio(
);
}
if (stderr.includes('Command failed') || stderr.includes('Error!')) {
if (stderr.includes('Command failed')) {
dev.kill('SIGTERM');
throw new Error(`Failed for "${directory}" with stderr "${stderr}".`);
}

View File

@@ -0,0 +1,7 @@
{
"orgId": ".",
"projectId": ".",
"settings": {
"framework": null
}
}

View File

@@ -0,0 +1 @@
this file should be ignored

View File

@@ -0,0 +1,8 @@
{
"builds": [
{
"src": "www/**",
"use": "@now/static"
}
]
}

View File

@@ -0,0 +1 @@
<h1>Now</h1>

View File

@@ -0,0 +1,7 @@
{
"orgId": ".",
"projectId": ".",
"settings": {
"framework": null
}
}

View File

@@ -0,0 +1 @@
<h1>Vercel</h1>

View File

@@ -0,0 +1 @@
SPECIAL_FLAG=1

View File

@@ -0,0 +1 @@
!.vercel

View File

@@ -0,0 +1,4 @@
{
"orgId": "team_dummy",
"projectId": "env-pull-delta-no-changes"
}

View File

@@ -342,7 +342,7 @@ module.exports = async function prepare(session, binaryPath, tmpFixturesDir) {
},
'lambda-with-php-runtime': {
'api/test.php': `<?php echo 'Hello from PHP'; ?>`,
'now.json': JSON.stringify({
'vercel.json': JSON.stringify({
functions: {
'api/**/*.php': {
runtime: 'vercel-php@0.1.0',
@@ -352,7 +352,7 @@ module.exports = async function prepare(session, binaryPath, tmpFixturesDir) {
},
'lambda-with-invalid-runtime': {
'api/test.php': `<?php echo 'Hello from PHP'; ?>`,
'now.json': JSON.stringify({
'vercel.json': JSON.stringify({
functions: {
'api/**/*.php': {
memory: 128,

View File

@@ -233,14 +233,6 @@ const waitForPrompt = (cp, assertion) =>
cp.stderr.on('data', listener);
});
const getDeploymentBuildsByUrl = async url => {
const hostRes = await apiFetch(`/v10/now/deployments/get?url=${url}`);
const { id } = await hostRes.json();
const buildsRes = await apiFetch(`/v10/now/deployments/${id}/builds`);
const { builds } = await buildsRes.json();
return builds;
};
const createUser = async () => {
await retry(
async () => {
@@ -2907,11 +2899,10 @@ test('deploy a Lambda with a specific runtime', async t => {
t.is(output.exitCode, 0, formatOutput(output));
const { host: url } = new URL(output.stdout);
const builds = await getDeploymentBuildsByUrl(url);
const build = builds.find(b => b.use && b.use.includes('php')) || builds[0];
t.is(build.use, 'vercel-php@0.1.0', JSON.stringify(build, null, 2));
const url = new URL(output.stdout);
const res = await fetch(`${url}/api/test`);
const text = await res.text();
t.is(text, 'Hello from PHP');
});
test('fail to deploy a Lambda with a specific runtime but without a locked version', async t => {

View File

@@ -159,9 +159,24 @@ export function useProject(project: Partial<Project> = defaultProject) {
res.json({ envs });
});
client.scenario.post(`/v8/projects/${project.id}/env`, (req, res) => {
envs.push(req.body);
const envObj = req.body;
envObj.id = envObj.key;
envs.push(envObj);
res.json({ envs });
});
client.scenario.delete(
`/v8/projects/${project.id}/env/:envId`,
(req, res) => {
const envId = req.params.envId;
for (const [i, env] of envs.entries()) {
if (env.key === envId) {
envs.splice(i, 1);
break;
}
}
res.json(envs);
}
);
client.scenario.post(`/v4/projects/${project.id}/link`, (req, res) => {
const { type, repo, org } = req.body;
if (

View File

@@ -47,6 +47,37 @@ describe('build', () => {
}
});
it('should build with `@now/static`', async () => {
const cwd = fixture('now-static');
const output = join(cwd, '.vercel/output');
try {
process.chdir(cwd);
const exitCode = await build(client);
expect(exitCode).toEqual(0);
const builds = await fs.readJSON(join(output, 'builds.json'));
expect(builds).toMatchObject({
target: 'preview',
builds: [
{
require: '@now/static',
apiVersion: 2,
src: 'www/index.html',
use: '@now/static',
},
],
});
const files = await fs.readdir(join(output, 'static'));
expect(files).toEqual(['www']);
const www = await fs.readdir(join(output, 'static', 'www'));
expect(www).toEqual(['index.html']);
} finally {
process.chdir(originalCwd);
delete process.env.__VERCEL_BUILD_RUNNING;
}
});
it('should build with `@vercel/node`', async () => {
const cwd = fixture('node');
const output = join(cwd, '.vercel/output');
@@ -112,6 +143,54 @@ describe('build', () => {
}
});
it('should handle symlinked static files', async () => {
const cwd = fixture('static-symlink');
const output = join(cwd, '.vercel/output');
// try to create the symlink, if it fails (e.g. Windows), skip the test
try {
await fs.unlink(join(cwd, 'foo.html'));
await fs.symlink(join(cwd, 'index.html'), join(cwd, 'foo.html'));
} catch (e) {
console.log('Symlinks not available, skipping test');
return;
}
try {
process.chdir(cwd);
const exitCode = await build(client);
expect(exitCode).toEqual(0);
// `builds.json` says that "@vercel/static" was run
const builds = await fs.readJSON(join(output, 'builds.json'));
expect(builds).toMatchObject({
target: 'preview',
builds: [
{
require: '@vercel/static',
apiVersion: 2,
src: '**',
use: '@vercel/static',
},
],
});
// "static" directory contains static files
const files = await fs.readdir(join(output, 'static'));
expect(files.sort()).toEqual(['foo.html', 'index.html']);
expect(
(await fs.lstat(join(output, 'static', 'foo.html'))).isSymbolicLink()
).toEqual(true);
expect(
(await fs.lstat(join(output, 'static', 'index.html'))).isSymbolicLink()
).toEqual(false);
} finally {
await fs.unlink(join(cwd, 'foo.html'));
process.chdir(originalCwd);
delete process.env.__VERCEL_BUILD_RUNNING;
}
});
it('should normalize "src" path in `vercel.json`', async () => {
const cwd = fixture('normalize-src');
const output = join(cwd, '.vercel/output');
@@ -630,6 +709,11 @@ describe('build', () => {
const exitCode = await build(client);
expect(exitCode).toEqual(1);
// Error gets printed to the terminal
await expect(client.stderr).toOutput(
'Error! Function must contain at least one property.'
);
// `builds.json` contains top-level "error" property
const builds = await fs.readJSON(join(output, 'builds.json'));
expect(builds.builds).toBeUndefined();
@@ -656,6 +740,9 @@ describe('build', () => {
const exitCode = await build(client);
expect(exitCode).toEqual(1);
// Error gets printed to the terminal
await expect(client.stderr).toOutput("Duplicate identifier 'res'.");
// `builds.json` contains "error" build
const builds = await fs.readJSON(join(output, 'builds.json'));
expect(builds.builds).toHaveLength(4);
@@ -815,7 +902,6 @@ describe('build', () => {
output = join(cwd, '.vercel/output');
process.chdir(cwd);
client.stderr.pipe(process.stderr);
const exitCode = await build(client);
expect(exitCode).toEqual(0);

View File

@@ -148,41 +148,46 @@ describe('env', () => {
it('should show a delta string', async () => {
const cwd = setupFixture('vercel-env-pull-delta');
useUser();
useTeams('team_dummy');
useProject({
...defaultProject,
id: 'env-pull-delta',
name: 'env-pull-delta',
});
try {
useUser();
useTeams('team_dummy');
useProject({
...defaultProject,
id: 'env-pull-delta',
name: 'env-pull-delta',
});
client.setArgv('env', 'add', 'NEW_VAR', '--cwd', cwd);
const addPromise = env(client);
client.setArgv('env', 'add', 'NEW_VAR', '--cwd', cwd);
const addPromise = env(client);
await expect(client.stderr).toOutput('Whats the value of NEW_VAR?');
client.stdin.write('testvalue\n');
await expect(client.stderr).toOutput('Whats the value of NEW_VAR?');
client.stdin.write('testvalue\n');
await expect(client.stderr).toOutput(
'Add NEW_VAR to which Environments (select multiple)?'
);
client.stdin.write('\x1B[B'); // Down arrow
client.stdin.write('\x1B[B');
client.stdin.write(' ');
client.stdin.write('\r');
await expect(client.stderr).toOutput(
'Add NEW_VAR to which Environments (select multiple)?'
);
client.stdin.write('\x1B[B'); // Down arrow
client.stdin.write('\x1B[B');
client.stdin.write(' ');
client.stdin.write('\r');
await expect(addPromise).resolves.toEqual(0);
await expect(addPromise).resolves.toEqual(0);
client.setArgv('env', 'pull', '--yes', '--cwd', cwd);
const pullPromise = env(client);
await expect(client.stderr).toOutput(
'Downloading `development` Environment Variables for Project env-pull-delta'
);
await expect(client.stderr).toOutput('Updated .env file');
await expect(client.stderr).toOutput(
'+ NEW_VAR\n~ SPECIAL_FLAG\n- TEST\n'
);
client.setArgv('env', 'pull', '--yes', '--cwd', cwd);
const pullPromise = env(client);
await expect(client.stderr).toOutput(
'Downloading `development` Environment Variables for Project env-pull-delta'
);
await expect(client.stderr).toOutput(
'+ SPECIAL_FLAG (Updated)\n+ NEW_VAR\n- TEST\n'
);
await expect(client.stderr).toOutput('Updated .env file');
await expect(pullPromise).resolves.toEqual(0);
await expect(pullPromise).resolves.toEqual(0);
} finally {
client.setArgv('env', 'rm', 'NEW_VAR', '--yes', '--cwd', cwd);
await env(client);
}
});
it('should not show a delta string when it fails to read a file', async () => {
@@ -200,5 +205,22 @@ describe('env', () => {
await expect(client.stderr).toOutput('Updated .env file');
await expect(pullPromise).resolves.toEqual(0);
});
it('should show that no changes were found', async () => {
const cwd = setupFixture('vercel-env-pull-delta-no-changes');
useUser();
useTeams('team_dummy');
useProject({
...defaultProject,
id: 'env-pull-delta-no-changes',
name: 'env-pull-delta-no-changes',
});
client.setArgv('env', 'pull', '--yes', '--cwd', cwd);
const pullPromise = env(client);
await expect(client.stderr).toOutput('> No changes found.');
await expect(client.stderr).toOutput('Updated .env file');
await expect(pullPromise).resolves.toEqual(0);
});
});
});

View File

@@ -15,6 +15,17 @@ describe('inspect', () => {
);
});
it('should strip the scheme of a url', async () => {
const user = useUser();
const deployment = useDeployment({ creator: user });
client.setArgv('inspect', `http://${deployment.url}`);
const exitCode = await inspect(client);
expect(exitCode).toEqual(0);
await expect(client.stderr).toOutput(
`> Fetched deployment ${deployment.url} in ${user.username}`
);
});
it('should print error when deployment not found', async () => {
const user = useUser();
useDeployment({ creator: user });

View File

@@ -1,6 +1,6 @@
{
"name": "@vercel/client",
"version": "12.1.5",
"version": "12.1.10",
"main": "dist/index.js",
"typings": "dist/index.d.ts",
"homepage": "https://vercel.com",
@@ -42,8 +42,8 @@
]
},
"dependencies": {
"@vercel/build-utils": "5.0.6",
"@vercel/routing-utils": "2.0.0",
"@vercel/build-utils": "5.2.0",
"@vercel/routing-utils": "2.0.2",
"@zeit/fetch": "5.2.0",
"async-retry": "1.2.3",
"async-sema": "3.0.0",

View File

@@ -1,14 +1,14 @@
{
"name": "@vercel/edge",
"version": "0.0.1",
"version": "0.0.3",
"license": "MIT",
"main": "dist/index.js",
"module": "dist/index.mjs",
"types": "dist/index.d.ts",
"scripts": {
"build": "tsup src/index.ts --dts --format esm,cjs",
"test-unit": "jest",
"prepublishOnly": "yarn build"
"test": "jest --env node --verbose --runInBand --bail",
"test-unit": "yarn test"
},
"devDependencies": {
"@edge-runtime/jest-environment": "1.1.0-beta.7",

View File

@@ -1,6 +1,6 @@
{
"name": "@vercel/frameworks",
"version": "1.1.1",
"version": "1.1.3",
"main": "./dist/frameworks.js",
"types": "./dist/frameworks.d.ts",
"files": [
@@ -21,7 +21,7 @@
"@types/js-yaml": "3.12.1",
"@types/node": "12.0.4",
"@types/node-fetch": "2.5.8",
"@vercel/routing-utils": "2.0.0",
"@vercel/routing-utils": "2.0.2",
"ajv": "6.12.2",
"typescript": "4.3.4"
}

View File

@@ -1,6 +1,6 @@
{
"name": "@vercel/fs-detectors",
"version": "2.0.2",
"version": "2.0.5",
"description": "Vercel filesystem detectors",
"main": "./dist/index.js",
"types": "./dist/index.d.ts",
@@ -14,14 +14,13 @@
},
"license": "MIT",
"scripts": {
"prepublishOnly": "tsc",
"build": "tsc",
"test": "yarn jest --env node --verbose --runInBand --bail test/unit.*test.*",
"test": "jest --env node --verbose --runInBand --bail test/unit.*test.*",
"test-unit": "yarn test"
},
"dependencies": {
"@vercel/frameworks": "1.1.1",
"@vercel/routing-utils": "2.0.0",
"@vercel/frameworks": "1.1.3",
"@vercel/routing-utils": "2.0.2",
"glob": "8.0.3",
"js-yaml": "4.1.0",
"minimatch": "3.0.4",

View File

@@ -99,7 +99,7 @@ export async function detectBuilders(
const errors: ErrorResponse[] = [];
const warnings: ErrorResponse[] = [];
const apiBuilders: Builder[] = [];
let apiBuilders: Builder[] = [];
let frontendBuilder: Builder | null = null;
const functionError = validateFunctions(options);
@@ -305,6 +305,24 @@ export async function detectBuilders(
};
}
// Exclude the middleware builder for Next.js apps since @vercel/next
// will build middlewares.
//
// While maybeGetApiBuilder() already excludes the middleware builder,
// we still need to check whether this is a Next.js app here for the case
// where `projectSettings.framework == null`.
if (
framework === null &&
frontendBuilder?.use === '@vercel/next' &&
apiBuilders.length > 0
) {
apiBuilders = apiBuilders.filter(builder => {
const isMiddlewareBuilder =
builder.use === '@vercel/node' && builder.config?.middleware;
return !isMiddlewareBuilder;
});
}
const builders: Builder[] = [];
if (apiBuilders.length) {

View File

@@ -2281,6 +2281,31 @@ describe('Test `detectBuilders` with `featHandleMiss=true`', () => {
},
]);
});
it('should not add middleware builder when building "nextjs"', async () => {
const files = ['package.json', 'pages/index.ts', 'middleware.ts'];
const pkg = {
scripts: { build: 'next build' },
dependencies: { next: '12.2.0' },
};
const projectSettings = {
framework: null, // "Other" framework
createdAt: Date.parse('2020-02-01'),
};
const { builders } = await detectBuilders(files, pkg, {
projectSettings,
featHandleMiss,
});
expect(builders).toEqual([
{
use: '@vercel/next',
src: 'package.json',
config: {
zeroConfig: true,
},
},
]);
});
});
it('Test `detectRoutes`', async () => {

View File

@@ -12,10 +12,11 @@ import {
mkdirp,
move,
remove,
rmdir,
readdir,
} from 'fs-extra';
import {
BuildOptions,
Meta,
Files,
PrepareCacheOptions,
StartDevServerOptions,
@@ -72,25 +73,23 @@ async function initPrivateGit(credentials: string) {
* which works great for this feature. We also need to add a suffix during `vercel dev`
* since the entrypoint is already stripped of its suffix before build() is called.
*/
async function getRenamedEntrypoint(
entrypoint: string,
files: Files,
meta: Meta
) {
function getRenamedEntrypoint(entrypoint: string): string | undefined {
const filename = basename(entrypoint);
if (filename.startsWith('[')) {
const suffix = meta.isDev && !entrypoint.endsWith('.go') ? '.go' : '';
const newEntrypoint = entrypoint.replace('/[', '/now-bracket[') + suffix;
const file = files[entrypoint];
delete files[entrypoint];
files[newEntrypoint] = file;
const newEntrypoint = entrypoint.replace('/[', '/now-bracket[');
debug(`Renamed entrypoint from ${entrypoint} to ${newEntrypoint}`);
entrypoint = newEntrypoint;
return newEntrypoint;
}
return entrypoint;
return undefined;
}
type UndoFileAction = {
from: string;
to: string | undefined;
};
export const version = 3;
export async function build({
@@ -100,375 +99,456 @@ export async function build({
workPath,
meta = {},
}: BuildOptions) {
if (process.env.GIT_CREDENTIALS && !meta.isDev) {
debug('Initialize Git credentials...');
await initPrivateGit(process.env.GIT_CREDENTIALS);
}
if (process.env.GO111MODULE) {
console.log(`\nManually assigning 'GO111MODULE' is not recommended.
By default:
- 'GO111MODULE=on' If entrypoint package name is not 'main'
- 'GO111MODULE=off' If entrypoint package name is 'main'
We highly recommend you leverage Go Modules in your project.
Learn more: https://github.com/golang/go/wiki/Modules
`);
}
entrypoint = await getRenamedEntrypoint(entrypoint, files, meta);
const entrypointArr = entrypoint.split(sep);
// eslint-disable-next-line prefer-const
let [goPath, outDir] = await Promise.all([
getWriteableDirectory(),
getWriteableDirectory(),
]);
const forceMove = Boolean(meta.isDev);
const goPath = await getWriteableDirectory();
const srcPath = join(goPath, 'src', 'lambda');
let downloadPath = meta.isDev || meta.skipDownload ? workPath : srcPath;
let downloadedFiles = await download(files, downloadPath, meta);
const downloadPath = meta.skipDownload ? workPath : srcPath;
await download(files, downloadPath, meta);
// keep track of file system actions we need to undo
// the keys "from" and "to" refer to what needs to be done
// in order to undo the action, not what the original action was
const undoFileActions: UndoFileAction[] = [];
const undoDirectoryCreation: string[] = [];
debug(`Parsing AST for "${entrypoint}"`);
let analyzed: string;
try {
let goModAbsPathDir = '';
const fileName = 'go.mod';
if (fileName in downloadedFiles) {
goModAbsPathDir = dirname(downloadedFiles[fileName].fsPath);
debug(`Found ${fileName} file in "${goModAbsPathDir}"`);
} else if ('api/go.mod' in downloadedFiles) {
goModAbsPathDir = dirname(downloadedFiles['api/go.mod'].fsPath);
debug(`Found ${fileName} file in "${goModAbsPathDir}"`);
if (process.env.GIT_CREDENTIALS) {
debug('Initialize Git credentials...');
await initPrivateGit(process.env.GIT_CREDENTIALS);
}
analyzed = await getAnalyzedEntrypoint(
workPath,
downloadedFiles[entrypoint].fsPath,
goModAbsPathDir
);
} catch (err) {
console.log(`Failed to parse AST for "${entrypoint}"`);
throw err;
}
if (!analyzed) {
const err = new Error(
`Could not find an exported function in "${entrypoint}"
Learn more: https://vercel.com/docs/runtimes#official-runtimes/go
`
);
console.log(err.message);
throw err;
}
if (process.env.GO111MODULE) {
console.log(`\nManually assigning 'GO111MODULE' is not recommended.
const parsedAnalyzed = JSON.parse(analyzed) as Analyzed;
By default:
- 'GO111MODULE=on' If entrypoint package name is not 'main'
- 'GO111MODULE=off' If entrypoint package name is 'main'
if (meta.isDev) {
// Create cache so Go rebuilds fast with `vercel dev`
// Old versions of the CLI don't assign this property
const { devCacheDir = join(workPath, '.now', 'cache') } = meta;
goPath = join(devCacheDir, 'go', basename(entrypoint, '.go'));
const destLambda = join(goPath, 'src', 'lambda');
await download(downloadedFiles, destLambda);
downloadedFiles = await glob('**', destLambda);
downloadPath = destLambda;
}
We highly recommend you leverage Go Modules in your project.
Learn more: https://github.com/golang/go/wiki/Modules
`);
}
// find `go.mod` in downloadedFiles
const entrypointDirname = dirname(downloadedFiles[entrypoint].fsPath);
let isGoModExist = false;
let goModPath = '';
let isGoModInRootDir = false;
for (const file of Object.keys(downloadedFiles)) {
const { fsPath } = downloadedFiles[file];
const fileDirname = dirname(fsPath);
if (file === 'go.mod') {
isGoModExist = true;
isGoModInRootDir = true;
goModPath = fileDirname;
} else if (file.endsWith('go.mod')) {
if (entrypointDirname === fileDirname) {
isGoModExist = true;
goModPath = fileDirname;
debug(`Found file dirname equals entrypoint dirname: ${fileDirname}`);
break;
const renamedEntrypoint = getRenamedEntrypoint(entrypoint);
if (renamedEntrypoint) {
await move(join(workPath, entrypoint), join(workPath, renamedEntrypoint));
undoFileActions.push({
to: join(workPath, entrypoint),
from: join(workPath, renamedEntrypoint),
});
entrypoint = renamedEntrypoint;
}
const entrypointAbsolute = join(workPath, entrypoint);
const entrypointArr = entrypoint.split(sep);
debug(`Parsing AST for "${entrypoint}"`);
let analyzed: string;
try {
const goModAbsPath = await findGoModPath(workPath);
if (goModAbsPath) {
debug(`Found ${goModAbsPath}"`);
}
if (!isGoModInRootDir && config.zeroConfig && file === 'api/go.mod') {
// We didn't find `/go.mod` but we found `/api/go.mod` so move it to the root
analyzed = await getAnalyzedEntrypoint(
workPath,
entrypointAbsolute,
dirname(goModAbsPath)
);
} catch (err) {
console.log(`Failed to parse AST for "${entrypoint}"`);
throw err;
}
if (!analyzed) {
const err = new Error(
`Could not find an exported function in "${entrypoint}"
Learn more: https://vercel.com/docs/runtimes#official-runtimes/go
`
);
console.log(err.message);
throw err;
}
const parsedAnalyzed = JSON.parse(analyzed) as Analyzed;
// find `go.mod` in modFiles
const entrypointDirname = dirname(entrypointAbsolute);
let isGoModExist = false;
let goModPath = '';
let isGoModInRootDir = false;
const modFileRefs = await glob('**/*.mod', workPath);
const modFiles = Object.keys(modFileRefs);
for (const file of modFiles) {
const fileDirname = dirname(file);
if (file === 'go.mod') {
isGoModExist = true;
isGoModInRootDir = true;
goModPath = join(fileDirname, '..');
const pathParts = fsPath.split(sep);
pathParts.pop(); // Remove go.mod
pathParts.pop(); // Remove api
pathParts.push('go.mod');
const newFsPath = pathParts.join(sep);
debug(`Moving api/go.mod to root: ${fsPath} to ${newFsPath}`);
await move(fsPath, newFsPath, { overwrite: forceMove });
const oldSumPath = join(dirname(fsPath), 'go.sum');
const newSumPath = join(dirname(newFsPath), 'go.sum');
if (await pathExists(oldSumPath)) {
debug(`Moving api/go.sum to root: ${oldSumPath} to ${newSumPath}`);
await move(oldSumPath, newSumPath, { overwrite: forceMove });
goModPath = join(workPath, fileDirname);
} else if (file.endsWith('go.mod')) {
if (entrypointDirname === fileDirname) {
isGoModExist = true;
goModPath = join(workPath, fileDirname);
debug(`Found file dirname equals entrypoint dirname: ${fileDirname}`);
break;
}
if (!isGoModInRootDir && config.zeroConfig && file === 'api/go.mod') {
// We didn't find `/go.mod` but we found `/api/go.mod` so move it to the root
isGoModExist = true;
isGoModInRootDir = true;
goModPath = join(fileDirname, '..');
const pathParts = file.split(sep);
pathParts.pop(); // Remove go.mod
pathParts.pop(); // Remove api
pathParts.push('go.mod');
const newRoot = pathParts.join(sep);
const newFsPath = join(workPath, newRoot);
debug(`Moving api/go.mod to root: ${file} to ${newFsPath}`);
await move(file, newFsPath);
undoFileActions.push({
to: file,
from: newFsPath,
});
const oldSumPath = join(dirname(file), 'go.sum');
const newSumPath = join(dirname(newFsPath), 'go.sum');
if (await pathExists(oldSumPath)) {
debug(`Moving api/go.sum to root: ${oldSumPath} to ${newSumPath}`);
await move(oldSumPath, newSumPath);
undoFileActions.push({
to: oldSumPath,
from: newSumPath,
});
}
break;
}
break;
}
}
}
const input = entrypointDirname;
const includedFiles: Files = {};
const input = entrypointDirname;
const includedFiles: Files = {};
if (config && config.includeFiles) {
const patterns = Array.isArray(config.includeFiles)
? config.includeFiles
: [config.includeFiles];
for (const pattern of patterns) {
const fsFiles = await glob(pattern, input);
for (const [assetName, asset] of Object.entries(fsFiles)) {
includedFiles[assetName] = asset;
if (config && config.includeFiles) {
const patterns = Array.isArray(config.includeFiles)
? config.includeFiles
: [config.includeFiles];
for (const pattern of patterns) {
const fsFiles = await glob(pattern, input);
for (const [assetName, asset] of Object.entries(fsFiles)) {
includedFiles[assetName] = asset;
}
}
}
}
const handlerFunctionName = parsedAnalyzed.functionName;
debug(`Found exported function "${handlerFunctionName}" in "${entrypoint}"`);
if (!isGoModExist && 'vendor' in downloadedFiles) {
throw new Error('`go.mod` is required to use a `vendor` directory.');
}
// check if package name other than main
// using `go.mod` way building the handler
const packageName = parsedAnalyzed.packageName;
if (isGoModExist && packageName === 'main') {
throw new Error('Please change `package main` to `package handler`');
}
if (packageName !== 'main') {
const go = await createGo(
workPath,
goPath,
process.platform,
process.arch,
{
cwd: entrypointDirname,
},
true
const handlerFunctionName = parsedAnalyzed.functionName;
debug(
`Found exported function "${handlerFunctionName}" in "${entrypoint}"`
);
if (!isGoModExist) {
try {
const defaultGoModContent = `module ${packageName}`;
await writeFile(join(entrypointDirname, 'go.mod'), defaultGoModContent);
if (!isGoModExist) {
if (await pathExists(join(workPath, 'vendor'))) {
throw new Error('`go.mod` is required to use a `vendor` directory.');
}
}
// check if package name other than main
// using `go.mod` way building the handler
const packageName = parsedAnalyzed.packageName;
if (isGoModExist && packageName === 'main') {
throw new Error('Please change `package main` to `package handler`');
}
const outDir = await getWriteableDirectory();
if (packageName !== 'main') {
const go = await createGo(
workPath,
goPath,
process.platform,
process.arch,
{
cwd: entrypointDirname,
},
true
);
if (!isGoModExist) {
try {
const defaultGoModContent = `module ${packageName}`;
await writeFile(
join(entrypointDirname, 'go.mod'),
defaultGoModContent
);
undoFileActions.push({
to: undefined, // delete
from: join(entrypointDirname, 'go.mod'),
});
// remove the `go.sum` file that will be generated as well
undoFileActions.push({
to: undefined, // delete
from: join(entrypointDirname, 'go.sum'),
});
} catch (err) {
console.log(`Failed to create default go.mod for ${packageName}`);
throw err;
}
}
const mainModGoFileName = 'main__mod__.go';
const modMainGoContents = await readFile(
join(__dirname, mainModGoFileName),
'utf8'
);
let goPackageName = `${packageName}/${packageName}`;
const goFuncName = `${packageName}.${handlerFunctionName}`;
if (isGoModExist) {
const goModContents = await readFile(join(goModPath, 'go.mod'), 'utf8');
const usrModName = goModContents.split('\n')[0].split(' ')[1];
if (entrypointArr.length > 1 && isGoModInRootDir) {
const cleanPackagePath = [...entrypointArr];
cleanPackagePath.pop();
goPackageName = `${usrModName}/${cleanPackagePath.join('/')}`;
} else {
goPackageName = `${usrModName}/${packageName}`;
}
}
const mainModGoContents = modMainGoContents
.replace('__VC_HANDLER_PACKAGE_NAME', goPackageName)
.replace('__VC_HANDLER_FUNC_NAME', goFuncName);
if (isGoModExist && isGoModInRootDir) {
debug('[mod-root] Write main file to ' + downloadPath);
await writeFile(
join(downloadPath, mainModGoFileName),
mainModGoContents
);
undoFileActions.push({
to: undefined, // delete
from: join(downloadPath, mainModGoFileName),
});
} else if (isGoModExist && !isGoModInRootDir) {
debug('[mod-other] Write main file to ' + goModPath);
await writeFile(join(goModPath, mainModGoFileName), mainModGoContents);
undoFileActions.push({
to: undefined, // delete
from: join(goModPath, mainModGoFileName),
});
} else {
debug('[entrypoint] Write main file to ' + entrypointDirname);
await writeFile(
join(entrypointDirname, mainModGoFileName),
mainModGoContents
);
undoFileActions.push({
to: undefined, // delete
from: join(entrypointDirname, mainModGoFileName),
});
}
// move user go file to folder
try {
// default path
let finalDestination = join(entrypointDirname, packageName, entrypoint);
// if `entrypoint` include folder, only use filename
if (entrypointArr.length > 1) {
finalDestination = join(
entrypointDirname,
packageName,
entrypointArr[entrypointArr.length - 1]
);
}
if (dirname(entrypointAbsolute) === goModPath || !isGoModExist) {
debug(
`moving entrypoint "${entrypointAbsolute}" to "${finalDestination}"`
);
await move(entrypointAbsolute, finalDestination);
undoFileActions.push({
to: entrypointAbsolute,
from: finalDestination,
});
undoDirectoryCreation.push(dirname(finalDestination));
}
} catch (err) {
console.log(`Failed to create default go.mod for ${packageName}`);
console.log('Failed to move entry to package folder');
throw err;
}
let baseGoModPath = '';
if (isGoModExist && isGoModInRootDir) {
baseGoModPath = downloadPath;
} else if (isGoModExist && !isGoModInRootDir) {
baseGoModPath = goModPath;
} else {
baseGoModPath = entrypointDirname;
}
debug('Tidy `go.mod` file...');
try {
// ensure go.mod up-to-date
await go.mod();
} catch (err) {
console.log('failed to `go mod tidy`');
throw err;
}
debug('Running `go build`...');
const destPath = join(outDir, handlerFileName);
try {
const src = [join(baseGoModPath, mainModGoFileName)];
await go.build(src, destPath);
} catch (err) {
console.log('failed to `go build`');
throw err;
}
} else {
// legacy mode
// we need `main.go` in the same dir as the entrypoint,
// otherwise `go build` will refuse to build
const go = await createGo(
workPath,
goPath,
process.platform,
process.arch,
{
cwd: entrypointDirname,
},
false
);
const origianlMainGoContents = await readFile(
join(__dirname, 'main.go'),
'utf8'
);
const mainGoContents = origianlMainGoContents.replace(
'__VC_HANDLER_FUNC_NAME',
handlerFunctionName
);
// in order to allow the user to have `main.go`,
// we need our `main.go` to be called something else
const mainGoFileName = 'main__vc__go__.go';
// Go doesn't like to build files in different directories,
// so now we place `main.go` together with the user code
await writeFile(join(entrypointDirname, mainGoFileName), mainGoContents);
undoFileActions.push({
to: undefined, // delete
from: join(entrypointDirname, mainGoFileName),
});
// `go get` will look at `*.go` (note we set `cwd`), parse the `import`s
// and download any packages that aren't part of the stdlib
debug('Running `go get`...');
try {
await go.get();
} catch (err) {
console.log('Failed to `go get`');
throw err;
}
debug('Running `go build`...');
const destPath = join(outDir, handlerFileName);
try {
const src = [
join(entrypointDirname, mainGoFileName),
entrypointAbsolute,
].map(file => normalize(file));
await go.build(src, destPath);
} catch (err) {
console.log('failed to `go build`');
throw err;
}
}
const mainModGoFileName = 'main__mod__.go';
const modMainGoContents = await readFile(
join(__dirname, mainModGoFileName),
'utf8'
);
let goPackageName = `${packageName}/${packageName}`;
const goFuncName = `${packageName}.${handlerFunctionName}`;
if (isGoModExist) {
const goModContents = await readFile(join(goModPath, 'go.mod'), 'utf8');
const usrModName = goModContents.split('\n')[0].split(' ')[1];
if (entrypointArr.length > 1 && isGoModInRootDir) {
const cleanPackagePath = [...entrypointArr];
cleanPackagePath.pop();
goPackageName = `${usrModName}/${cleanPackagePath.join('/')}`;
} else {
goPackageName = `${usrModName}/${packageName}`;
}
}
const mainModGoContents = modMainGoContents
.replace('__VC_HANDLER_PACKAGE_NAME', goPackageName)
.replace('__VC_HANDLER_FUNC_NAME', goFuncName);
if (isGoModExist && isGoModInRootDir) {
debug('[mod-root] Write main file to ' + downloadPath);
await writeFile(join(downloadPath, mainModGoFileName), mainModGoContents);
} else if (isGoModExist && !isGoModInRootDir) {
debug('[mod-other] Write main file to ' + goModPath);
await writeFile(join(goModPath, mainModGoFileName), mainModGoContents);
} else {
debug('[entrypoint] Write main file to ' + entrypointDirname);
await writeFile(
join(entrypointDirname, mainModGoFileName),
mainModGoContents
);
}
// move user go file to folder
try {
// default path
let finalDestination = join(entrypointDirname, packageName, entrypoint);
// if `entrypoint` include folder, only use filename
if (entrypointArr.length > 1) {
finalDestination = join(
entrypointDirname,
packageName,
entrypointArr[entrypointArr.length - 1]
);
}
if (
dirname(downloadedFiles[entrypoint].fsPath) === goModPath ||
!isGoModExist
) {
await move(downloadedFiles[entrypoint].fsPath, finalDestination, {
overwrite: forceMove,
});
}
} catch (err) {
console.log('Failed to move entry to package folder');
throw err;
}
let baseGoModPath = '';
if (isGoModExist && isGoModInRootDir) {
baseGoModPath = downloadPath;
} else if (isGoModExist && !isGoModInRootDir) {
baseGoModPath = goModPath;
} else {
baseGoModPath = entrypointDirname;
}
if (meta.isDev) {
const isGoModBk = await pathExists(join(baseGoModPath, 'go.mod.bk'));
if (isGoModBk) {
await move(
join(baseGoModPath, 'go.mod.bk'),
join(baseGoModPath, 'go.mod'),
{ overwrite: forceMove }
);
await move(
join(baseGoModPath, 'go.sum.bk'),
join(baseGoModPath, 'go.sum'),
{ overwrite: forceMove }
);
}
}
debug('Tidy `go.mod` file...');
try {
// ensure go.mod up-to-date
await go.mod();
} catch (err) {
console.log('failed to `go mod tidy`');
throw err;
}
debug('Running `go build`...');
const destPath = join(outDir, handlerFileName);
try {
const src = [join(baseGoModPath, mainModGoFileName)];
await go.build(src, destPath);
} catch (err) {
console.log('failed to `go build`');
throw err;
}
if (meta.isDev) {
// caching for `vercel dev`
await move(
join(baseGoModPath, 'go.mod'),
join(baseGoModPath, 'go.mod.bk'),
{ overwrite: forceMove }
);
await move(
join(baseGoModPath, 'go.sum'),
join(baseGoModPath, 'go.sum.bk'),
{ overwrite: forceMove }
);
}
} else {
// legacy mode
// we need `main.go` in the same dir as the entrypoint,
// otherwise `go build` will refuse to build
const go = await createGo(
workPath,
goPath,
process.platform,
process.arch,
{
cwd: entrypointDirname,
},
false
);
const origianlMainGoContents = await readFile(
join(__dirname, 'main.go'),
'utf8'
);
const mainGoContents = origianlMainGoContents.replace(
'__VC_HANDLER_FUNC_NAME',
handlerFunctionName
);
// in order to allow the user to have `main.go`,
// we need our `main.go` to be called something else
const mainGoFileName = 'main__vc__go__.go';
// Go doesn't like to build files in different directories,
// so now we place `main.go` together with the user code
await writeFile(join(entrypointDirname, mainGoFileName), mainGoContents);
// `go get` will look at `*.go` (note we set `cwd`), parse the `import`s
// and download any packages that aren't part of the stdlib
debug('Running `go get`...');
try {
await go.get();
} catch (err) {
console.log('Failed to `go get`');
throw err;
}
debug('Running `go build`...');
const destPath = join(outDir, handlerFileName);
try {
const src = [
join(entrypointDirname, mainGoFileName),
downloadedFiles[entrypoint].fsPath,
].map(file => normalize(file));
await go.build(src, destPath);
} catch (err) {
console.log('failed to `go build`');
throw err;
}
}
const lambda = await createLambda({
files: { ...(await glob('**', outDir)), ...includedFiles },
handler: handlerFileName,
runtime: 'go1.x',
environment: {},
});
const watch = parsedAnalyzed.watch;
let watchSub: string[] = [];
// if `entrypoint` located in subdirectory
// we will need to concat it with return watch array
if (entrypointArr.length > 1) {
entrypointArr.pop();
watchSub = parsedAnalyzed.watch.map(file => join(...entrypointArr, file));
}
return {
output: lambda,
watch: watch.concat(watchSub),
};
} catch (error) {
debug('Go Builder Error: ' + error);
throw error;
} finally {
try {
await cleanupFileSystem(undoFileActions, undoDirectoryCreation);
} catch (error) {
console.log(`Build cleanup failed: ${error.message}`);
debug('Cleanup Error: ' + error);
}
}
}
async function cleanupFileSystem(
undoFileActions: UndoFileAction[],
undoDirectoryCreation: string[]
) {
// we have to undo the actions in reverse order in cases
// where one file was moved multiple times, which happens
// using files that start with brackets
for (const action of undoFileActions.reverse()) {
if (action.to) {
await move(action.from, action.to);
} else {
await remove(action.from);
}
}
const undoDirectoryPromises = undoDirectoryCreation.map(async directory => {
const contents = await readdir(directory);
// only delete an empty directory
// if it has contents, either something went wrong during cleanup or this
// directory contains project source code that should not be deleted
if (!contents.length) {
return rmdir(directory);
}
return undefined;
});
await Promise.all(undoDirectoryPromises);
}
async function findGoModPath(workPath: string): Promise<string> {
let checkPath = join(workPath, 'go.mod');
if (await pathExists(checkPath)) {
return checkPath;
}
checkPath = join(workPath, 'api/go.mod');
if (await pathExists(checkPath)) {
return checkPath;
}
return '';
}
function isPortInfo(v: any): v is PortInfo {
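The `cleanupFileSystem` helper above replays recorded file actions in reverse and then removes any directories the builder created, so the temporary renames and generated files never leak into the user's project. A minimal sketch of that undo-log pattern, assuming fs-extra and an illustrative `withUndo` wrapper that is not part of the builder:

```
import { move, remove } from 'fs-extra';

interface UndoFileAction {
  from: string;
  to?: string; // undefined means the file was created and should simply be removed
}

// Run `work`, recording every file action it performs; roll everything back
// in reverse order afterwards (a file moved twice must be restored through
// its intermediate location first).
async function withUndo<T>(
  work: (undo: UndoFileAction[]) => Promise<T>
): Promise<T> {
  const undo: UndoFileAction[] = [];
  try {
    return await work(undo);
  } finally {
    for (const action of undo.reverse()) {
      if (action.to) {
        await move(action.from, action.to);
      } else {
        await remove(action.from);
      }
    }
  }
}
```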

View File

@@ -1,6 +1,6 @@
{
"name": "@vercel/go",
"version": "2.0.10",
"version": "2.0.15",
"license": "MIT",
"main": "./dist/index",
"homepage": "https://vercel.com/docs/runtimes#official-runtimes/go",
@@ -11,9 +11,8 @@
},
"scripts": {
"build": "node build",
"test": "yarn jest --env node --verbose --runInBand --bail",
"test-integration-once": "yarn test",
"prepublishOnly": "node build"
"test": "jest --env node --verbose --runInBand --bail",
"test-integration-once": "yarn test"
},
"files": [
"dist"
@@ -25,8 +24,8 @@
"@types/fs-extra": "^5.0.5",
"@types/node-fetch": "^2.3.0",
"@types/tar": "^4.0.0",
"@vercel/build-utils": "5.0.6",
"@vercel/ncc": "0.34.0",
"@vercel/build-utils": "5.2.0",
"@vercel/ncc": "0.24.0",
"async-retry": "1.3.1",
"execa": "^1.0.0",
"fs-extra": "^7.0.0",

View File

@@ -7,7 +7,7 @@ import (
)
// Handler function
func Handler(w http.ResponseWriter, r *http.Request) {
func HandlerAnother(w http.ResponseWriter, r *http.Request) {
bts, err := ioutil.ReadFile("templates/another.txt")
if err != nil {
http.Error(w, err.Error(), http.StatusInternalServerError)

View File

@@ -6,5 +6,5 @@ import (
)
func Handler(w http.ResponseWriter, r *http.Request) {
fmt.Fprintf(w, "hello:RANDOMNESS_PLACEHOLDER")
fmt.Fprintf(w, "Req Path: %s", r.URL.Path)
}

View File

@@ -1,10 +1,13 @@
{
"version": 2,
"builds": [{ "src": "api/**/*.go", "use": "@vercel/go" }],
"probes": [
{
"path": "/api/[hello].go",
"mustContain": "hello:RANDOMNESS_PLACEHOLDER"
"mustContain": "Req Path: /api/[hello].go"
},
{
"path": "/api/whatever",
"mustContain": "Req Path: /api/whatever"
},
{ "path": "/api/sub/[hi].go", "mustContain": "hi:RANDOMNESS_PLACEHOLDER" }
]
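Each entry in `probes` is exercised by the integration harness: it requests `path` on the deployed fixture and asserts that the response body contains `mustContain`. A rough sketch of such a check, assuming Node 18+ for the global `fetch` (the repo's real helper may differ):

```
type Probe = { path: string; mustContain: string };

// Fetch the probe path on the deployment and assert the expected substring.
async function runProbe(origin: string, probe: Probe): Promise<void> {
  const res = await fetch(`${origin}${probe.path}`);
  const body = await res.text();
  if (!body.includes(probe.mustContain)) {
    throw new Error(
      `Probe ${probe.path} failed: body did not contain "${probe.mustContain}"`
    );
  }
}
```

With the fixture above, `runProbe(deploymentUrl, { path: '/api/whatever', mustContain: 'Req Path: /api/whatever' })` should pass once the handler echoes the request path.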

View File

@@ -7,15 +7,10 @@ const {
jest.setTimeout(4 * 60 * 1000);
const skipFixtures = ['08-include-files'];
const fixturesPath = path.resolve(__dirname, 'fixtures');
// eslint-disable-next-line no-restricted-syntax
for (const fixture of fs.readdirSync(fixturesPath)) {
if (skipFixtures.includes(fixture)) {
console.log(`Skipping test fixture ${fixture}`);
continue;
}
// eslint-disable-next-line no-loop-func
it(`should build ${fixture}`, async () => {
await expect(

View File

@@ -1,6 +1,6 @@
{
"name": "@vercel/hydrogen",
"version": "0.0.7",
"version": "0.0.12",
"license": "MIT",
"main": "./dist/index.js",
"homepage": "https://vercel.com/docs",
@@ -12,8 +12,7 @@
"scripts": {
"build": "node build.js",
"test-integration-once": "yarn test test/test.js",
"test": "jest --env node --verbose --bail --runInBand",
"prepublishOnly": "node build.js"
"test": "jest --env node --verbose --bail --runInBand"
},
"files": [
"dist",
@@ -22,7 +21,7 @@
"devDependencies": {
"@types/jest": "27.5.1",
"@types/node": "*",
"@vercel/build-utils": "5.0.6",
"@vercel/build-utils": "5.2.0",
"typescript": "4.6.4"
}
}

View File

@@ -1,6 +1,6 @@
{
"name": "@vercel/next",
"version": "3.1.10",
"version": "3.1.15",
"license": "MIT",
"main": "./dist/index",
"homepage": "https://vercel.com/docs/runtimes#official-runtimes/next-js",
@@ -11,8 +11,7 @@
"test-unit": "yarn test test/build.test.ts test/unit/",
"test-next-local": "jest --env node --verbose --bail --forceExit --testTimeout=360000 test/integration/*.test.js test/integration/*.test.ts",
"test-next-local:middleware": "jest --env node --verbose --bail --useStderr --testTimeout=360000 test/integration/middleware.test.ts",
"test-integration-once": "rm test/builder-info.json; jest --env node --verbose --runInBand --bail test/fixtures/**/*.test.js",
"prepublishOnly": "yarn build"
"test-integration-once": "rm test/builder-info.json; jest --env node --verbose --runInBand --bail test/fixtures/**/*.test.js"
},
"repository": {
"type": "git",
@@ -45,9 +44,9 @@
"@types/semver": "6.0.0",
"@types/text-table": "0.2.1",
"@types/webpack-sources": "3.2.0",
"@vercel/build-utils": "5.0.6",
"@vercel/build-utils": "5.2.0",
"@vercel/nft": "0.21.0",
"@vercel/routing-utils": "2.0.0",
"@vercel/routing-utils": "2.0.2",
"async-sema": "3.0.1",
"buffer-crc32": "0.2.13",
"cheerio": "1.0.0-rc.10",

View File

@@ -151,6 +151,7 @@ export async function serverBuild({
nextVersion,
CORRECTED_MANIFESTS_VERSION
);
let hasStatic500 = !!staticPages[path.join(entryDirectory, '500')];
if (lambdaPageKeys.length === 0) {
@@ -1209,10 +1210,6 @@ export async function serverBuild({
]
: []),
// ensure prerender's for notFound: true static routes
// have 404 status code when not in preview mode
...notFoundPreviewRoutes,
...headers,
...redirects,
@@ -1223,6 +1220,10 @@ export async function serverBuild({
...beforeFilesRewrites,
// ensure prerender's for notFound: true static routes
// have 404 status code when not in preview mode
...notFoundPreviewRoutes,
// Make sure to 404 for the /404 path itself
...(i18n
? [
@@ -1328,7 +1329,8 @@ export async function serverBuild({
dest: '$0',
},
// remove locale prefixes to check public files
// remove locale prefixes to check public files and
// to allow checking non-prefixed lambda outputs
...(i18n
? [
{
@@ -1341,20 +1343,6 @@ export async function serverBuild({
]
: []),
// for non-shared lambdas remove locale prefix if present
// to allow checking for lambda
...(!i18n
? []
: [
{
src: `${path.join('/', entryDirectory, '/')}(?:${i18n?.locales
.map(locale => escapeStringRegexp(locale))
.join('|')})/(.*)`,
dest: '/$1',
check: true,
},
]),
// routes that are called after each rewrite or after routes
// if there no rewrites
{ handle: 'rewrite' },
@@ -1362,8 +1350,54 @@ export async function serverBuild({
// re-build /_next/data URL after resolving
...denormalizeNextDataRoute(),
...(isNextDataServerResolving
? dataRoutes.filter(route => {
// filter to only static data routes as dynamic routes will be handled
// below
const { pathname } = new URL(route.dest || '/', 'http://n');
return !isDynamicRoute(pathname.replace(/\.json$/, ''));
})
: []),
// /_next/data routes for getServerProps/getStaticProps pages
...dataRoutes,
...(isNextDataServerResolving
? // when resolving data routes for middleware we need to include
// all dynamic routes including non-SSG/SSP so that the priority
// is correct
dynamicRoutes
.map(route => {
route = Object.assign({}, route);
route.src = path.posix.join(
'^/',
entryDirectory,
'_next/data/',
escapedBuildId,
route.src.replace(/(^\^|\$$)/g, '') + '.json$'
);
const { pathname } = new URL(route.dest || '/', 'http://n');
let isPrerender = !!prerenders[path.join('./', pathname)];
if (routesManifest.i18n) {
for (const locale of routesManifest.i18n?.locales || []) {
const prerenderPathname = pathname.replace(
/^\/\$nextLocale/,
`/${locale}`
);
if (prerenders[path.join('./', prerenderPathname)]) {
isPrerender = true;
break;
}
}
}
if (isPrerender) {
route.dest = `/_next/data/${buildId}${pathname}.json`;
}
return route;
})
.filter(Boolean)
: dataRoutes),
...(!isNextDataServerResolving
? [
@@ -1395,6 +1429,7 @@ export async function serverBuild({
'x-nextjs-matched-path': '/$1',
},
continue: true,
override: true,
},
// add a catch-all data route so we don't 404 when getting
// middleware effects
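When `isNextDataServerResolving` is set, the hunk above re-anchors every dynamic page route under `/_next/data/<buildId>/…` so that middleware-resolved data requests match with the correct priority. A simplified, self-contained illustration of just the `src` rewrite (prerender handling omitted; parameter names mirror the diff):

```
import * as path from 'path';

// Strip the original ^ and $ anchors from a route regex, then re-anchor it
// under /_next/data/<buildId>/ with a .json suffix.
function toDataRouteSrc(
  src: string,
  entryDirectory: string,
  escapedBuildId: string
): string {
  return path.posix.join(
    '^/',
    entryDirectory,
    '_next/data/',
    escapedBuildId,
    src.replace(/(^\^|\$$)/g, '') + '.json$'
  );
}
```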

View File

@@ -0,0 +1,8 @@
const path = require('path');
const { deployAndTest } = require('../../utils');
describe(`${__dirname.split(path.sep).pop()}`, () => {
it('should deploy and pass probe checks', async () => {
await deployAndTest(__dirname);
});
});

Some files were not shown because too many files have changed in this diff.