Compare commits

..

78 Commits

Author SHA1 Message Date
Joe Haddad
c1049985af Publish
- @now/bash@0.1.3-canary.0
 - @now/build-utils@0.4.37-canary.0
 - @now/next@0.0.85-canary.8
2019-02-26 14:33:47 -05:00
Nathan Rajlich
214388ccf3 [now-bash] Use base64 --decode flag
Instead of `-d`. It seems the short flag for BSD's `base64` is `-D`;
however, the long option name is consistent between both versions.
2019-02-26 11:28:45 -08:00
Joe Haddad
b1d6b7bfc0 feat(@now/next): optional assets directory (#240)
* Make assets part of the build output

* Add console output for user
2019-02-26 14:18:03 -05:00
Nathan Rajlich
ece3564dfd [now-build-utils] Use os.tmpdir() for getWritableDirectory() (#238)
`os.tmpdir()` abstracts away platform differences related to retrieving
the writable temp directory; for example, macOS returns a user-specific
temp directory instead of `/tmp`.

On AWS Lambda, [it returns `/tmp`](https://nexec-v2.n8.io/api/node?arg=-p&arg=require(%27os%27).tmpdir()).

More importantly, it removes the `__dirname/tmp` special case when not
running in prod. This was problematic for `now dev` because the module
directory is not always writable (e.g. for a `pkg` binary).
2019-02-25 16:56:41 -08:00
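For illustration, a minimal Node.js sketch (not part of the change itself) of the approach described above: deriving a unique writable directory from `os.tmpdir()` rather than a hard-coded path, mirroring the random-name scheme shown in the diff further below.

```js
// Sketch: resolve a unique writable directory under the platform temp root.
// os.tmpdir() is '/tmp' on AWS Lambda and a user-specific directory on macOS.
const os = require('os');
const fs = require('fs');
const path = require('path');

const name = Math.floor(Math.random() * 0x7fffffff).toString(16);
const dir = path.join(os.tmpdir(), name);
fs.mkdirSync(dir, { recursive: true });
console.log('writable directory:', dir);
```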
Igor Klopov
a88af1f077 Publish
- @now/build-utils@0.4.36
2019-02-24 23:26:14 +03:00
Igor Klopov
d92f7b26c0 lint fixes for build-utils 2019-02-24 23:08:09 +03:00
Igor Klopov
52198af750 Publish
- @now/build-utils@0.4.35-canary.3
2019-02-24 11:56:01 +03:00
Igor Klopov
d58bff2453 [build-utils] chmod +x before running user script 2019-02-24 11:50:07 +03:00
Joe Haddad
8c0a144ae4 Publish
- @now/next@0.0.85-canary.7
2019-02-22 17:13:46 -05:00
Joe Haddad
106e4d5f36 Cache Next.js node_modules for faster rebuilds (#235)
* Add `node_modules` caching for Next.js projects

* Use existing work directory instead of redownloading dependencies

* Reorder fns

* Add prettier configuration

* Move entry files to cache folder

* Correct glob base path

* Be sure to include .yarn-integrity

* Save `package.json` and `yarn.lock`, too

* Increase compatibility so Yarn can dedupe
2019-02-22 16:30:29 -05:00
Nathan Rajlich
66c28bd695 Publish
- @now/next@0.0.85-canary.6
 - @now/node-server@0.5.0-canary.3
 - @now/node@0.5.0-canary.5
2019-02-21 13:20:49 -08:00
Nathan Rajlich
55e75296ff [now-next] Update @now/node-bridge to v1.0.0-canary.2 (#232)
Update `@now/next` to use the latest `@now/node-bridge`, which removes the hard-coded port 3000 that was previously in place and was problematic for `now dev`. Now the `http.Server` instance listens on an ephemeral port that is detected by the Now Node.js runtime.

(Similar to 6101ba9d95 and 8dc0c92c58)
2019-02-21 12:55:33 +01:00
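For context, a minimal sketch of the resulting launcher pattern, assuming the `@now/node-bridge` API shown later in this changeset (`new Bridge(server)` plus `bridge.listen()`); the request handler here is a placeholder.

```js
// Sketch: the server binds to an ephemeral port and the bridge resolves
// the address once the server is listening; no hard-coded port 3000.
const { Server } = require('http');
const { Bridge } = require('@now/node-bridge');

const server = new Server((req, res) => {
  res.end('ok');
});
const bridge = new Bridge(server);
bridge.listen(); // 127.0.0.1, port 0 (ephemeral)

// The lambda entrypoint is the bridge's launcher function.
exports.launcher = bridge.launcher;
```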
Nathan Rajlich
36cbb36737 Add --tags flag when determining if commit is tagged
This fixes matching for "unannotated tags".

See: https://stackoverflow.com/questions/1474115/how-to-find-the-tag-associated-with-a-given-git-commit#comment9135284_1474161
2019-02-20 18:11:11 -08:00
Nathan Rajlich
978ca328ef Publish
- @now/node@0.5.0-canary.4
 - @now/node-server@0.5.0-canary.2
2019-02-20 16:49:58 -08:00
Nathan Rajlich
7b383e0f7c [now-node] Fix build.sh script 2019-02-20 16:40:52 -08:00
Nathan Rajlich
faa5ab36aa Publish
- @now/node@0.5.0-canary.3
2019-02-20 16:32:31 -08:00
Nathan Rajlich
c0a21969dd Regenerate yarn.lock file
Not really sure why this is happening, since `@now/node` and
`@now/node-server` both use this version, but without this commit the
publish step on CircleCI fails due to a dirty working tree.

Go figure 🤷
2019-02-20 16:31:04 -08:00
Nathan Rajlich
73d0a1723f [now-node] Ensure that @now/node-bridge is installed for build 2019-02-20 14:59:24 -08:00
Nathan Rajlich
7c515544ae [now-node] Don't commit the bridge.d.ts file 2019-02-20 14:52:28 -08:00
Nathan Rajlich
b53c9a6299 Publish
- @now/node@0.5.0-canary.2
2019-02-20 14:51:02 -08:00
Nathan Rajlich
35ff11e6e4 Remove prepublish step for now
It's causing these kinds of failures:

https://circleci.com/gh/zeit/now-builders/839
2019-02-20 14:50:18 -08:00
Nathan Rajlich
64ee4905cd Publish
- @now/node-server@0.5.0-canary.1
 - @now/node@0.5.0-canary.1
2019-02-20 13:53:37 -08:00
Nathan Rajlich
e50dd7e50a Bump now/node and now/node-server to v0.5.0 canary 2019-02-20 13:53:00 -08:00
Nathan Rajlich
6101ba9d95 [now-node-server] Update @now/node-bridge to v1.0.0-canary.2 (#231)
Update `@now/node-server` to use the latest `@now/node-bridge`, which
removes the hard-coded port 3000 that was previously in place and
was problematic for `now dev`. Now the `http.Server`
instance listens on an ephemeral port that is detected by the
Now Node.js runtime.
2019-02-20 13:47:17 -08:00
Nathan Rajlich
8dc0c92c58 [now-node] Migrate to TypeScript and use @now/node-bridge v1 (#212)
This moves `@now/node` to being implemented in TypeScript.

It also updates it to use the latest `@now/node-bridge`, which
removes the hard-coded port 3000 that was previously in place and
was problematic for `now dev`. Now the `http.Server`
instance listens on an ephemeral port that is detected by the
Now Node.js runtime.
2019-02-20 12:33:21 -08:00
Nathan Rajlich
44c9f3765a Publish
- @now/node-bridge@1.0.0-canary.2
 - @now/node-server@0.4.27-canary.7
 - @now/node@0.4.29-canary.7
2019-02-19 20:49:58 -08:00
Nathan Rajlich
92c05ca338 Pin @now/node-bridge to v0.1.11-canary.0
To address: https://github.com/zeit/now-builders/pull/224#issuecomment-465410905
2019-02-19 20:26:38 -08:00
Nathan Rajlich
069b557906 Publish
- @now/node-bridge@1.0.0-canary.1
2019-02-19 17:26:04 -08:00
Nathan Rajlich
692a0df909 Publish
- @now/node-bridge@0.1.11-canary.1
 - @now/node-server@0.4.27-canary.6
 - @now/node@0.4.29-canary.6
2019-02-19 16:33:18 -08:00
Nathan Rajlich
aeafeb5441 [now-node-bridge] Refactor API to be http.Server focused (#224)
* [now-node-bridge] Refactor API to be `http.Server` focused

This commit refactors the `@now/node-bridge` helper module to work with
`http.Server` instances directly, instead of expecting the `port` to be
set.

The `Bridge` instance calls the `listen()` function on the server instance
which binds to an ephemeral port. This is especially important for
`now dev`, where using a hard-coded port will cause port conflicts for
multiple lambdas using the same builder.

Also converts to TypeScript and adds some basic unit tests.

Example usage:

```js
const server = new Server(() => {});
const bridge = new Bridge(server);
bridge.listen();

const info = await bridge.listening;
assert.equal(info.address, '127.0.0.1');
assert.equal(typeof info.port, 'number');
```

* Update `yarn.lock`

* Add `pretest` script

* Enable TypeScript `strict` mode

* Throw if a string is returned from `server.address()`

Defensive programming ftw

Co-Authored-By: TooTallNate <n@n8.io>

* Prettier

* Prettier

* Fixes

* Attempt to fix CI

* Add `files` array to package.json

* Check for the `Action` property to avoid type casting

Co-Authored-By: TooTallNate <n@n8.io>

* Split the normalizing functions into separate ones

Also adds additional unit tests.

* export the `NowProxyRequest` and `NowProxyResponse`

* Remove last `as` casting

* Move some up

* Debug CircleCI tests :/

* Fix "Invoke" check

* Attempt to fix CircleCI again

* Fix bad

* Convert tests to use `jest`
2019-02-19 16:17:58 -08:00
Nathan Rajlich
a09d5fb355 Add publish.sh script for CircleCI (#228)
* Add `publish.sh` script for CircleCI

Centralize the publishing logic for CircleCI to use.

There was a bug in the previous "Potentially publish stable release"
branch, such that it would be executed for non-tagged commits. The new
publish script checks if there are indeed any tags for the commit, and
bails if there are none.

As an additional bonus, there's now only one publish step in the
CircleCI config, because the publish script determines stable vs. canary
based on the tag name.

* Remove `echo`
2019-02-19 14:45:04 -08:00
Nathan Rajlich
d8017aa9aa Publish
- @now/python@0.0.41-canary.2
2019-02-19 12:14:49 -08:00
Honza Javorek
702f56b9b5 [now-python] Add missing base64 import and fix coding style (#226) 2019-02-19 12:11:12 -08:00
Nathan Rajlich
183b117152 Publish
- @now/python@0.0.41-canary.1
2019-02-19 11:29:26 -08:00
Nathaniel Hill
75b3fb4981 [now-python] Pass the request body to the HTTP request (#99)
* pass body to lambda

* reformat

* look for requirements.txt in lambda root

* single issue PR

* support base64 encoded request body

* support local requirements.txt

* formatting cleanup

* Update packages/now-python/now_handler.py

Co-Authored-By: NathanielHill <nata@goguna.com>

* Avoid error on empty POST body

Just realized I forgot to address the empty POST body error. This should do it

* quick formatting fix

remove double quotes for consistency
2019-02-19 11:28:21 -08:00
Steven
49e63de5fe Publish
- @now/next@0.0.85-canary.5
 - @now/node-server@0.4.27-canary.5
 - @now/node@0.4.29-canary.5
2019-02-17 19:09:27 -05:00
Steven
4742cd32f2 [now-node] add sourceMap flag to ncc (#222)
This adds a `{ sourceMap: true }` option to ncc builds.
2019-02-17 19:08:19 -05:00
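For reference, a hedged sketch of passing that option through ncc's programmatic API; the entry path is a placeholder and the builder's exact invocation may differ.

```js
// Sketch: bundle an entrypoint with ncc and request a source map.
const ncc = require('@zeit/ncc');

ncc('./index.js', { sourceMap: true }).then(({ code, map }) => {
  console.log('bundled bytes:', code.length);
  console.log('source map generated:', Boolean(map));
});
```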
Steven
377b73105d [now-node] bump ncc to 0.15.2 (#221)
This has a couple fixes.

See release notes here https://github.com/zeit/ncc/releases/tag/0.15.2
2019-02-17 23:55:55 +01:00
Connor Davis
a5577efb3d Prepublish @now/next 2019-02-15 17:07:14 -06:00
Steven
2ec46dc5c9 Publish
- @now/node-server@0.4.27-canary.4
 - @now/node@0.4.29-canary.4
 - @now/rust@0.0.3-canary.2
2019-02-15 12:24:06 -05:00
Steven
42708ed93c [now-node] bump ncc to 0.15.1 (#219)
This fixes a regression introduced in ncc@0.13.2 and also prints the version of ncc used at build time.

See [release notes](https://github.com/zeit/ncc/releases/tag/0.15.1) for more info.
2019-02-15 17:48:58 +01:00
Antonio Nuno Monteiro
2fabe95f6e [now-rust] Build .rs entrypoints (#213) 2019-02-15 15:10:51 +00:00
Nathan Rajlich
ac1a3dab22 Publish
- @now/build-utils@0.4.35-canary.2
 - @now/go@0.2.13-canary.1
 - @now/rust@0.0.3-canary.1
2019-02-14 17:45:31 -08:00
Nathan Rajlich
ad4011512d [now-go] Refactor to be platform-aware for now dev (#216)
* [now-go] Refactor to be platform-aware for `now dev`

This is mostly to have `now dev` work, but also just a general cleanup
of the builder codebase.

Previously, the `get-exported-function-name` Go binary was being
compiled at publish-time for "linux x64" architecture, since that
is what AWS Lambda expects.

In order for `@now/go` to work on macOS (or any other platform not
compatible with "linux x64"), the helper program must be compiled by the
actual host machine, so that it builds, for example, a macOS binary.
So building `get-exported-function-name` has been moved from the
"publish" step to the "postinstall" step.

Similarly, the `go` binary itself used to be downloaded explicitly by
the `build()` function at build time. This step has been moved to
the "postinstall" script as well, so that the helper binary can be compiled there.

This also makes sense because the `@now/go` builder will be downloaded
by `now dev` once, so this way the `go` binary only needs to be
downloaded once as well, rather than once per build.

Any assumptions about `GOOS` and `GOARCH` have also been removed.

* Don't lint the `go` tarball directory

* Use GitHub URL for the go bridge

* Tweak comments

* Use `glob()` instead of `FileFsRef`

`FileFsRef` appears to lose executable permissions on the file,
whereas `glob()` works as expected.
2019-02-14 17:36:33 -08:00
Nathan Rajlich
9ff1a25c8f [now-build-utils] Add FileFsRef.fromFsPath() helper function (#217)
* [now-build-utils] Add `FileFsRef.fromFile()` helper function

This helper function is very similar to the raw `FileFsRef` constructor,
however it stats the file to retrieve the proper `mode`, so that
settings like executable bits are not lost.

* Rename to `FileFsRef.fromFsPath()`
2019-02-14 17:09:01 -08:00
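A short usage sketch of the new helper; the require path and file path are illustrative.

```js
// Sketch: build a FileFsRef from a path on disk so the file's mode
// (e.g. the executable bit on a shell script) is preserved.
const FileFsRef = require('@now/build-utils/file-fs-ref.js');

async function main() {
  const ref = await FileFsRef.fromFsPath({ fsPath: './build.sh' });
  console.log(ref.mode.toString(8)); // mode read via lstat, executable bits intact
}

main().catch(console.error);
```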
Antonio Nuno Monteiro
8039b3d377 [now-rust] Publish 0.1.2 to crates.io (#215) 2019-02-14 15:03:14 -08:00
Connor Davis
dd9017475c Publish
- @now/lambda@0.4.10-canary.1
 - @now/md@0.4.10-canary.1
 - @now/mdx-deck@0.4.19-canary.1
 - @now/next@0.0.85-canary.4
 - @now/node-server@0.4.27-canary.3
 - @now/node@0.4.29-canary.3
 - @now/php@0.4.14-canary.1
 - @now/static-build@0.4.19-canary.1
2019-02-12 21:41:52 -06:00
Connor Davis
031499014f Give tests 4 minutes instead of 2 2019-02-12 21:40:51 -06:00
Connor Davis
2a68d2a2ad Publish
- @now/build-utils@0.4.35-canary.1
 - @now/next@0.0.85-canary.3
 - @now/rust@0.0.3-canary.0
2019-02-12 21:00:19 -06:00
Connor Davis
31299fae6e Force canary bump in @now/next 2019-02-12 20:59:42 -06:00
Connor Davis
4bac0db379 fix @now/build-utils publishing tmp/ 2019-02-12 20:35:12 -06:00
Connor Davis
95e7d459d3 Increase timeout on now-build-utils 2019-02-12 19:39:15 -06:00
Antonio Nuno Monteiro
dd120b8d20 [now-rust] Implement simpler API for *.rs entrypoints (#210) 2019-02-12 13:21:03 -08:00
Mike Engel
b6975676e5 [now-rust] Import Deserialize traits behind de and ser (#203)
See https://github.com/serde-rs/serde/issues/1441
2019-02-12 13:16:14 -08:00
Igor Klopov
a7951dae81 [tests] produce public deployments 2019-02-12 22:07:10 +03:00
Connor Davis
b0c918f7fb Publish
- @now/next@0.0.85-canary.2
2019-02-11 20:28:52 -06:00
Connor Davis
df54dc7dc9 Download Entire Directory when Building now-next (#200) 2019-02-11 20:23:26 -06:00
paulogdm
0dd801ff6c Update now-next-legacy-mode.md (#208)
* Update now-next-legacy-mode.md

From `canary` to `latest`.

* Update now-next-no-serverless-pages-built.md

From `canary` to `latest`

* Apply suggestions from code review

Better words...

Co-Authored-By: paulogdm <paulogdemitri@gmail.com>
2019-02-11 14:41:42 +01:00
Connor Davis
398743ef95 Fix tests not completing 2019-02-08 20:03:32 -06:00
Nathan Rajlich
337c74b81b Publish
- @now/bash@0.1.2
2019-02-08 15:28:30 -08:00
Nathan Rajlich
680bb82ec3 [now-bash] Use mktemp dry run syntax that is compatible with BSD (#204)
For `now dev` running on a macOS machine, some of our builders will need
to be adjusted to be "platform-aware".

In this case, `mktemp` is provided by BSD instead of GNU, which doesn't
support the long-form `--dry-run` flag; however, both support the `-u`
flag, which does the same thing.
2019-02-08 15:25:51 -08:00
Nathan Rajlich
17ed5411e3 Publish
- @now/rust@0.0.2
2019-02-06 12:58:51 -08:00
Nathan Rajlich
d9bbcb6939 Publish
- @now/rust@0.0.2-canary.2
 - @now/static-build@0.4.19-canary.0
2019-02-05 12:06:03 -08:00
Igor Klopov
800e4de76f [tests] lint fixes 2019-02-05 22:35:18 +03:00
António Nuno Monteiro
864dd468d9 Place generated @now/rust lambdas in the location that matches their entrypoint (#198) 2019-02-05 11:31:41 -08:00
Igor Klopov
ba833871bb [tests] merge build.env from now.json 2019-02-05 22:15:02 +03:00
Arunoda Susiripal
e732bac78e @now/static-build@0.4.18 2019-02-05 02:27:26 +00:00
Arunoda Susiripal
28ea4015b4 Publish
- @now/static-build@0.4.18-canary.3
2019-02-05 02:12:15 +00:00
Arunoda Susiripal
a93d97cabd Use proper test code for now-static-build 2019-02-05 02:10:25 +00:00
Arunoda Susiripal
67f39f7c9b Publish
- @now/static-build@0.4.18-canary.2
2019-02-05 01:58:26 +00:00
Arunoda Susiripal
acd793b9e9 Fix a typo in a message of now-static-build 2019-02-05 01:57:27 +00:00
Arunoda Susiripal
f74d61279d Publish
- @now/static-build@0.4.18-canary.1
2019-02-05 01:50:45 +00:00
Arunoda Susiripala
fcb8eacec0 Check for distDir existence inside now-static-build (#197)
* Throw in case we don't see the distDir

* Add test cases.
2019-02-05 07:18:30 +05:30
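A minimal sketch of the kind of check described; the names and error message are illustrative, not the builder's exact implementation.

```js
// Sketch: throw a clear error when the expected output directory is missing
// after the user's build script has run.
const fs = require('fs');
const path = require('path');

function assertDistDirExists(workPath, distDir = 'dist') {
  const fullPath = path.join(workPath, distDir);
  if (!fs.existsSync(fullPath) || !fs.statSync(fullPath).isDirectory()) {
    throw new Error(
      `Build completed, but no "${distDir}" directory was found in the output.`
    );
  }
  return fullPath;
}
```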
Nathan Rajlich
c8fca2ba72 Publish
- @now/build-utils@0.4.35-canary.0
 - @now/rust@0.0.2-canary.1
2019-02-04 14:44:28 -08:00
António Nuno Monteiro
4feffa13eb Fix @now/rust for nested paths (#194)
* Fix `@now/rust` for nested paths

* fix bug

* fs-extra

* no need for mkdirp or rimraf anymore
2019-02-04 14:40:41 -08:00
Igor Klopov
3e330b25f4 Publish 2019-02-04 21:38:37 +03:00
Igor Klopov
9b2cae33af [build-utils] removed in-memory yarn cache 2019-02-04 19:45:46 +03:00
António Nuno Monteiro
4b6371530c Publish initial version to crates.io (#193) 2019-02-01 00:11:45 +01:00
89 changed files with 2086 additions and 2629 deletions

12
.circleci/build.sh Executable file
View File

@@ -0,0 +1,12 @@
#!/bin/bash
set -euo pipefail
circleci_dir="$( cd "$( dirname "${BASH_SOURCE[0]}" )" >/dev/null 2>&1 && pwd )"
needs_build="$(grep -rn '"build"' packages/*/package.json | cut -d: -f1)"
for pkg in $needs_build; do
dir="$(dirname "$pkg")"
cd "$circleci_dir/../$dir"
echo "Building \`$dir\`"
yarn build
done

View File

@@ -23,6 +23,9 @@ jobs:
- run:
name: Linting
command: yarn lint
- run:
name: Building
command: ./.circleci/build.sh
- run:
name: Tests
command: yarn test
@@ -30,11 +33,8 @@ jobs:
name: Potentially save npm token
command: "([[ ! -z $NPM_TOKEN ]] && echo \"//registry.npmjs.org/:_authToken=$NPM_TOKEN\" >> ~/.npmrc) || echo \"Did not write npm token\""
- run:
name: Potentially publish canary release
command: "if ls ~/.npmrc >/dev/null 2>&1 && [[ $(git describe --exact-match 2> /dev/null || :) =~ -canary ]]; then yarn run lerna publish from-git --npm-tag canary --yes; else echo \"Did not publish\"; fi"
- run:
name: Potentially publish stable release
command: "if ls ~/.npmrc >/dev/null 2>&1 && [[ ! $(git describe --exact-match 2> /dev/null || :) =~ -canary ]]; then yarn run lerna publish from-git --yes; else echo \"Did not publish\"; fi"
name: Potentially publish releases to npm
command: ./.circleci/publish.sh
workflows:
version: 2
build-and-deploy:

24
.circleci/publish.sh Executable file
View File

@@ -0,0 +1,24 @@
#!/bin/bash
set -euo pipefail
if [ ! -e ~/.npmrc ]; then
echo "~/.npmrc file does not exist, skipping publish"
exit 0
fi
npm_tag=""
tag="$(git describe --tags --exact-match 2> /dev/null || :)"
if [ -z "$tag" ]; then
echo "Not a tagged commit, skipping publish"
exit 0
fi
if [[ "$tag" =~ -canary ]]; then
echo "Publishing canary release"
npm_tag="--npm-tag canary"
else
echo "Publishing stable release"
fi
yarn run lerna publish from-git $npm_tag --yes

View File

@@ -1,3 +1,4 @@
/tmp/*
/node_modules/*
/**/node_modules/*
/packages/now-go/go/*

3
.gitignore vendored
View File

@@ -1,3 +1,4 @@
node_modules
tmp
target/
target/
.next

3
.prettierrc.json Normal file
View File

@@ -0,0 +1,3 @@
{
"singleQuote": true
}

View File

@@ -34,10 +34,10 @@ Serverless:
In order to create the smallest possible lambdas Next.js has to be configured to build for the `serverless` target.
1. Serverless Next.js requires Next.js 8 or later, currently this version is out on the `canary` release channel:
1. Serverless Next.js requires Next.js 8 or later, to upgrade you can install the `latest` version:
```
npm install next@canary
npm install next --save
```
2. Add the `now-build` script to your `package.json`

View File

@@ -8,10 +8,10 @@ This error occurs when you have your application is not configured for Serverles
In order to create the smallest possible lambdas Next.js has to be configured to build for the `serverless` target.
1. Serverless Next.js requires Next.js 8 or later, currently this version is out on the `canary` release channel:
1. Serverless Next.js requires Next.js 8 or later, to upgrade you can install the `latest` version:
```
npm install next@canary
npm install next --save
```
2. Add the `now-build` script to your `package.json`

View File

@@ -15,13 +15,13 @@
"publish-stable": "lerna version",
"publish-canary": "lerna version prerelease --preid canary",
"lint": "tsc && eslint .",
"test": "jest --runInBand",
"test": "jest --runInBand --verbose",
"lint-staged": "lint-staged"
},
"pre-commit": "lint-staged",
"lint-staged": {
"*.js": [
"prettier --write --single-quote",
"prettier --write",
"eslint --fix",
"git add"
]

View File

@@ -1,6 +1,6 @@
{
"name": "@now/bash",
"version": "0.1.2-canary.0",
"version": "0.1.3-canary.0",
"description": "Now 2.0 builder for HTTP endpoints written in Bash",
"main": "index.js",
"author": "Nathan Rajlich <nate@zeit.co>",

View File

@@ -52,7 +52,7 @@ _lambda_runtime_next() {
# Need to use a fifo here instead of bash <() because Lambda
# errors with "/dev/fd/63 not found" for some reason :/
local stdin
stdin="$(mktemp --dry-run)"
stdin="$(mktemp -u)"
mkfifo "$stdin"
_lambda_runtime_body "$event" > "$stdin" &
@@ -76,7 +76,7 @@ _lambda_runtime_next() {
_lambda_runtime_body() {
if [ "$(jq --raw-output '.body | type' < "$1")" = "string" ]; then
if [ "$(jq --raw-output '.encoding' < "$1")" = "base64" ]; then
jq --raw-output '.body' < "$1" | base64 -d
jq --raw-output '.body' < "$1" | base64 --decode
else
# assume plain-text body
jq --raw-output '.body' < "$1"

View File

@@ -1 +1,2 @@
/test
tmp

View File

@@ -26,6 +26,18 @@ class FileFsRef {
this.fsPath = fsPath;
}
/**
* Creates a `FileFsRef` with the correct `mode` from the file system.
*
* @argument {Object} options
* @argument {string} options.fsPath
* @returns {Promise<FileFsRef>}
*/
static async fromFsPath({ fsPath }) {
const { mode } = await fs.lstat(fsPath);
return new FileFsRef({ mode, fsPath });
}
/**
* @argument {Object} options
* @argument {number} [options.mode=0o100644]

View File

@@ -1,242 +0,0 @@
/* eslint-disable arrow-body-style,no-multi-assign,no-param-reassign */
const MemoryFileSystem = require('memory-fs');
const fs = require('fs');
const path = require('path');
const { spawnSync } = require('child_process');
const yarnPath = spawnSync('which', ['yarn'])
.stdout.toString()
.trim();
const cachePath = spawnSync(yarnPath, ['cache', 'dir'])
.stdout.toString()
.trim();
spawnSync(yarnPath, ['cache', 'clean']);
const vfs = new MemoryFileSystem();
function isInsideCachePath(filename) {
const relative = path.relative(cachePath, filename);
return !relative.startsWith('..');
}
function replaceFn(name, newFnFactory) {
const prevFn = fs[name];
fs[name] = newFnFactory(prevFn);
}
replaceFn('createWriteStream', (prevFn) => {
return (...args) => {
const filename = args[0];
if (!isInsideCachePath(filename)) {
return prevFn.call(fs, ...args);
}
const stream = vfs.createWriteStream(...args);
stream.on('finish', () => {
setTimeout(() => {
stream.emit('close');
});
});
setTimeout(() => {
stream.emit('open');
});
return stream;
};
});
replaceFn('readFile', (prevFn) => {
return (...args) => {
const filename = args[0];
if (!isInsideCachePath(filename)) {
return prevFn.call(fs, ...args);
}
const callback = args[args.length - 1];
return vfs.readFile(...args.slice(0, -1), (error, result) => {
if (error) {
return prevFn.call(fs, ...args);
}
return callback(error, result);
});
};
});
replaceFn('readdir', (prevFn) => {
return (...args) => {
const dirname = args[0];
if (!isInsideCachePath(dirname)) {
return prevFn.call(fs, ...args);
}
const callback = args[args.length - 1];
return prevFn.call(fs, dirname, (error, results) => {
if (error) {
results = [];
}
return vfs.readdir(dirname, (error2, results2) => {
if (error2) {
return callback(error2);
}
// eslint-disable-next-line no-restricted-syntax
for (const result2 of results2) {
if (!results.includes(result2)) {
results.push(result2);
}
}
return callback(error2, results);
});
});
};
});
replaceFn('stat', (prevFn) => {
return (...args) => {
const filename = args[0];
if (!isInsideCachePath(filename)) {
return prevFn.call(fs, ...args);
}
const callback = args[args.length - 1];
return vfs.stat(...args.slice(0, -1), (error, result) => {
if (error) {
return prevFn.call(fs, ...args);
}
result.atime = result.mtime = new Date();
return callback(error, result);
});
};
});
replaceFn('lstat', (prevFn) => {
return (...args) => {
const filename = args[0];
if (!isInsideCachePath(filename)) {
return prevFn.call(fs, ...args);
}
const callback = args[args.length - 1];
return vfs.stat(...args.slice(0, -1), (error, result) => {
if (error) {
return prevFn.call(fs, ...args);
}
result.atime = result.mtime = new Date();
return callback(error, result);
});
};
});
replaceFn('exists', (prevFn) => {
return (...args) => {
const filename = args[0];
if (!isInsideCachePath(filename)) {
return prevFn.call(fs, ...args);
}
const callback = args[args.length - 1];
return vfs.exists(...args.slice(0, -1), (result) => {
if (!result) {
return prevFn.call(fs, ...args);
}
return callback(result);
});
};
});
replaceFn('copyFile', (prevFn) => {
return (...args) => {
const src = args[0];
const dest = args[1];
const callback = args[args.length - 1];
if (isInsideCachePath(src) && !isInsideCachePath(dest)) {
const buffer = vfs.readFileSync(src);
return fs.writeFile(dest, buffer, callback);
}
if (!isInsideCachePath(src) && isInsideCachePath(dest)) {
const buffer = fs.readFileSync(src);
return vfs.writeFile(dest, buffer, callback);
}
return prevFn.call(fs, ...args);
};
});
replaceFn('writeFile', (prevFn) => {
return (...args) => {
const filename = args[0];
if (!isInsideCachePath(filename)) {
return prevFn.call(fs, ...args);
}
return vfs.writeFile(...args);
};
});
replaceFn('mkdir', (prevFn) => {
return (...args) => {
const dirname = args[0];
if (!isInsideCachePath(dirname)) {
return prevFn.call(fs, ...args);
}
const callback = args[args.length - 1];
return prevFn.call(fs, dirname, (error) => {
if (error) {
return callback(error);
}
return vfs.mkdirp(dirname, callback);
});
};
});
replaceFn('utimes', (prevFn) => {
return (...args) => {
const filename = args[0];
if (!isInsideCachePath(filename)) {
return prevFn.call(fs, ...args);
}
const callback = args[args.length - 1];
return setTimeout(callback, 0);
};
});
replaceFn('chmod', (prevFn) => {
return (...args) => {
const filename = args[0];
if (!isInsideCachePath(filename)) {
return prevFn.call(fs, ...args);
}
const callback = args[args.length - 1];
return setTimeout(callback, 0);
};
});
replaceFn('chown', (prevFn) => {
return (...args) => {
const filename = args[0];
if (!isInsideCachePath(filename)) {
return prevFn.call(fs, ...args);
}
const callback = args[args.length - 1];
return setTimeout(callback, 0);
};
});
require(yarnPath);

View File

@@ -1,12 +1,10 @@
const path = require('path');
const fs = require('fs-extra');
const prod = process.env.AWS_EXECUTION_ENV || process.env.X_GOOGLE_CODE_LOCATION;
const TMP_PATH = prod ? '/tmp' : path.join(__dirname, 'tmp');
const { join } = require('path');
const { tmpdir } = require('os');
const { mkdirp } = require('fs-extra');
module.exports = async function getWritableDirectory() {
const name = Math.floor(Math.random() * 0x7fffffff).toString(16);
const directory = path.join(TMP_PATH, name);
await fs.mkdirp(directory);
const directory = join(tmpdir(), name);
await mkdirp(directory);
return directory;
};

View File

@@ -3,9 +3,6 @@ const fs = require('fs-extra');
const path = require('path');
const { spawn } = require('child_process');
const prod = process.env.AWS_EXECUTION_ENV
|| process.env.X_GOOGLE_CODE_LOCATION;
function spawnAsync(command, args, cwd) {
return new Promise((resolve, reject) => {
const child = spawn(command, args, { stdio: 'inherit', cwd });
@@ -16,9 +13,18 @@ function spawnAsync(command, args, cwd) {
});
}
async function chmodPlusX(fsPath) {
const s = await fs.stat(fsPath);
const newMode = s.mode | 64 | 8 | 1; // eslint-disable-line no-bitwise
if (s.mode === newMode) return;
const base8 = newMode.toString(8).slice(-3);
await fs.chmod(fsPath, base8);
}
async function runShellScript(fsPath) {
assert(path.isAbsolute(fsPath));
const destPath = path.dirname(fsPath);
await chmodPlusX(fsPath);
await spawnAsync(`./${path.basename(fsPath)}`, [], destPath);
return true;
}
@@ -66,15 +72,6 @@ async function installDependencies(destPath, args = []) {
commandArgs = args.filter(a => a !== '--prefer-offline');
await spawnAsync('npm', ['install'].concat(commandArgs), destPath);
await spawnAsync('npm', ['cache', 'clean', '--force'], destPath);
} else if (prod) {
console.log('using memory-fs for yarn cache');
await spawnAsync(
'node',
[path.join(__dirname, 'bootstrap-yarn.js'), '--cwd', destPath].concat(
commandArgs,
),
destPath,
);
} else {
await spawnAsync('yarn', ['--cwd', destPath].concat(commandArgs), destPath);
await spawnAsync('yarn', ['cache', 'clean'], destPath);

View File

@@ -1,6 +1,6 @@
{
"name": "@now/build-utils",
"version": "0.4.33-canary.2",
"version": "0.4.37-canary.0",
"license": "MIT",
"repository": {
"type": "git",

View File

@@ -7,7 +7,7 @@ const {
testDeployment,
} = require('../../../test/lib/deployment/test-deployment.js');
jest.setTimeout(2 * 60 * 1000);
jest.setTimeout(4 * 60 * 1000);
const builderUrl = '@canary';
let buildUtilsUrl;

View File

@@ -1,4 +1,5 @@
node_modules
*.log
launcher
bin
/?.js
/go
/get-exported-function-name

View File

@@ -1,7 +0,0 @@
#!/usr/bin/env bash
mkdir -p bin
cd util
GOOS=linux GOARCH=amd64 go build get-exported-function-name.go
mv get-exported-function-name ../bin/

View File

@@ -1,23 +0,0 @@
const path = require('path');
const fetch = require('node-fetch');
const tar = require('tar');
const getWritableDirectory = require('@now/build-utils/fs/get-writable-directory.js'); // eslint-disable-line import/no-extraneous-dependencies
const url = 'https://dl.google.com/go/go1.11.1.linux-amd64.tar.gz';
module.exports = async () => {
const res = await fetch(url);
const dir = await getWritableDirectory();
if (!res.ok) {
throw new Error(`Failed to download: ${url}`);
}
return new Promise((resolve, reject) => {
res.body
.on('error', reject)
.pipe(tar.extract({ cwd: dir, strip: 1 }))
.on('finish', () => resolve(path.join(dir, 'bin', 'go')));
});
};

View File

@@ -0,0 +1,124 @@
const tar = require('tar');
const execa = require('execa');
const fetch = require('node-fetch');
const { mkdirp } = require('fs-extra');
const { dirname, join } = require('path');
const debug = require('debug')('@now/go:go-helpers');
const archMap = new Map([['x64', 'amd64'], ['x86', '386']]);
const platformMap = new Map([['win32', 'windows']]);
// Location where the `go` binary will be installed after `postinstall`
const GO_DIR = join(__dirname, 'go');
const GO_BIN = join(GO_DIR, 'bin/go');
const getPlatform = p => platformMap.get(p) || p;
const getArch = a => archMap.get(a) || a;
const getGoUrl = (version, platform, arch) => {
const goArch = getArch(arch);
const goPlatform = getPlatform(platform);
const ext = platform === 'win32' ? 'zip' : 'tar.gz';
return `https://dl.google.com/go/go${version}.${goPlatform}-${goArch}.${ext}`;
};
function getExportedFunctionName(filePath) {
debug('Detecting handler name for %o', filePath);
const bin = join(__dirname, 'get-exported-function-name');
const args = [filePath];
const name = execa.stdout(bin, args);
debug('Detected exported name %o', filePath);
return name;
}
// Creates a `$GOPATH` directory tree, as per `go help gopath` instructions.
// Without this, `go` won't recognize the `$GOPATH`.
function createGoPathTree(goPath, platform, arch) {
const tuple = `${getPlatform(platform)}_${getArch(arch)}`;
debug('Creating GOPATH directory structure for %o (%s)', goPath, tuple);
return Promise.all([
mkdirp(join(goPath, 'bin')),
mkdirp(join(goPath, 'pkg', tuple)),
]);
}
async function get({ src } = {}) {
const args = ['get'];
if (src) {
debug('Fetching `go` dependencies for file %o', src);
args.push(src);
} else {
debug('Fetching `go` dependencies for cwd %o', this.cwd);
}
await this(...args);
}
async function build({ src, dest }) {
debug('Building `go` binary %o -> %o', src, dest);
let sources;
if (Array.isArray(src)) {
sources = src;
} else {
sources = [src];
}
await this('build', '-o', dest, ...sources);
}
async function createGo(
goPath,
platform = process.platform,
arch = process.arch,
opts = {},
) {
const env = {
...process.env,
PATH: `${dirname(GO_BIN)}:${process.env.PATH}`,
GOPATH: goPath,
...opts.env,
};
function go(...args) {
debug('Exec %o', `go ${args.join(' ')}`);
return execa('go', args, { stdio: 'inherit', ...opts, env });
}
go.cwd = opts.cwd || process.cwd();
go.get = get;
go.build = build;
go.goPath = goPath;
await createGoPathTree(goPath, platform, arch);
return go;
}
async function downloadGo(
dir = GO_DIR,
version = '1.11.5',
platform = process.platform,
arch = process.arch,
) {
debug('Installing `go` v%s to %o for %s %s', version, dir, platform, arch);
const url = getGoUrl(version, platform, arch);
debug('Downloading `go` URL: %o', url);
const res = await fetch(url);
if (!res.ok) {
throw new Error(`Failed to download: ${url} (${res.status})`);
}
// TODO: use a zip extractor when `ext === "zip"`
await mkdirp(dir);
await new Promise((resolve, reject) => {
res.body
.on('error', reject)
.pipe(tar.extract({ cwd: dir, strip: 1 }))
.on('error', reject)
.on('finish', resolve);
});
return createGo(dir, platform, arch);
}
module.exports = {
createGo,
downloadGo,
getExportedFunctionName,
};

View File

@@ -1,79 +1,52 @@
const path = require('path');
const { mkdirp, readFile, writeFile } = require('fs-extra');
const { join, dirname } = require('path');
const { readFile, writeFile } = require('fs-extra');
const execa = require('execa');
const glob = require('@now/build-utils/fs/glob.js'); // eslint-disable-line import/no-extraneous-dependencies
const download = require('@now/build-utils/fs/download.js'); // eslint-disable-line import/no-extraneous-dependencies
const { createLambda } = require('@now/build-utils/lambda.js'); // eslint-disable-line import/no-extraneous-dependencies
const getWritableDirectory = require('@now/build-utils/fs/get-writable-directory.js'); // eslint-disable-line import/no-extraneous-dependencies
const download = require('@now/build-utils/fs/download.js'); // eslint-disable-line import/no-extraneous-dependencies
const downloadGit = require('lambda-git');
const glob = require('@now/build-utils/fs/glob.js'); // eslint-disable-line import/no-extraneous-dependencies
const downloadGoBin = require('./download-go-bin');
const { createGo, getExportedFunctionName } = require('./go-helpers');
// creates a `$GOPATH` directory tree, as per
// `go help gopath`'s instructions.
// without this, Go won't recognize the `$GOPATH`
async function createGoPathTree(goPath) {
await mkdirp(path.join(goPath, 'bin'));
await mkdirp(path.join(goPath, 'pkg', 'linux_amd64'));
}
exports.config = {
const config = {
maxLambdaSize: '10mb',
};
exports.build = async ({ files, entrypoint }) => {
console.log('downloading files...');
async function build({ files, entrypoint }) {
console.log('Downloading user files...');
const gitPath = await getWritableDirectory();
const goPath = await getWritableDirectory();
const srcPath = path.join(goPath, 'src', 'lambda');
const outDir = await getWritableDirectory();
await createGoPathTree(goPath);
const [goPath, outDir] = await Promise.all([
getWritableDirectory(),
getWritableDirectory(),
]);
const srcPath = join(goPath, 'src', 'lambda');
const downloadedFiles = await download(files, srcPath);
console.log('downloading go binary...');
const goBin = await downloadGoBin();
console.log('downloading git binary...');
// downloads a git binary that works on Amazon Linux and sets
// `process.env.GIT_EXEC_PATH` so `go(1)` can see it
await downloadGit({ targetDirectory: gitPath });
const goEnv = {
...process.env,
GOOS: 'linux',
GOARCH: 'amd64',
GOPATH: goPath,
};
console.log(`parsing AST for "${entrypoint}"`);
let handlerFunctionName = '';
console.log(`Parsing AST for "${entrypoint}"`);
let handlerFunctionName;
try {
handlerFunctionName = await execa.stdout(
path.join(__dirname, 'bin', 'get-exported-function-name'),
[downloadedFiles[entrypoint].fsPath],
handlerFunctionName = await getExportedFunctionName(
downloadedFiles[entrypoint].fsPath,
);
} catch (err) {
console.log(`failed to parse AST for "${entrypoint}"`);
console.log(`Failed to parse AST for "${entrypoint}"`);
throw err;
}
if (handlerFunctionName === '') {
const e = new Error(
`Could not find an exported function on "${entrypoint}"`,
if (!handlerFunctionName) {
const err = new Error(
`Could not find an exported function in "${entrypoint}"`,
);
console.log(e.message);
throw e;
console.log(err.message);
throw err;
}
console.log(
`Found exported function "${handlerFunctionName}" on "${entrypoint}"`,
`Found exported function "${handlerFunctionName}" in "${entrypoint}"`,
);
const origianlMainGoContents = await readFile(
path.join(__dirname, 'main.go'),
join(__dirname, 'main.go'),
'utf8',
);
const mainGoContents = origianlMainGoContents.replace(
@@ -85,39 +58,33 @@ exports.build = async ({ files, entrypoint }) => {
// we need `main.go` in the same dir as the entrypoint,
// otherwise `go build` will refuse to build
const entrypointDirname = path.dirname(downloadedFiles[entrypoint].fsPath);
const entrypointDirname = dirname(downloadedFiles[entrypoint].fsPath);
// Go doesn't like to build files in different directories,
// so now we place `main.go` together with the user code
await writeFile(path.join(entrypointDirname, mainGoFileName), mainGoContents);
await writeFile(join(entrypointDirname, mainGoFileName), mainGoContents);
console.log('installing dependencies');
// `go get` will look at `*.go` (note we set `cwd`), parse
// the `import`s and download any packages that aren't part of the stdlib
const go = await createGo(goPath, process.platform, process.arch, {
cwd: entrypointDirname,
});
// `go get` will look at `*.go` (note we set `cwd`), parse the `import`s
// and download any packages that aren't part of the stdlib
try {
await execa(goBin, ['get'], {
env: goEnv,
cwd: entrypointDirname,
stdio: 'inherit',
});
await go.get();
} catch (err) {
console.log('failed to `go get`');
throw err;
}
console.log('running go build...');
console.log('Running `go build`...');
const destPath = join(outDir, 'handler');
try {
await execa(
goBin,
[
'build',
'-o',
path.join(outDir, 'handler'),
path.join(entrypointDirname, mainGoFileName),
downloadedFiles[entrypoint].fsPath,
],
{ env: goEnv, cwd: entrypointDirname, stdio: 'inherit' },
);
const src = [
join(entrypointDirname, mainGoFileName),
downloadedFiles[entrypoint].fsPath,
];
await go.build({ src, dest: destPath });
} catch (err) {
console.log('failed to `go build`');
throw err;
@@ -133,4 +100,6 @@ exports.build = async ({ files, entrypoint }) => {
return {
[entrypoint]: lambda,
};
};
}
module.exports = { config, build };

View File

@@ -1,8 +1,8 @@
package main
import (
now "../../utils/go/bridge"
"net/http"
now "github.com/zeit/now-builders/utils/go/bridge"
)
func main() {

View File

@@ -1,6 +1,6 @@
{
"name": "@now/go",
"version": "0.2.13-canary.0",
"version": "0.2.13-canary.1",
"license": "MIT",
"repository": {
"type": "git",
@@ -8,25 +8,19 @@
"directory": "packages/now-go"
},
"scripts": {
"test": "best -I test/*.js",
"prepublish": "./build.sh"
"postinstall": "node ./util/install"
},
"files": [
"bin",
"download-go-bin.js",
"index.js",
"main.go"
"*.js",
"main.go",
"util"
],
"dependencies": {
"debug": "^4.1.1",
"execa": "^1.0.0",
"fs-extra": "^7.0.0",
"lambda-git": "^0.1.2",
"mkdirp-promise": "5.0.1",
"node-fetch": "^2.2.1",
"tar": "4.4.6"
},
"devDependencies": {
"@zeit/best": "0.4.3",
"rmfr": "2.0.0"
}
}

View File

@@ -0,0 +1,18 @@
const { join } = require('path');
const { downloadGo } = require('../go-helpers');
async function main() {
// First download the `go` binary for this platform/arch.
const go = await downloadGo();
// Build the `get-exported-function-name` helper program.
// `go get` is not necessary because the program has no external deps.
const src = join(__dirname, 'get-exported-function-name.go');
const dest = join(__dirname, '../get-exported-function-name');
await go.build({ src, dest });
}
main().catch((err) => {
console.error(err);
process.exit(1);
});

File diff suppressed because it is too large

View File

@@ -1,6 +1,6 @@
{
"name": "@now/lambda",
"version": "0.4.10-canary.0",
"version": "0.4.10-canary.1",
"license": "MIT",
"repository": {
"type": "git",

View File

@@ -7,7 +7,7 @@ const {
testDeployment,
} = require('../../../test/lib/deployment/test-deployment.js');
jest.setTimeout(2 * 60 * 1000);
jest.setTimeout(4 * 60 * 1000);
const buildUtilsUrl = '@canary';
let builderUrl;

View File

@@ -1,6 +1,6 @@
{
"name": "@now/md",
"version": "0.4.10-canary.0",
"version": "0.4.10-canary.1",
"license": "MIT",
"repository": {
"type": "git",

View File

@@ -7,7 +7,7 @@ const {
testDeployment,
} = require('../../../test/lib/deployment/test-deployment.js');
jest.setTimeout(2 * 60 * 1000);
jest.setTimeout(4 * 60 * 1000);
const buildUtilsUrl = '@canary';
let builderUrl;

View File

@@ -1,6 +1,6 @@
{
"name": "@now/mdx-deck",
"version": "0.4.19-canary.0",
"version": "0.4.19-canary.1",
"license": "MIT",
"repository": {
"type": "git",

View File

@@ -7,7 +7,7 @@ const {
testDeployment,
} = require('../../../test/lib/deployment/test-deployment.js');
jest.setTimeout(2 * 60 * 1000);
jest.setTimeout(4 * 60 * 1000);
const buildUtilsUrl = '@canary';
let builderUrl;

View File

@@ -9,15 +9,14 @@ const {
runPackageJsonScript,
} = require('@now/build-utils/fs/run-user-scripts.js'); // eslint-disable-line import/no-extraneous-dependencies
const glob = require('@now/build-utils/fs/glob.js'); // eslint-disable-line import/no-extraneous-dependencies
const fs = require('fs-extra');
const semver = require('semver');
const nextLegacyVersions = require('./legacy-versions');
const {
excludeFiles,
validateEntrypoint,
includeOnlyEntryDirectory,
moveEntryDirectoryToRoot,
normalizePackageJson,
excludeStaticDirectory,
onlyStaticDirectory,
} = require('./utils');
@@ -33,15 +32,17 @@ const {
/**
* Read package.json from files
* @param {DownloadedFiles} files
* @param {string} entryPath
*/
async function readPackageJson(files) {
if (!files['package.json']) {
async function readPackageJson(entryPath) {
const packagePath = path.join(entryPath, 'package.json');
try {
return JSON.parse(await readFile(packagePath, 'utf8'));
} catch (err) {
console.log('package.json not found in entry');
return {};
}
const packageJsonPath = files['package.json'].fsPath;
return JSON.parse(await readFile(packageJsonPath, 'utf8'));
}
/**
@@ -68,6 +69,36 @@ async function writeNpmRc(workPath, token) {
);
}
function getNextVersion(packageJson) {
let nextVersion;
if (packageJson.dependencies && packageJson.dependencies.next) {
nextVersion = packageJson.dependencies.next;
} else if (packageJson.devDependencies && packageJson.devDependencies.next) {
nextVersion = packageJson.devDependencies.next;
}
return nextVersion;
}
function isLegacyNext(nextVersion) {
// If version is using the dist-tag instead of a version range
if (nextVersion === 'canary' || nextVersion === 'latest') {
return false;
}
// If the version is an exact match with the legacy versions
if (nextLegacyVersions.indexOf(nextVersion) !== -1) {
return true;
}
const maxSatisfying = semver.maxSatisfying(nextLegacyVersions, nextVersion);
// When the version can't be matched with legacy versions, so it must be a newer version
if (maxSatisfying === null) {
return false;
}
return true;
}
exports.config = {
maxLambdaSize: '5mb',
};
@@ -81,65 +112,31 @@ exports.build = async ({ files, workPath, entrypoint }) => {
console.log('downloading user files...');
const entryDirectory = path.dirname(entrypoint);
const filesOnlyEntryDirectory = includeOnlyEntryDirectory(
files,
entryDirectory,
);
const filesWithEntryDirectoryRoot = moveEntryDirectoryToRoot(
filesOnlyEntryDirectory,
entryDirectory,
);
const filesWithoutStaticDirectory = excludeStaticDirectory(
filesWithEntryDirectoryRoot,
);
const downloadedFiles = await download(filesWithoutStaticDirectory, workPath);
await download(files, workPath);
const entryPath = path.join(workPath, entryDirectory);
const pkg = await readPackageJson(downloadedFiles);
let nextVersion;
if (pkg.dependencies && pkg.dependencies.next) {
nextVersion = pkg.dependencies.next;
} else if (pkg.devDependencies && pkg.devDependencies.next) {
nextVersion = pkg.devDependencies.next;
}
const pkg = await readPackageJson(entryPath);
const nextVersion = getNextVersion(pkg);
if (!nextVersion) {
throw new Error(
'No Next.js version could be detected in "package.json". Make sure `"next"` is installed in "dependencies" or "devDependencies"',
);
}
const isLegacy = (() => {
// If version is using the dist-tag instead of a version range
if (nextVersion === 'canary' || nextVersion === 'latest') {
return false;
}
// If the version is an exact match with the legacy versions
if (nextLegacyVersions.indexOf(nextVersion) !== -1) {
return true;
}
const maxSatisfying = semver.maxSatisfying(nextLegacyVersions, nextVersion);
// When the version can't be matched with legacy versions, so it must be a newer version
if (maxSatisfying === null) {
return false;
}
return true;
})();
const isLegacy = isLegacyNext(nextVersion);
console.log(`MODE: ${isLegacy ? 'legacy' : 'serverless'}`);
if (isLegacy) {
try {
await unlink(path.join(workPath, 'yarn.lock'));
await unlink(path.join(entryPath, 'yarn.lock'));
} catch (err) {
console.log('no yarn.lock removed');
}
try {
await unlink(path.join(workPath, 'package-lock.json'));
await unlink(path.join(entryPath, 'package-lock.json'));
} catch (err) {
console.log('no package-lock.json removed');
}
@@ -151,7 +148,7 @@ exports.build = async ({ files, workPath, entrypoint }) => {
console.log('normalizing package.json');
const packageJson = normalizePackageJson(pkg);
console.log('normalized package.json result: ', packageJson);
await writePackageJson(workPath, packageJson);
await writePackageJson(entryPath, packageJson);
} else if (!pkg.scripts || !pkg.scripts['now-build']) {
console.warn(
'WARNING: "now-build" script not found. Adding \'"now-build": "next build"\' to "package.json" automatically',
@@ -161,38 +158,38 @@ exports.build = async ({ files, workPath, entrypoint }) => {
...(pkg.scripts || {}),
};
console.log('normalized package.json result: ', pkg);
await writePackageJson(workPath, pkg);
await writePackageJson(entryPath, pkg);
}
if (process.env.NPM_AUTH_TOKEN) {
console.log('found NPM_AUTH_TOKEN in environment, creating .npmrc');
await writeNpmRc(workPath, process.env.NPM_AUTH_TOKEN);
await writeNpmRc(entryPath, process.env.NPM_AUTH_TOKEN);
}
console.log('installing dependencies...');
await runNpmInstall(workPath, ['--prefer-offline']);
await runNpmInstall(entryPath, ['--prefer-offline']);
console.log('running user script...');
await runPackageJsonScript(workPath, 'now-build');
await runPackageJsonScript(entryPath, 'now-build');
if (isLegacy) {
console.log('running npm install --production...');
await runNpmInstall(workPath, ['--prefer-offline', '--production']);
await runNpmInstall(entryPath, ['--prefer-offline', '--production']);
}
if (process.env.NPM_AUTH_TOKEN) {
await unlink(path.join(workPath, '.npmrc'));
await unlink(path.join(entryPath, '.npmrc'));
}
const lambdas = {};
if (isLegacy) {
const filesAfterBuild = await glob('**', workPath);
const filesAfterBuild = await glob('**', entryPath);
console.log('preparing lambda files...');
let buildId;
try {
buildId = await readFile(
path.join(workPath, '.next', 'BUILD_ID'),
path.join(entryPath, '.next', 'BUILD_ID'),
'utf8',
);
} catch (err) {
@@ -201,10 +198,10 @@ exports.build = async ({ files, workPath, entrypoint }) => {
);
throw new Error('Missing BUILD_ID');
}
const dotNextRootFiles = await glob('.next/*', workPath);
const dotNextServerRootFiles = await glob('.next/server/*', workPath);
const dotNextRootFiles = await glob('.next/*', entryPath);
const dotNextServerRootFiles = await glob('.next/server/*', entryPath);
const nodeModules = excludeFiles(
await glob('node_modules/**', workPath),
await glob('node_modules/**', entryPath),
file => file.startsWith('node_modules/.cache'),
);
const launcherFiles = {
@@ -221,7 +218,7 @@ exports.build = async ({ files, workPath, entrypoint }) => {
}
const pages = await glob(
'**/*.js',
path.join(workPath, '.next', 'server', 'static', buildId, 'pages'),
path.join(entryPath, '.next', 'server', 'static', buildId, 'pages'),
);
const launcherPath = path.join(__dirname, 'legacy-launcher.js');
const launcherData = await readFile(launcherPath, 'utf8');
@@ -277,7 +274,7 @@ exports.build = async ({ files, workPath, entrypoint }) => {
};
const pages = await glob(
'**/*.js',
path.join(workPath, '.next', 'serverless', 'pages'),
path.join(entryPath, '.next', 'serverless', 'pages'),
);
const pageKeys = Object.keys(pages);
@@ -288,6 +285,18 @@ exports.build = async ({ files, workPath, entrypoint }) => {
);
}
// An optional assets folder that is placed alongside every page entrypoint
const assets = await glob(
'assets/**',
path.join(entryPath, '.next', 'serverless'),
);
const assetKeys = Object.keys(assets);
if (assetKeys.length > 0) {
console.log('detected assets to be bundled with lambda:');
assetKeys.forEach(assetFile => console.log(`\t${assetFile}`));
}
await Promise.all(
pageKeys.map(async (page) => {
// These default pages don't have to be handled as they'd always 404
@@ -301,6 +310,7 @@ exports.build = async ({ files, workPath, entrypoint }) => {
lambdas[path.join(entryDirectory, pathname)] = await createLambda({
files: {
...launcherFiles,
...assets,
'page.js': pages[page],
},
handler: 'now__launcher.launcher',
@@ -313,17 +323,21 @@ exports.build = async ({ files, workPath, entrypoint }) => {
const nextStaticFiles = await glob(
'**',
path.join(workPath, '.next', 'static'),
path.join(entryPath, '.next', 'static'),
);
const staticFiles = Object.keys(nextStaticFiles).reduce(
(mappedFiles, file) => ({
...mappedFiles,
[path.join(entryDirectory, `_next/static/${file}`)]: nextStaticFiles[file],
[path.join(entryDirectory, `_next/static/${file}`)]: nextStaticFiles[
file
],
}),
{},
);
const nextStaticDirectory = onlyStaticDirectory(filesWithEntryDirectoryRoot);
const nextStaticDirectory = onlyStaticDirectory(
includeOnlyEntryDirectory(files, entryDirectory),
);
const staticDirectoryFiles = Object.keys(nextStaticDirectory).reduce(
(mappedFiles, file) => ({
...mappedFiles,
@@ -334,3 +348,39 @@ exports.build = async ({ files, workPath, entrypoint }) => {
return { ...lambdas, ...staticFiles, ...staticDirectoryFiles };
};
exports.prepareCache = async ({ cachePath, workPath, entrypoint }) => {
console.log('preparing cache ...');
const entryDirectory = path.dirname(entrypoint);
const entryPath = path.join(workPath, entryDirectory);
const cacheEntryPath = path.join(cachePath, entryDirectory);
const pkg = await readPackageJson(entryPath);
const nextVersion = getNextVersion(pkg);
const isLegacy = isLegacyNext(nextVersion);
if (isLegacy) {
// skip caching legacy mode (swapping deps between all and production can get bug-prone)
return {};
}
console.log('clearing old cache ...');
fs.removeSync(cacheEntryPath);
fs.mkdirpSync(cacheEntryPath);
console.log('copying build files for cache ...');
fs.renameSync(entryPath, cacheEntryPath);
console.log('producing cache file manifest ...');
const cacheEntrypoint = path.relative(cachePath, cacheEntryPath);
return {
...(await glob(
path.join(cacheEntrypoint, 'node_modules/{**,!.*,.yarn*}'),
cachePath,
)),
...(await glob(path.join(cacheEntrypoint, 'package-lock.json'), cachePath)),
...(await glob(path.join(cacheEntrypoint, 'yarn.lock'), cachePath)),
};
};

View File

@@ -4,10 +4,8 @@ const { Server } = require('http');
const { Bridge } = require('./now__bridge.js');
const page = require('./page.js');
const bridge = new Bridge();
bridge.port = 3000;
const server = new Server(page.render);
server.listen(bridge.port);
const bridge = new Bridge(server);
bridge.listen();
exports.launcher = bridge.launcher;

View File

@@ -3,9 +3,6 @@ const next = require('next-server');
const url = require('url');
const { Bridge } = require('./now__bridge.js');
const bridge = new Bridge();
bridge.port = 3000;
process.env.NODE_ENV = 'production';
const app = next({});
@@ -14,6 +11,8 @@ const server = new Server((req, res) => {
const parsedUrl = url.parse(req.url, true);
app.render(req, res, 'PATHNAME_PLACEHOLDER', parsedUrl.query, parsedUrl);
});
server.listen(bridge.port);
const bridge = new Bridge(server);
bridge.listen();
exports.launcher = bridge.launcher;

View File

@@ -1,6 +1,6 @@
{
"name": "@now/next",
"version": "0.0.85-canary.1",
"version": "0.0.85-canary.8",
"license": "MIT",
"repository": {
"type": "git",
@@ -8,8 +8,9 @@
"directory": "packages/now-next"
},
"dependencies": {
"@now/node-bridge": "0.1.4",
"@now/node-bridge": "1.0.0-canary.2",
"execa": "^1.0.0",
"fs-extra": "^7.0.0",
"fs.promised": "^3.0.0",
"semver": "^5.6.0"
}

View File

@@ -1,5 +1,3 @@
const rename = require('@now/build-utils/fs/rename.js'); // eslint-disable-line import/no-extraneous-dependencies
/** @typedef { import('@now/build-utils/file-ref') } FileRef */
/** @typedef { import('@now/build-utils/file-fs-ref') } FileFsRef */
/** @typedef {{[filePath: string]: FileRef|FileFsRef}} Files */
@@ -64,24 +62,6 @@ function includeOnlyEntryDirectory(files, entryDirectory) {
return excludeFiles(files, matcher);
}
/**
* Moves all files under the entry directory to the root directory
* @param {Files} files
* @param {string} entryDirectory
* @returns {Files}
*/
function moveEntryDirectoryToRoot(files, entryDirectory) {
if (entryDirectory === '.') {
return files;
}
function delegate(filePath) {
return filePath.replace(new RegExp(`^${entryDirectory}/`), '');
}
return rename(files, delegate);
}
/**
* Exclude package manager lockfiles from files
* @param {Files} files
@@ -98,19 +78,6 @@ function excludeLockFiles(files) {
return files;
}
/**
* Exclude the static directory from files
* @param {Files} files
* @returns {Files}
*/
function excludeStaticDirectory(files) {
function matcher(filePath) {
return filePath.startsWith('static');
}
return excludeFiles(files, matcher);
}
/**
* Exclude the static directory from files
* @param {Files} files
@@ -173,9 +140,7 @@ module.exports = {
excludeFiles,
validateEntrypoint,
includeOnlyEntryDirectory,
moveEntryDirectoryToRoot,
excludeLockFiles,
normalizePackageJson,
excludeStaticDirectory,
onlyStaticDirectory,
};

View File

@@ -0,0 +1,24 @@
{
"extends": ["prettier", "airbnb-base"],
"rules": {
"no-console": 0,
"import/no-unresolved": 0,
"import/no-dynamic-require": 0,
"global-require": 0
},
"overrides": [
{
"files": ["test/**"],
"rules": {
"import/no-extraneous-dependencies": 0
},
"globals": {
"describe": true,
"it": true,
"test": true,
"expect": true
}
}
]
}

1
packages/now-node-bridge/.gitignore vendored Normal file
View File

@@ -0,0 +1 @@
/bridge.*

View File

@@ -1,110 +0,0 @@
const http = require('http');
function normalizeEvent(event) {
let isApiGateway = true;
if (event.Action === 'Invoke') {
isApiGateway = false;
const invokeEvent = JSON.parse(event.body);
const {
method, path, headers, encoding,
} = invokeEvent;
let { body } = invokeEvent;
if (body) {
if (encoding === 'base64') {
body = Buffer.from(body, encoding);
} else if (encoding === undefined) {
body = Buffer.from(body);
} else {
throw new Error(`Unsupported encoding: ${encoding}`);
}
}
return {
isApiGateway, method, path, headers, body,
};
}
const {
httpMethod: method, path, headers, body,
} = event;
return {
isApiGateway, method, path, headers, body,
};
}
class Bridge {
constructor() {
this.launcher = this.launcher.bind(this);
}
launcher(event) {
// eslint-disable-next-line consistent-return
return new Promise((resolve, reject) => {
if (this.userError) {
console.error('Error while initializing entrypoint:', this.userError);
return resolve({ statusCode: 500, body: '' });
}
if (!this.port) {
return resolve({ statusCode: 504, body: '' });
}
const {
isApiGateway, method, path, headers, body,
} = normalizeEvent(event);
const opts = {
hostname: '127.0.0.1',
port: this.port,
path,
method,
headers,
};
const req = http.request(opts, (res) => {
const response = res;
const respBodyChunks = [];
response.on('data', chunk => respBodyChunks.push(Buffer.from(chunk)));
response.on('error', reject);
response.on('end', () => {
const bodyBuffer = Buffer.concat(respBodyChunks);
delete response.headers.connection;
if (isApiGateway) {
delete response.headers['content-length'];
} else
if (response.headers['content-length']) {
response.headers['content-length'] = bodyBuffer.length;
}
resolve({
statusCode: response.statusCode,
headers: response.headers,
body: bodyBuffer.toString('base64'),
encoding: 'base64',
});
});
});
req.on('error', (error) => {
setTimeout(() => {
// this lets express print the true error of why the connection was closed.
// it is probably 'Cannot set headers after they are sent to the client'
reject(error);
}, 2);
});
if (body) req.write(body);
req.end();
});
}
}
module.exports = {
Bridge,
};

View File

@@ -1,10 +1,26 @@
{
"name": "@now/node-bridge",
"version": "0.1.11-canary.0",
"version": "1.0.0-canary.2",
"license": "MIT",
"main": "./index.js",
"repository": {
"type": "git",
"url": "https://github.com/zeit/now-builders.git",
"directory": "packages/now-node-bridge"
},
"files": [
"bridge.*",
"index.js"
],
"scripts": {
"build": "tsc",
"test": "npm run build && jest",
"prepublish": "npm run build"
},
"devDependencies": {
"@types/aws-lambda": "8.10.19",
"@types/node": "11.9.4",
"jest": "24.1.0",
"typescript": "3.3.3"
}
}

View File

@@ -0,0 +1,183 @@
import { AddressInfo } from 'net';
import { APIGatewayProxyEvent } from 'aws-lambda';
import {
Server,
IncomingHttpHeaders,
OutgoingHttpHeaders,
request
} from 'http';
interface NowProxyEvent {
Action: string;
body: string;
}
export interface NowProxyRequest {
isApiGateway?: boolean;
method: string;
path: string;
headers: IncomingHttpHeaders;
body: Buffer;
}
export interface NowProxyResponse {
statusCode: number;
headers: OutgoingHttpHeaders;
body: string;
encoding: string;
}
function normalizeNowProxyEvent(event: NowProxyEvent): NowProxyRequest {
let bodyBuffer: Buffer | null;
const { method, path, headers, encoding, body } = JSON.parse(event.body);
if (body) {
if (encoding === 'base64') {
bodyBuffer = Buffer.from(body, encoding);
} else if (encoding === undefined) {
bodyBuffer = Buffer.from(body);
} else {
throw new Error(`Unsupported encoding: ${encoding}`);
}
} else {
bodyBuffer = Buffer.alloc(0);
}
return { isApiGateway: false, method, path, headers, body: bodyBuffer };
}
function normalizeAPIGatewayProxyEvent(
event: APIGatewayProxyEvent
): NowProxyRequest {
let bodyBuffer: Buffer | null;
const { httpMethod: method, path, headers, body } = event;
if (body) {
if (event.isBase64Encoded) {
bodyBuffer = Buffer.from(body, 'base64');
} else {
bodyBuffer = Buffer.from(body);
}
} else {
bodyBuffer = Buffer.alloc(0);
}
return { isApiGateway: true, method, path, headers, body: bodyBuffer };
}
function normalizeEvent(
event: NowProxyEvent | APIGatewayProxyEvent
): NowProxyRequest {
if ('Action' in event) {
if (event.Action === 'Invoke') {
return normalizeNowProxyEvent(event);
} else {
throw new Error(`Unexpected event.Action: ${event.Action}`);
}
} else {
return normalizeAPIGatewayProxyEvent(event);
}
}
export class Bridge {
private server: Server | null;
private listening: Promise<AddressInfo>;
private resolveListening: (info: AddressInfo) => void;
constructor(server?: Server) {
this.server = null;
if (server) {
this.setServer(server);
}
this.launcher = this.launcher.bind(this);
// This is just to appease TypeScript strict mode, since it doesn't
// understand that the Promise constructor is synchronous
this.resolveListening = (info: AddressInfo) => {};
this.listening = new Promise(resolve => {
this.resolveListening = resolve;
});
}
setServer(server: Server) {
this.server = server;
server.once('listening', () => {
const addr = server.address();
if (typeof addr === 'string') {
throw new Error(`Unexpected string for \`server.address()\`: ${addr}`);
} else if (!addr) {
throw new Error('`server.address()` returned `null`');
} else {
this.resolveListening(addr);
}
});
}
listen() {
if (!this.server) {
throw new Error('Server has not been set!');
}
return this.server.listen({
host: '127.0.0.1',
port: 0
});
}
async launcher(
event: NowProxyEvent | APIGatewayProxyEvent
): Promise<NowProxyResponse> {
const { port } = await this.listening;
const { isApiGateway, method, path, headers, body } = normalizeEvent(
event
);
const opts = {
hostname: '127.0.0.1',
port,
path,
method,
headers
};
// eslint-disable-next-line consistent-return
return new Promise((resolve, reject) => {
const req = request(opts, res => {
const response = res;
const respBodyChunks: Buffer[] = [];
response.on('data', chunk => respBodyChunks.push(Buffer.from(chunk)));
response.on('error', reject);
response.on('end', () => {
const bodyBuffer = Buffer.concat(respBodyChunks);
delete response.headers.connection;
if (isApiGateway) {
delete response.headers['content-length'];
} else if (response.headers['content-length']) {
response.headers['content-length'] = String(bodyBuffer.length);
}
resolve({
statusCode: response.statusCode || 200,
headers: response.headers,
body: bodyBuffer.toString('base64'),
encoding: 'base64'
});
});
});
req.on('error', error => {
setTimeout(() => {
// this lets express print the true error of why the connection was closed.
// it is probably 'Cannot set headers after they are sent to the client'
reject(error);
}, 2);
});
if (body) req.write(body);
req.end();
});
}
}
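
For reference, a minimal usage sketch (not part of the diff; it mirrors the launcher and test files elsewhere in this changeset and assumes the compiled bridge.js sits next to the snippet):

const { Server } = require('http');
const { Bridge } = require('./bridge');

// any request handler works; the Bridge only needs an http.Server instance
const server = new Server((req, res) => res.end('hello'));
const bridge = new Bridge(server);
bridge.listen(); // binds 127.0.0.1 on an ephemeral port (port 0)
exports.launcher = bridge.launcher; // exposed as the AWS Lambda handler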

View File

@@ -0,0 +1,71 @@
const assert = require('assert');
const { Server } = require('http');
const { Bridge } = require('../bridge');
test('port binding', async () => {
const server = new Server();
const bridge = new Bridge(server);
bridge.listen();
// Test port binding
const info = await bridge.listening;
assert.equal(info.address, '127.0.0.1');
assert.equal(typeof info.port, 'number');
server.close();
});
test('`APIGatewayProxyEvent` normalizing', async () => {
const server = new Server((req, res) => res.end(
JSON.stringify({
method: req.method,
path: req.url,
headers: req.headers,
}),
));
const bridge = new Bridge(server);
bridge.listen();
const result = await bridge.launcher({
httpMethod: 'GET',
headers: { foo: 'bar' },
path: '/apigateway',
body: null,
});
assert.equal(result.encoding, 'base64');
assert.equal(result.statusCode, 200);
const body = JSON.parse(Buffer.from(result.body, 'base64').toString());
assert.equal(body.method, 'GET');
assert.equal(body.path, '/apigateway');
assert.equal(body.headers.foo, 'bar');
server.close();
});
test('`NowProxyEvent` normalizing', async () => {
const server = new Server((req, res) => res.end(
JSON.stringify({
method: req.method,
path: req.url,
headers: req.headers,
}),
));
const bridge = new Bridge(server);
bridge.listen();
const result = await bridge.launcher({
Action: 'Invoke',
body: JSON.stringify({
method: 'POST',
headers: { foo: 'baz' },
path: '/nowproxy',
body: 'body=1',
}),
});
assert.equal(result.encoding, 'base64');
assert.equal(result.statusCode, 200);
const body = JSON.parse(Buffer.from(result.body, 'base64').toString());
assert.equal(body.method, 'POST');
assert.equal(body.path, '/nowproxy');
assert.equal(body.headers.foo, 'baz');
server.close();
});

View File

@@ -0,0 +1,16 @@
{
"compilerOptions": {
"target": "es6",
"module": "commonjs",
"outDir": ".",
"strict": true,
"sourceMap": true,
"declaration": true
},
"include": [
"src/**/*"
],
"exclude": [
"node_modules"
]
}

View File

@@ -48,7 +48,7 @@ async function downloadInstallAndBundle(
data: JSON.stringify({
license: 'UNLICENSED',
dependencies: {
'@zeit/ncc': '0.13.2',
'@zeit/ncc': '0.15.2',
},
}),
}),
@@ -64,7 +64,7 @@ async function downloadInstallAndBundle(
async function compile(workNccPath, downloadedFiles, entrypoint) {
const input = downloadedFiles[entrypoint].fsPath;
const ncc = require(path.join(workNccPath, 'node_modules/@zeit/ncc'));
const { code, assets } = await ncc(input);
const { code, assets } = await ncc(input, { sourceMap: true });
const preparedFiles = {};
const blob = new FileBlob({ data: code });

View File

@@ -4,22 +4,16 @@ const { Bridge } = require('./bridge.js');
const bridge = new Bridge();
const saveListen = Server.prototype.listen;
Server.prototype.listen = function listen(...args) {
this.on('listening', function listening() {
bridge.port = this.address().port;
});
saveListen.apply(this, args);
Server.prototype.listen = function listen() {
bridge.setServer(this);
Server.prototype.listen = saveListen;
return bridge.listen();
};
try {
if (!process.env.NODE_ENV) {
process.env.NODE_ENV = 'production';
}
// PLACEHOLDER
} catch (error) {
console.error(error);
bridge.userError = error;
if (!process.env.NODE_ENV) {
process.env.NODE_ENV = 'production';
}
// PLACEHOLDER
exports.launcher = bridge.launcher;
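
A minimal sketch (not part of the diff) of a hypothetical user server that this launcher loads in place of // PLACEHOLDER; because Server.prototype.listen is patched above, the first listen() call hands the Server to the Bridge instead of binding the requested port:

const { Server } = require('http');
const server = new Server((req, res) => res.end('ok'));
server.listen(3000); // argument is ignored; Bridge.listen() binds an ephemeral port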

View File

@@ -1,6 +1,6 @@
{
"name": "@now/node-server",
"version": "0.4.27-canary.2",
"version": "0.5.0-canary.3",
"license": "MIT",
"repository": {
"type": "git",
@@ -8,7 +8,7 @@
"directory": "packages/now-node-server"
},
"dependencies": {
"@now/node-bridge": "^0.1.11-canary.0",
"@now/node-bridge": "1.0.0-canary.2",
"fs-extra": "7.0.1"
},
"scripts": {

View File

@@ -7,7 +7,7 @@ const {
testDeployment,
} = require('../../../test/lib/deployment/test-deployment.js');
jest.setTimeout(2 * 60 * 1000);
jest.setTimeout(4 * 60 * 1000);
const buildUtilsUrl = '@canary';
let builderUrl;

packages/now-node/.gitignore vendored Normal file
View File

@@ -0,0 +1,2 @@
/dist
/src/bridge.d.ts

View File

@@ -1 +0,0 @@
/test

packages/now-node/build.sh Executable file
View File

@@ -0,0 +1,12 @@
#!/bin/bash
set -euo pipefail
bridge_entrypoint="$(node -p 'require.resolve("@now/node-bridge")')"
bridge_defs="$(dirname "$bridge_entrypoint")/bridge.d.ts"
if [ ! -e "$bridge_defs" ]; then
yarn install
fi
cp -v "$bridge_defs" src
tsc

View File

@@ -1,22 +0,0 @@
const { Server } = require('http');
const { Bridge } = require('./bridge.js');
const bridge = new Bridge();
bridge.port = 3000;
let listener;
try {
if (!process.env.NODE_ENV) {
process.env.NODE_ENV = 'production';
}
// PLACEHOLDER
} catch (error) {
console.error(error);
bridge.userError = error;
}
const server = new Server(listener);
server.listen(bridge.port);
exports.launcher = bridge.launcher;

View File

@@ -1,17 +1,27 @@
{
"name": "@now/node",
"version": "0.4.29-canary.2",
"version": "0.5.0-canary.5",
"license": "MIT",
"main": "./dist/index",
"repository": {
"type": "git",
"url": "https://github.com/zeit/now-builders.git",
"directory": "packages/now-node"
},
"dependencies": {
"@now/node-bridge": "^0.1.11-canary.0",
"@now/node-bridge": "1.0.0-canary.2",
"fs-extra": "7.0.1"
},
"scripts": {
"build": "./build.sh",
"test": "jest"
},
"files": [
"dist"
],
"devDependencies": {
"@types/node": "11.9.4",
"jest": "24.1.0",
"typescript": "3.3.3"
}
}

View File

@@ -1,14 +1,14 @@
const { createLambda } = require('@now/build-utils/lambda.js'); // eslint-disable-line import/no-extraneous-dependencies
const download = require('@now/build-utils/fs/download.js'); // eslint-disable-line import/no-extraneous-dependencies
const FileBlob = require('@now/build-utils/file-blob.js'); // eslint-disable-line import/no-extraneous-dependencies
const FileFsRef = require('@now/build-utils/file-fs-ref.js'); // eslint-disable-line import/no-extraneous-dependencies
const fs = require('fs-extra');
const glob = require('@now/build-utils/fs/glob.js'); // eslint-disable-line import/no-extraneous-dependencies
const path = require('path');
const {
import { join, dirname } from 'path';
import { remove, readFile } from 'fs-extra';
import * as glob from '@now/build-utils/fs/glob.js';
import * as download from '@now/build-utils/fs/download.js';
import * as FileBlob from '@now/build-utils/file-blob.js';
import * as FileFsRef from '@now/build-utils/file-fs-ref.js';
import { createLambda } from '@now/build-utils/lambda.js';
import {
runNpmInstall,
runPackageJsonScript,
} = require('@now/build-utils/fs/run-user-scripts.js'); // eslint-disable-line import/no-extraneous-dependencies
runPackageJsonScript
} from '@now/build-utils/fs/run-user-scripts.js';
/** @typedef { import('@now/build-utils/file-ref') } FileRef */
/** @typedef {{[filePath: string]: FileRef}} Files */
@@ -27,16 +27,16 @@ const {
*/
async function downloadInstallAndBundle(
{ files, entrypoint, workPath },
{ npmArguments = [] } = {},
{ npmArguments = [] } = {}
) {
const userPath = path.join(workPath, 'user');
const nccPath = path.join(workPath, 'ncc');
const userPath = join(workPath, 'user');
const nccPath = join(workPath, 'ncc');
console.log('downloading user files...');
const downloadedFiles = await download(files, userPath);
console.log("installing dependencies for user's code...");
const entrypointFsDirname = path.join(userPath, path.dirname(entrypoint));
const entrypointFsDirname = join(userPath, dirname(entrypoint));
await runNpmInstall(entrypointFsDirname, npmArguments);
console.log('writing ncc package.json...');
@@ -46,12 +46,12 @@ async function downloadInstallAndBundle(
data: JSON.stringify({
license: 'UNLICENSED',
dependencies: {
'@zeit/ncc': '0.13.2',
},
}),
}),
'@zeit/ncc': '0.15.2',
}
})
})
},
nccPath,
nccPath
);
console.log('installing dependencies for ncc...');
@@ -59,43 +59,41 @@ async function downloadInstallAndBundle(
return [downloadedFiles, nccPath, entrypointFsDirname];
}
async function compile(workNccPath, downloadedFiles, entrypoint) {
async function compile(workNccPath: string, downloadedFiles, entrypoint: string) {
const input = downloadedFiles[entrypoint].fsPath;
const ncc = require(path.join(workNccPath, 'node_modules/@zeit/ncc'));
const ncc = require(join(workNccPath, 'node_modules/@zeit/ncc'));
const { code, assets } = await ncc(input);
const preparedFiles = {};
const blob = new FileBlob({ data: code });
// move all user code to 'user' subdirectory
preparedFiles[path.join('user', entrypoint)] = blob;
preparedFiles[join('user', entrypoint)] = blob;
// eslint-disable-next-line no-restricted-syntax
for (const assetName of Object.keys(assets)) {
const { source: data, permissions: mode } = assets[assetName];
const blob2 = new FileBlob({ data, mode });
preparedFiles[
path.join('user', path.dirname(entrypoint), assetName)
] = blob2;
preparedFiles[join('user', dirname(entrypoint), assetName)] = blob2;
}
return preparedFiles;
}
exports.config = {
maxLambdaSize: '5mb',
export const config = {
maxLambdaSize: '5mb'
};
/**
* @param {BuildParamsType} buildParams
* @returns {Promise<Files>}
*/
exports.build = async ({ files, entrypoint, workPath }) => {
export async function build({ files, entrypoint, workPath }) {
const [
downloadedFiles,
workNccPath,
entrypointFsDirname,
entrypointFsDirname
] = await downloadInstallAndBundle(
{ files, entrypoint, workPath },
{ npmArguments: ['--prefer-offline'] },
{ npmArguments: ['--prefer-offline'] }
);
console.log('running user script...');
@@ -103,36 +101,34 @@ exports.build = async ({ files, entrypoint, workPath }) => {
console.log('compiling entrypoint with ncc...');
const preparedFiles = await compile(workNccPath, downloadedFiles, entrypoint);
const launcherPath = path.join(__dirname, 'launcher.js');
let launcherData = await fs.readFile(launcherPath, 'utf8');
const launcherPath = join(__dirname, 'launcher.js');
let launcherData = await readFile(launcherPath, 'utf8');
launcherData = launcherData.replace(
'// PLACEHOLDER',
[
'process.chdir("./user");',
`listener = require("./${path.join('user', entrypoint)}");`,
'if (listener.default) listener = listener.default;',
].join(' '),
`listener = require("./${join('user', entrypoint)}");`,
'if (listener.default) listener = listener.default;'
].join(' ')
);
const launcherFiles = {
'launcher.js': new FileBlob({ data: launcherData }),
'bridge.js': new FileFsRef({ fsPath: require('@now/node-bridge') }),
'bridge.js': new FileFsRef({ fsPath: require('@now/node-bridge') })
};
const lambda = await createLambda({
files: { ...preparedFiles, ...launcherFiles },
handler: 'launcher.launcher',
runtime: 'nodejs8.10',
runtime: 'nodejs8.10'
});
return { [entrypoint]: lambda };
};
}
exports.prepareCache = async ({
files, entrypoint, workPath, cachePath,
}) => {
await fs.remove(workPath);
export async function prepareCache({ files, entrypoint, workPath, cachePath }) {
await remove(workPath);
await downloadInstallAndBundle({ files, entrypoint, workPath: cachePath });
return {
@@ -141,6 +137,6 @@ exports.prepareCache = async ({
...(await glob('user/yarn.lock', cachePath)),
...(await glob('ncc/node_modules/**', cachePath)),
...(await glob('ncc/package-lock.json', cachePath)),
...(await glob('ncc/yarn.lock', cachePath)),
...(await glob('ncc/yarn.lock', cachePath))
};
};
}

View File

@@ -0,0 +1,16 @@
import { Server } from 'http';
import { Bridge } from './bridge';
let listener;
if (!process.env.NODE_ENV) {
process.env.NODE_ENV = 'production';
}
// PLACEHOLDER
const server = new Server(listener);
const bridge = new Bridge(server);
bridge.listen();
exports.launcher = bridge.launcher;
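
For a hypothetical entrypoint of index.js, the builder (see the @now/node index diff above) replaces the // PLACEHOLDER line with roughly the following, joined onto a single line:

process.chdir('./user');
listener = require('./user/index.js');
if (listener.default) listener = listener.default;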

View File

@@ -7,7 +7,7 @@ const {
testDeployment,
} = require('../../../test/lib/deployment/test-deployment.js');
jest.setTimeout(2 * 60 * 1000);
jest.setTimeout(4 * 60 * 1000);
const buildUtilsUrl = '@canary';
let builderUrl;

View File

@@ -0,0 +1,15 @@
{
"compilerOptions": {
"target": "es6",
"module": "commonjs",
"outDir": "dist",
"sourceMap": false,
"declaration": false
},
"include": [
"src/**/*"
],
"exclude": [
"node_modules"
]
}

View File

@@ -1,6 +1,6 @@
{
"name": "@now/php",
"version": "0.4.14-canary.0",
"version": "0.4.14-canary.1",
"license": "MIT",
"repository": {
"type": "git",

View File

@@ -7,7 +7,7 @@ const {
testDeployment,
} = require('../../../test/lib/deployment/test-deployment.js');
jest.setTimeout(2 * 60 * 1000);
jest.setTimeout(4 * 60 * 1000);
const buildUtilsUrl = '@canary';
let builderUrl;

View File

@@ -40,8 +40,16 @@ exports.build = async ({ files, entrypoint }) => {
await pipInstall(pipPath, srcDir, 'requests');
if (files['requirements.txt']) {
console.log('found "requirements.txt"');
const entryDirectory = path.dirname(entrypoint);
const requirementsTxt = path.join(entryDirectory, 'requirements.txt');
if (files[requirementsTxt]) {
console.log('found local "requirements.txt"');
const requirementsTxtPath = files[requirementsTxt].fsPath;
await pipInstall(pipPath, srcDir, '-r', requirementsTxtPath);
} else if (files['requirements.txt']) {
console.log('found global "requirements.txt"');
const requirementsTxtPath = files['requirements.txt'].fsPath;
await pipInstall(pipPath, srcDir, '-r', requirementsTxtPath);

View File

@@ -1,23 +1,35 @@
import base64
from http.server import HTTPServer
import json
import requests
from __NOW_HANDLER_FILENAME import handler
import _thread
server = HTTPServer(('', 3000), handler)
def now_handler(event, context):
_thread.start_new_thread(server.handle_request, ())
payload = json.loads(event['body'])
path = payload['path']
headers = payload['headers']
method = payload['method']
res = requests.request(method, 'http://0.0.0.0:3000' + path, headers=headers)
encoding = payload.get('encoding')
body = payload.get('body')
if (
(body is not None and len(body) > 0) and
(encoding is not None and encoding == 'base64')
):
body = base64.b64decode(body)
res = requests.request(method, 'http://0.0.0.0:3000' + path,
headers=headers, data=body, allow_redirects=False)
return {
'statusCode': res.status_code,
'headers': dict(res.headers),
'body': res.text
'body': res.text,
}

View File

@@ -1,6 +1,6 @@
{
"name": "@now/python",
"version": "0.0.41-canary.0",
"version": "0.0.41-canary.2",
"main": "index.js",
"license": "MIT",
"repository": {

View File

@@ -448,7 +448,7 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]]
name = "now_lambda"
version = "0.1.0"
version = "0.1.2"
dependencies = [
"base64 0.10.1 (registry+https://github.com/rust-lang/crates.io-index)",
"http 0.1.15 (registry+https://github.com/rust-lang/crates.io-index)",

View File

@@ -1,8 +1,18 @@
[package]
name = "now_lambda"
version = "0.1.0"
version = "0.1.2"
authors = ["Antonio Nuno Monteiro <anmonteiro@gmail.com>"]
edition = "2018"
description = "Rust bindings for Now.sh Lambdas"
keywords = ["AWS", "Lambda", "Zeit", "Now", "Rust"]
license = "MIT"
homepage = "https://github.com/zeit/now-builders"
repository = "https://github.com/zeit/now-builders"
documentation = "https://docs.rs/now_lambda"
include = [
"src/*.rs",
"Cargo.toml"
]
[dependencies]
serde = "^1"
@@ -12,4 +22,4 @@ http = "0.1"
tokio = "^0.1"
base64 = "0.10"
log = "^0.4"
lambda_runtime = "0.2.0"
lambda_runtime = "0.2.0"

View File

@@ -1,5 +1,6 @@
const tar = require('tar');
const fetch = require('node-fetch');
const execa = require('execa');
const rustUrl = 'https://dmmcy0pwk6bqi.cloudfront.net/rust.tar.gz';
const ccUrl = 'https://dmmcy0pwk6bqi.cloudfront.net/gcc-4.8.5.tgz';
@@ -53,10 +54,31 @@ async function downloadGCC() {
});
}
async function installOpenSSL() {
console.log('installing openssl-devel...');
try {
// need to downgrade, otherwise yum can't resolve the dependencies given
// that a later version is already installed on the machine.
await execa(
'yum',
['downgrade', '-y', 'krb5-libs-1.14.1-27.41.amzn1.x86_64'],
{
stdio: 'inherit',
},
);
await execa('yum', ['install', '-y', 'openssl-devel'], {
stdio: 'inherit',
});
} catch (err) {
console.error('failed to `yum install -y openssl-devel`');
throw err;
}
}
module.exports = async () => {
await downloadRustToolchain();
const newEnv = await downloadGCC();
await installOpenSSL();
return newEnv;
};

View File

@@ -1,66 +1,72 @@
const fs = require('fs');
const fs = require('fs-extra');
const path = require('path');
const concat = require('concat-stream');
const execa = require('execa');
const toml = require('toml');
const rimraf = require('rimraf');
const toml = require('@iarna/toml');
const { createLambda } = require('@now/build-utils/lambda.js'); // eslint-disable-line import/no-extraneous-dependencies
const download = require('@now/build-utils/fs/download.js'); // eslint-disable-line import/no-extraneous-dependencies
const glob = require('@now/build-utils/fs/glob.js'); // eslint-disable-line import/no-extraneous-dependencies
const FileFsRef = require('@now/build-utils/file-fs-ref.js'); // eslint-disable-line import/no-extraneous-dependencies
const installRustAndGCC = require('./download-install-rust-toolchain.js');
const inferCargoBinaries = require('./inferCargoBinaries.js');
exports.config = {
maxLambdaSize: '25mb',
};
async function parseTOMLStream(stream) {
return new Promise((resolve) => {
stream.pipe(concat(data => resolve(toml.parse(data))));
});
}
exports.build = async ({ files, entrypoint, workPath }) => {
console.log('downloading files');
const downloadedFiles = await download(files, workPath);
const { PATH: toolchainPath, ...otherEnv } = await installRustAndGCC();
const { PATH, HOME } = process.env;
const rustEnv = {
...process.env,
...otherEnv,
PATH: `${path.join(HOME, '.cargo/bin')}:${toolchainPath}:${PATH}`,
};
let cargoToml;
async function inferCargoBinaries(config) {
try {
cargoToml = await parseTOMLStream(files[entrypoint].toStream());
const { stdout: manifestStr } = await execa(
'cargo',
['read-manifest'],
config,
);
const { targets } = JSON.parse(manifestStr);
return targets
.filter(({ kind }) => kind.includes('bin'))
.map(({ name }) => name);
} catch (err) {
console.error('Failed to parse TOML from entrypoint:', entrypoint);
console.error('failed to run `cargo read-manifest`');
throw err;
}
}
async function parseTOMLStream(stream) {
return toml.parse.stream(stream);
}
async function buildWholeProject({
entrypoint,
downloadedFiles,
rustEnv,
config,
}) {
const entrypointDirname = path.dirname(downloadedFiles[entrypoint].fsPath);
console.log('running `cargo build --release`...');
const { debug } = config;
console.log('running `cargo build`...');
try {
await execa('cargo', ['build', '--release'], {
await execa('cargo', ['build'].concat(debug ? [] : ['--release']), {
env: rustEnv,
cwd: entrypointDirname,
stdio: 'inherit',
});
} catch (err) {
console.error('failed to `cargo build --release`');
console.error('failed to `cargo build`');
throw err;
}
const targetPath = path.join(workPath, 'target', 'release');
const binaries = await inferCargoBinaries(
cargoToml,
path.join(workPath, 'src'),
const targetPath = path.join(
entrypointDirname,
'target',
debug ? 'debug' : 'release',
);
const binaries = await inferCargoBinaries({
env: rustEnv,
cwd: entrypointDirname,
});
const lambdas = {};
const lambdaPath = path.dirname(entrypoint);
await Promise.all(
binaries.map(async (binary) => {
const fsPath = path.join(targetPath, binary);
@@ -72,19 +78,193 @@ exports.build = async ({ files, entrypoint, workPath }) => {
runtime: 'provided',
});
lambdas[binary] = lambda;
lambdas[path.join(lambdaPath, binary)] = lambda;
}),
);
return lambdas;
};
}
exports.prepareCache = async ({ cachePath, workPath }) => {
console.log('preparing cache...');
rimraf.sync(path.join(cachePath, 'target'));
fs.renameSync(path.join(workPath, 'target'), path.join(cachePath, 'target'));
async function cargoLocateProject(config) {
try {
const { stdout: projectDescriptionStr } = await execa(
'cargo',
['locate-project'],
config,
);
const projectDescription = JSON.parse(projectDescriptionStr);
if (projectDescription != null && projectDescription.root != null) {
return projectDescription.root;
}
} catch (e) {
if (!/could not find/g.test(e.stderr)) {
console.error("Couldn't run `cargo locate-project`");
throw e;
}
}
return null;
}
async function buildSingleFile({
workPath,
entrypoint,
downloadedFiles,
rustEnv,
config,
}) {
console.log('building single file');
const launcherPath = path.join(__dirname, 'launcher.rs');
let launcherData = await fs.readFile(launcherPath, 'utf8');
const entrypointPath = downloadedFiles[entrypoint].fsPath;
const entrypointDirname = path.dirname(entrypointPath);
launcherData = launcherData.replace(
'// PLACEHOLDER',
await fs.readFile(path.join(workPath, entrypoint)),
);
// replace the entrypoint with one that includes the imports + lambda.start
await fs.remove(entrypointPath);
await fs.writeFile(entrypointPath, launcherData);
// Find a Cargo.toml file or TODO: create one
const cargoTomlFile = await cargoLocateProject({
env: rustEnv,
cwd: entrypointDirname,
});
// TODO: we're assuming there's a Cargo.toml file. We need to create one
// otherwise
let cargoToml;
try {
cargoToml = await parseTOMLStream(fs.createReadStream(cargoTomlFile));
} catch (err) {
console.error('Failed to parse TOML from entrypoint:', entrypoint);
throw err;
}
const binName = path
.basename(entrypointPath)
.replace(path.extname(entrypointPath), '');
const { package: pkg, dependencies } = cargoToml;
// default to latest now_lambda
dependencies.now_lambda = '*';
const tomlToWrite = toml.stringify({
package: pkg,
dependencies,
bin: [
{
name: binName,
path: entrypointPath,
},
],
});
console.log('toml to write:', tomlToWrite);
// Overwrite the Cargo.toml file with one that includes the `now_lambda`
// dependency and our binary. `dependencies` is a map so we don't run the
// risk of having 2 `now_lambda`s in there.
await fs.writeFile(cargoTomlFile, tomlToWrite);
const { debug } = config;
console.log('running `cargo build`...');
try {
await execa(
'cargo',
['build', '--bin', binName].concat(debug ? [] : ['--release']),
{
env: rustEnv,
cwd: entrypointDirname,
stdio: 'inherit',
},
);
} catch (err) {
console.error('failed to `cargo build`');
throw err;
}
const bin = path.join(
path.dirname(cargoTomlFile),
'target',
debug ? 'debug' : 'release',
binName,
);
const lambda = await createLambda({
files: {
bootstrap: new FileFsRef({ mode: 0o755, fsPath: bin }),
},
handler: 'bootstrap',
runtime: 'provided',
});
return {
...(await glob('target/**', path.join(cachePath))),
[entrypoint]: lambda,
};
}
exports.build = async (m) => {
const { files, entrypoint, workPath } = m;
console.log('downloading files');
const downloadedFiles = await download(files, workPath);
const { PATH: toolchainPath, ...otherEnv } = await installRustAndGCC();
const { PATH, HOME } = process.env;
const rustEnv = {
...process.env,
...otherEnv,
PATH: `${path.join(HOME, '.cargo/bin')}:${toolchainPath}:${PATH}`,
};
const newM = Object.assign(m, { downloadedFiles, rustEnv });
if (path.extname(entrypoint) === '.toml') {
return buildWholeProject(newM);
}
return buildSingleFile(newM);
};
exports.prepareCache = async ({ cachePath, entrypoint, workPath }) => {
console.log('preparing cache...');
let targetFolderDir;
if (path.extname(entrypoint) === '.toml') {
targetFolderDir = path.dirname(path.join(workPath, entrypoint));
} else {
const { PATH, HOME } = process.env;
const rustEnv = {
...process.env,
PATH: `${path.join(HOME, '.cargo/bin')}:${PATH}`,
};
const entrypointDirname = path.dirname(path.join(workPath, entrypoint));
const cargoTomlFile = await cargoLocateProject({
env: rustEnv,
cwd: entrypointDirname,
});
if (cargoTomlFile != null) {
targetFolderDir = path.dirname(cargoTomlFile);
} else {
// `Cargo.toml` doesn't exist; in `build` we put it in the same
// path as the entrypoint.
targetFolderDir = path.dirname(path.join(workPath, entrypoint));
}
}
const cacheEntrypointDirname = path.join(
cachePath,
path.relative(workPath, targetFolderDir),
);
// Remove the target folder to avoid 'directory already exists' errors
fs.removeSync(path.join(cacheEntrypointDirname, 'target'));
fs.mkdirpSync(cacheEntrypointDirname);
// Move the target folder to the cache location
fs.renameSync(
path.join(targetFolderDir, 'target'),
path.join(cacheEntrypointDirname, 'target'),
);
return {
...(await glob('**/**', path.join(cachePath))),
};
};
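
Illustration only (hypothetical crate name and paths): the object that buildSingleFile hands to toml.stringify() for an entrypoint of api/index.rs, matching the package/dependencies/bin shape written above:

const toml = require('@iarna/toml');

const tomlToWrite = toml.stringify({
  package: { name: 'my-crate', version: '0.1.0', edition: '2018' },
  dependencies: { now_lambda: '*' }, // '*' per the builder's default
  bin: [{ name: 'index', path: '/tmp/workdir/api/index.rs' }],
});
console.log(tomlToWrite); // renders [package], [dependencies] and [[bin]] tables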

View File

@@ -1,73 +0,0 @@
const fs = require('fs');
const path = require('path');
function readdir(dir) {
return new Promise((resolve, reject) => {
fs.readdir(dir, (err, files) => {
if (err != null) {
return reject(err);
}
return resolve(files);
});
});
}
function exists(p) {
return new Promise((resolve, reject) => {
fs.exists(p, (err, res) => {
if (err != null) {
return reject(err);
}
return resolve(res);
});
});
}
function stat(p) {
return new Promise((resolve, reject) => {
fs.stat(p, (err, stats) => {
if (err != null) {
return reject(err);
}
return resolve(stats);
});
});
}
async function inferCargoBinaries(cargoToml, srcDir) {
const { package: pkg, bin } = cargoToml;
const binaries = [];
const hasMain = (await readdir(srcDir)).includes('main.rs');
if (hasMain) {
binaries.push(pkg.name);
}
// From: https://doc.rust-lang.org/cargo/reference/manifest.html#the-project-layout
// Do note, however, once you add a [[bin]] section (see below), Cargo will
// no longer automatically build files located in src/bin/*.rs. Instead you
// must create a [[bin]] section for each file you want to build.
if (Array.isArray(bin)) {
bin.forEach((binary) => {
binaries.push(binary.name);
});
} else {
const binDir = path.join(srcDir, 'bin');
const filesInSrcBin = (await exists(binDir)) && (await stat(binDir)).isDirectory()
? await readdir(binDir)
: [];
filesInSrcBin.forEach((file) => {
if (file.endsWith('.rs')) {
binaries.push(file.slice(0, -3));
}
});
}
return binaries;
}
module.exports = inferCargoBinaries;

View File

@@ -0,0 +1,8 @@
use now_lambda::lambda;
use std::error::Error;
// PLACEHOLDER
fn main() -> Result<(), Box<dyn Error>> {
Ok(lambda!(handler))
}

View File

@@ -1,6 +1,6 @@
{
"name": "@now/rust",
"version": "0.0.2-canary.0",
"version": "0.0.3-canary.2",
"license": "MIT",
"repository": {
"type": "git",
@@ -10,14 +10,13 @@
"files": [
"index.js",
"download-install-rust-toolchain.js",
"inferCargoBinaries.js"
"launcher.rs"
],
"dependencies": {
"concat-stream": "^2.0.0",
"@iarna/toml": "^2.2.1",
"execa": "^1.0.0",
"fs-extra": "^7.0.1",
"node-fetch": "^2.3.0",
"rimraf": "^2.6.3",
"tar": "^4.4.8",
"toml": "^2.3.3"
"tar": "^4.4.8"
}
}

View File

@@ -1,6 +1,6 @@
//! Provides a Now Lambda oriented request and response body entity interface
use std::{borrow::Cow, ops::Deref};
use std::{borrow::Cow, ops::Deref, str};
use base64::display::Base64Display;
use serde::ser::{Error as SerError, Serialize, Serializer};
@@ -76,6 +76,12 @@ impl From<()> for Body {
}
}
impl From<Body> for () {
fn from(_: Body) -> Self {
()
}
}
impl<'a> From<&'a str> for Body {
fn from(s: &'a str) -> Self {
Body::Text(s.into())
@@ -87,6 +93,15 @@ impl From<String> for Body {
Body::Text(b)
}
}
impl From<Body> for String {
fn from(b: Body) -> String {
match b {
Body::Empty => String::from(""),
Body::Text(t) => t,
Body::Binary(b) => str::from_utf8(&b).unwrap().to_owned(),
}
}
}
impl From<Cow<'static, str>> for Body {
#[inline]
@@ -98,6 +113,13 @@ impl From<Cow<'static, str>> for Body {
}
}
impl From<Body> for Cow<'static, str> {
#[inline]
fn from(b: Body) -> Cow<'static, str> {
Cow::Owned(String::from(b))
}
}
impl From<Cow<'static, [u8]>> for Body {
#[inline]
fn from(cow: Cow<'static, [u8]>) -> Body {
@@ -108,12 +130,29 @@ impl From<Cow<'static, [u8]>> for Body {
}
}
impl From<Body> for Cow<'static, [u8]> {
#[inline]
fn from(b: Body) -> Self {
Cow::Owned(b.as_ref().to_owned())
}
}
impl From<Vec<u8>> for Body {
fn from(b: Vec<u8>) -> Self {
Body::Binary(b)
}
}
impl From<Body> for Vec<u8> {
fn from(b: Body) -> Self {
match b {
Body::Empty => "".as_bytes().to_owned(),
Body::Text(t) => t.into_bytes(),
Body::Binary(b) => b.to_owned(),
}
}
}
impl<'a> From<&'a [u8]> for Body {
fn from(b: &'a [u8]) -> Self {
Body::Binary(b.to_vec())

View File

@@ -1,3 +1,4 @@
use http;
use lambda_runtime::error::LambdaErrorExt;
use std::{error::Error, fmt};
@@ -19,13 +20,21 @@ impl fmt::Display for NowError {
write!(f, "{}", self.msg)
}
}
impl Error for NowError {}
impl From<std::num::ParseIntError> for NowError {
fn from(i: std::num::ParseIntError) -> Self {
NowError::new(&format!("{}", i))
}
}
impl From<http::Error> for NowError {
fn from(i: http::Error) -> Self {
NowError::new(&format!("{}", i))
}
}
// the value returned by the error_type function is included as the
// `errorType` in the AWS Lambda response
impl LambdaErrorExt for NowError {

View File

@@ -21,16 +21,16 @@ use crate::{
pub type Request = http::Request<Body>;
/// Functions acting as Now Lambda handlers must conform to this type.
pub trait Handler<R> {
pub trait Handler<R, B, E> {
/// Method to execute the handler function
fn run(&mut self, event: Request) -> Result<R, NowError>;
fn run(&mut self, event: http::Request<B>) -> Result<R, E>;
}
impl<Function, R> Handler<R> for Function
impl<Function, R, B, E> Handler<R, B, E> for Function
where
Function: FnMut(Request) -> Result<R, NowError>,
Function: FnMut(http::Request<B>) -> Result<R, E>,
{
fn run(&mut self, event: Request) -> Result<R, NowError> {
fn run(&mut self, event: http::Request<B>) -> Result<R, E> {
(*self)(event)
}
}
@@ -43,8 +43,10 @@ where
///
/// # Panics
/// The function panics if the Lambda environment variables are not set.
pub fn start<R>(f: impl Handler<R>, runtime: Option<TokioRuntime>)
pub fn start<R, B, E>(f: impl Handler<R, B, E>, runtime: Option<TokioRuntime>)
where
B: From<Body>,
E: Into<NowError>,
R: IntoResponse,
{
// handler requires a mutable ref
@@ -56,8 +58,10 @@ where
match parse_result {
Ok(req) => {
debug!("Deserialized Now proxy request successfully");
func.run(req.into())
let request: http::Request<Body> = req.into();
func.run(request.map(|b| b.into()))
.map(|resp| NowResponse::from(resp.into_response()))
.map_err(|e| e.into())
}
Err(e) => {
error!("Could not deserialize event body to NowRequest {}", e);

View File

@@ -1,13 +1,8 @@
use std::{borrow::Cow, fmt, mem};
use http::{self, header::HeaderValue, HeaderMap, Method, Request as HttpRequest};
use serde::{
de::{Error as DeError, MapAccess, Visitor},
Deserialize, Deserializer,
};
use serde::de::{Deserializer, Error as DeError, MapAccess, Visitor};
use serde_derive::Deserialize;
#[allow(unused_imports)]
use serde_json::Value;
use crate::body::Body;
@@ -92,18 +87,6 @@ where
deserializer.deserialize_map(HeaderVisitor)
}
/// deserializes (json) null values to their default values
// https://github.com/serde-rs/serde/issues/1098
#[allow(dead_code)]
fn nullable_default<'de, T, D>(deserializer: D) -> Result<T, D::Error>
where
D: Deserializer<'de>,
T: Default + Deserialize<'de>,
{
let opt = Option::deserialize(deserializer)?;
Ok(opt.unwrap_or_else(T::default))
}
impl<'a> From<NowRequest<'a>> for HttpRequest<Body> {
fn from(value: NowRequest<'_>) -> Self {
let NowRequest {

View File

@@ -4,10 +4,7 @@ use http::{
header::{HeaderMap, HeaderValue},
Response,
};
use serde::{
ser::{Error as SerError, SerializeMap},
Serializer,
};
use serde::ser::{Error as SerError, SerializeMap, Serializer};
use serde_derive::Serialize;
use crate::body::Body;

View File

@@ -4,10 +4,7 @@ use std::{
sync::Arc,
};
use serde::{
de::{MapAccess, Visitor},
Deserialize, Deserializer,
};
use serde::de::{Deserialize, Deserializer, MapAccess, Visitor};
/// A read-only view into a map of string data
#[derive(Default, Debug, PartialEq)]

View File

@@ -1,154 +0,0 @@
/* global afterAll, beforeAll, describe, expect, it, jest */
const fs = require('fs');
const inferCargoBinaries = require('../inferCargoBinaries');
const { exists, readdir, stat } = fs;
const isDir = fs.Stats.prototype.isDirectory;
beforeAll(() => {
fs.exists = jest.fn((p, cb) => cb(null, false));
});
afterAll(() => {
fs.readdir = readdir;
fs.stat = stat;
fs.Stats.prototype.isDirectory = isDir;
fs.exists = exists;
});
// src/
// |- main.rs
describe('one binary, src/main.rs', async () => {
beforeAll(() => {
fs.readdir = jest.fn((p, cb) => cb(null, ['main.rs']));
});
it('infers only one binary', async () => {
const toml = {
package: {
name: 'foo',
},
};
expect(inferCargoBinaries(toml, '/path/to/src')).resolves.toEqual(['foo']);
});
});
// [[bin]] sections in `Cargo.toml`
// `main.rs` -> `package.name`
// `bar.rs` -> `bin.name`
// src/
// |- bar.rs
// |- main.rs
describe('two binaries, src/main.rs, src/bar.rs', async () => {
beforeAll(() => {
fs.readdir = jest.fn((p, cb) => cb(null, ['main.rs', 'bar.rs']));
});
it('infers two binaries', async () => {
const toml = {
package: {
name: 'foo',
},
bin: [{ name: 'bar', path: 'src/bar.rs' }],
};
expect((await inferCargoBinaries(toml, '/path/to/src')).sort()).toEqual([
'bar',
'foo',
]);
});
});
// no main.rs
// src/
// |- foo.rs
describe('one named binary, no main.rs', async () => {
beforeAll(() => {
fs.readdir = jest.fn((p, cb) => cb(null, ['bar.rs']));
});
it('infers only one binary', async () => {
const toml = {
package: {
name: 'foo',
},
bin: [{ name: 'bar', path: 'src/bar.rs' }],
};
expect((await inferCargoBinaries(toml, '/path/to/src')).sort()).toEqual([
'bar',
]);
});
});
// `src/bin` folder
// src/
// |- bin/
// | |- bar.rs
// | |- baz.rs
// |- main.rs
describe('multiple binaries in bin/, no [[bin]] section', async () => {
beforeAll(() => {
fs.readdir = jest.fn((p, cb) => {
if (p === '/path/to/src') {
return cb(null, ['bin', 'main.rs']);
}
if (p === '/path/to/src/bin') {
return cb(null, ['bar.rs', 'baz.rs']);
}
return cb('some error');
});
fs.exists = jest.fn((p, cb) => cb(null, p.endsWith('bin')));
fs.stat = jest.fn((_, cb) => cb(null, new fs.Stats()));
fs.Stats.prototype.isDirectory = jest.fn(() => true);
});
it('infers three binaries', async () => {
const toml = {
package: {
name: 'foo',
},
};
expect((await inferCargoBinaries(toml, '/path/to/src')).sort()).toEqual([
'bar',
'baz',
'foo',
]);
});
});
// `src/bin` folder, bin sections ignore baz.rs
// src/
// |- bin/
// | |- bar.rs
// | |- baz.rs
// |- main.rs
describe('src/bin exists but one binary is ignored', async () => {
beforeAll(() => {
fs.readdir = jest.fn((p, cb) => {
if (p === '/path/to/src') {
return cb(null, ['bin', 'main.rs']);
}
if (p === '/path/to/src/bin') {
return cb(null, ['bar.rs', 'baz.rs']);
}
return cb('some error');
});
fs.exists = jest.fn((p, cb) => cb(null, p.endsWith('bin')));
fs.stat = jest.fn((_, cb) => cb(null, new fs.Stats()));
fs.Stats.prototype.isDirectory = jest.fn(() => true);
});
it('infers only one binary', async () => {
const toml = {
package: {
name: 'foo',
},
bin: [{ name: 'bar', path: 'src/bar.rs' }],
};
expect((await inferCargoBinaries(toml, '/path/to/src')).sort()).toEqual([
'bar',
'foo',
]);
});
});

View File

@@ -1,12 +1,22 @@
const download = require('@now/build-utils/fs/download.js'); // eslint-disable-line import/no-extraneous-dependencies
const glob = require('@now/build-utils/fs/glob.js'); // eslint-disable-line import/no-extraneous-dependencies
const path = require('path');
const { existsSync } = require('fs');
const {
runNpmInstall,
runPackageJsonScript,
runShellScript,
} = require('@now/build-utils/fs/run-user-scripts.js'); // eslint-disable-line import/no-extraneous-dependencies
function validateDistDir(distDir) {
const distDirName = path.basename(distDir);
if (!existsSync(distDir)) {
const message = `Build was unable to create the distDir: ${distDirName}.`
+ '\nMake sure you mentioned the correct dist directory: https://zeit.co/docs/v2/deployments/official-builders/static-build-now-static-build/#configuring-the-dist-directory';
throw new Error(message);
}
}
exports.build = async ({
files, entrypoint, workPath, config,
}) => {
@@ -24,6 +34,7 @@ exports.build = async ({
if (path.basename(entrypoint) === 'package.json') {
await runNpmInstall(entrypointFsDirname, ['--prefer-offline']);
if (await runPackageJsonScript(entrypointFsDirname, 'now-build')) {
validateDistDir(distPath);
return glob('**', distPath, mountpoint);
}
throw new Error(`An error running "now-build" script in "${entrypoint}"`);
@@ -31,6 +42,7 @@ exports.build = async ({
if (path.extname(entrypoint) === '.sh') {
await runShellScript(path.join(workPath, entrypoint));
validateDistDir(distPath);
return glob('**', distPath, mountpoint);
}

View File

@@ -1,6 +1,6 @@
{
"name": "@now/static-build",
"version": "0.4.18-canary.0",
"version": "0.4.19-canary.1",
"license": "MIT",
"repository": {
"type": "git",

View File

@@ -0,0 +1,6 @@
{
"version": 2,
"builds": [
{ "src": "package.json", "use": "@now/static-build", "config": {"distDir": "out"} }
]
}

View File

@@ -0,0 +1,8 @@
{
"dependencies": {
"cowsay": "^1.3.1"
},
"scripts": {
"now-build": "mkdir dist && cowsay cow:RANDOMNESS_PLACEHOLDER > dist/index.txt"
}
}

View File

@@ -0,0 +1,8 @@
{
"dependencies": {
"yodasay": "^1.1.6"
},
"scripts": {
"now-build": "mkdir dist && yodasay yoda:RANDOMNESS_PLACEHOLDER > dist/index.txt"
}
}

View File

@@ -7,7 +7,7 @@ const {
testDeployment,
} = require('../../../test/lib/deployment/test-deployment.js');
jest.setTimeout(2 * 60 * 1000);
jest.setTimeout(4 * 60 * 1000);
const buildUtilsUrl = '@canary';
let builderUrl;
@@ -21,6 +21,21 @@ const fixturesPath = path.resolve(__dirname, 'fixtures');
// eslint-disable-next-line no-restricted-syntax
for (const fixture of fs.readdirSync(fixturesPath)) {
if (fixture === '04-wrong-dist-dir') {
// eslint-disable-next-line no-loop-func
it(`should not build ${fixture}`, async () => {
try {
await testDeployment(
{ builderUrl, buildUtilsUrl },
path.join(fixturesPath, fixture),
);
} catch (err) {
expect(err.message).toMatch(/is ERROR/);
}
});
continue; //eslint-disable-line
}
// eslint-disable-next-line no-loop-func
it(`should build ${fixture}`, async () => {
await expect(

View File

@@ -0,0 +1 @@
module.exports = () => 'Hello!';

View File

@@ -1,9 +1,9 @@
{
"name": "monorepo",
"dependencies": {
"next": "^8.0.0-canary.2",
"react": "^16.7.0",
"react-dom": "^16.7.0"
"next": "^8.0.0",
"react": "^16.8.0",
"react-dom": "^16.8.0"
},
"scripts": {
"now-build": "next build"

View File

@@ -1 +1,3 @@
export default () => 'Index page';
import hello from '../../shared/hello';
export default () => `${hello()} Welcome to the index page`;

View File

@@ -18,8 +18,9 @@ async function nowDeploy (bodies, randomness) {
const nowDeployPayload = {
version: 2,
env: Object.assign({}, nowJson.env, { RANDOMNESS_ENV_VAR: randomness }),
build: { env: { RANDOMNESS_BUILD_ENV_VAR: randomness } },
public: true,
env: { ...nowJson.env, RANDOMNESS_ENV_VAR: randomness },
build: { env: { ...(nowJson.build || {}).env, RANDOMNESS_BUILD_ENV_VAR: randomness } },
name: 'test',
files,
builds: nowJson.builds,
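
Illustration only (hypothetical fixture values): the build.env spread above now preserves build-time env vars declared in a fixture's now.json instead of overwriting them with only the randomness variable:

const nowJson = { build: { env: { SOME_BUILD_VAR: 'value' } } };
const randomness = 'abc123';
const build = { env: { ...(nowJson.build || {}).env, RANDOMNESS_BUILD_ENV_VAR: randomness } };
// => { env: { SOME_BUILD_VAR: 'value', RANDOMNESS_BUILD_ENV_VAR: 'abc123' } }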

View File

@@ -2,11 +2,7 @@ const {
excludeFiles,
validateEntrypoint,
includeOnlyEntryDirectory,
moveEntryDirectoryToRoot,
excludeLockFiles,
normalizePackageJson,
excludeStaticDirectory,
onlyStaticDirectory,
} = require('@now/next/utils');
const FileRef = require('@now/build-utils/file-ref'); // eslint-disable-line import/no-extraneous-dependencies
@@ -46,7 +42,7 @@ describe('validateEntrypoint', () => {
});
describe('includeOnlyEntryDirectory', () => {
it('should exclude files outside entry directory', () => {
it('should include files outside entry directory', () => {
const entryDirectory = 'frontend';
const files = {
'frontend/pages/index.js': new FileRef({ digest: 'index' }),
@@ -58,149 +54,6 @@ describe('includeOnlyEntryDirectory', () => {
expect(result['package.json']).toBeUndefined();
expect(result['package-lock.json']).toBeUndefined();
});
it('should handle entry directory being dot', () => {
const entryDirectory = '.';
const files = {
'frontend/pages/index.js': new FileRef({ digest: 'index' }),
'package.json': new FileRef({ digest: 'package' }),
'package-lock.json': new FileRef({ digest: 'package-lock' }),
};
const result = includeOnlyEntryDirectory(files, entryDirectory);
expect(result['frontend/pages/index.js']).toBeDefined();
expect(result['package.json']).toBeDefined();
expect(result['package-lock.json']).toBeDefined();
});
});
describe('moveEntryDirectoryToRoot', () => {
it('should move entrydirectory files to the root', () => {
const entryDirectory = 'frontend';
const files = {
'frontend/pages/index.js': new FileRef({ digest: 'index' }),
};
const result = moveEntryDirectoryToRoot(files, entryDirectory);
expect(result['pages/index.js']).toBeDefined();
});
it('should work with deep nested subdirectories', () => {
const entryDirectory = 'frontend/my/app';
const files = {
'frontend/my/app/pages/index.js': new FileRef({ digest: 'index' }),
};
const result = moveEntryDirectoryToRoot(files, entryDirectory);
expect(result['pages/index.js']).toBeDefined();
});
it('should do nothing when entry directory is dot', () => {
const entryDirectory = '.';
const files = {
'frontend/pages/index.js': new FileRef({ digest: 'index' }),
};
const result = moveEntryDirectoryToRoot(files, entryDirectory);
expect(result['frontend/pages/index.js']).toBeDefined();
});
});
describe('excludeLockFiles', () => {
it('should remove package-lock.json', () => {
const files = {
'frontend/pages/index.js': new FileRef({ digest: 'index' }),
'package.json': new FileRef({ digest: 'package' }),
'package-lock.json': new FileRef({ digest: 'package-lock' }),
};
const result = excludeLockFiles(files);
expect(result['frontend/pages/index.js']).toBeDefined();
expect(result['package-lock.json']).toBeUndefined();
});
it('should remove yarn.lock', () => {
const files = {
'frontend/pages/index.js': new FileRef({ digest: 'index' }),
'package.json': new FileRef({ digest: 'package' }),
'yarn.lock': new FileRef({ digest: 'yarn-lock' }),
};
const result = excludeLockFiles(files);
expect(result['frontend/pages/index.js']).toBeDefined();
expect(result['yarn.lock']).toBeUndefined();
});
it('should remove both package-lock.json and yarn.lock', () => {
const files = {
'frontend/pages/index.js': new FileRef({ digest: 'index' }),
'package.json': new FileRef({ digest: 'package' }),
'yarn.lock': new FileRef({ digest: 'yarn-lock' }),
'package-lock.json': new FileRef({ digest: 'package-lock' }),
};
const result = excludeLockFiles(files);
expect(result['frontend/pages/index.js']).toBeDefined();
expect(result['yarn.lock']).toBeUndefined();
expect(result['package-lock.json']).toBeUndefined();
});
});
describe('excludeStaticDirectory', () => {
it('should remove the /static directory files', () => {
const files = {
'frontend/pages/index.js': new FileRef({ digest: 'index' }),
'package.json': new FileRef({ digest: 'package' }),
'yarn.lock': new FileRef({ digest: 'yarn-lock' }),
'package-lock.json': new FileRef({ digest: 'package-lock' }),
'static/image.png': new FileRef({ digest: 'image' }),
};
const result = excludeStaticDirectory(files);
expect(result['frontend/pages/index.js']).toBeDefined();
expect(result['yarn.lock']).toBeDefined();
expect(result['package-lock.json']).toBeDefined();
expect(result['static/image.png']).toBeUndefined();
});
it('should remove the nested /static directory files', () => {
const files = {
'frontend/pages/index.js': new FileRef({ digest: 'index' }),
'package.json': new FileRef({ digest: 'package' }),
'yarn.lock': new FileRef({ digest: 'yarn-lock' }),
'package-lock.json': new FileRef({ digest: 'package-lock' }),
'static/images/png/image.png': new FileRef({ digest: 'image' }),
};
const result = excludeStaticDirectory(files);
expect(result['frontend/pages/index.js']).toBeDefined();
expect(result['yarn.lock']).toBeDefined();
expect(result['package-lock.json']).toBeDefined();
expect(result['static/images/png/image.png']).toBeUndefined();
});
});
describe('onlyStaticDirectory', () => {
it('should keep only /static directory files', () => {
const files = {
'frontend/pages/index.js': new FileRef({ digest: 'index' }),
'package.json': new FileRef({ digest: 'package' }),
'yarn.lock': new FileRef({ digest: 'yarn-lock' }),
'package-lock.json': new FileRef({ digest: 'package-lock' }),
'static/image.png': new FileRef({ digest: 'image' }),
};
const result = onlyStaticDirectory(files);
expect(result['frontend/pages/index.js']).toBeUndefined();
expect(result['yarn.lock']).toBeUndefined();
expect(result['package-lock.json']).toBeUndefined();
expect(result['static/image.png']).toBeDefined();
});
it('should keep nested /static directory files', () => {
const files = {
'frontend/pages/index.js': new FileRef({ digest: 'index' }),
'package.json': new FileRef({ digest: 'package' }),
'yarn.lock': new FileRef({ digest: 'yarn-lock' }),
'package-lock.json': new FileRef({ digest: 'package-lock' }),
'static/images/png/image.png': new FileRef({ digest: 'image' }),
};
const result = onlyStaticDirectory(files);
expect(result['frontend/pages/index.js']).toBeUndefined();
expect(result['yarn.lock']).toBeUndefined();
expect(result['package-lock.json']).toBeUndefined();
expect(result['static/images/png/image.png']).toBeDefined();
});
});
describe('normalizePackageJson', () => {

yarn.lock

File diff suppressed because it is too large.