Compare commits

...

20 Commits

Author SHA1 Message Date
Steven
99410a0c06 Publish
- @now/build-utils@0.9.11
 - @now/go@0.5.8
 - @now/next@0.5.9
 - @now/node@0.12.4
 - @now/python@0.2.14
 - @now/ruby@0.1.4
 - @now/static-build@0.9.5
2019-08-07 08:28:33 -07:00
Timothy
9df2e3a62d [docs] Add Builders Developer Reference (#888)
* Add Builders developer reference

* Updates to the README
2019-08-07 08:27:59 -07:00
Steven
3cc786412d [now-bash] Move source code to external repository (#887)
* [now-bash] Remove source code

* Change tests to run from root instead of dist
2019-08-07 08:27:51 -07:00
Steven
973229e02e [now-node][now-next][now-static-build] Enable node 10 for zero c… (#879) 2019-08-07 08:27:10 -07:00
Steven
1493b0657b [now-build-utils] Default to Node 10 for zero config (#773)
* [now-build-utils] Default to Node 10

* Revert back to 8

* Enable node 10 for zero config

* Add silent flag
2019-08-07 08:26:56 -07:00
Steven
66df087faa [now-python] Use ncc before publishing to npm (#875) 2019-08-07 08:26:43 -07:00
Steven
4401799eb3 [now-ruby] Use ncc before publishing to npm (#877) 2019-08-07 08:26:37 -07:00
Steven
313cad8e20 [now-go] Use ncc before publishing to npm (#876)
* [now-go] Use ncc before publishing to npm

* Add missing main
2019-08-07 08:26:15 -07:00
Steven
3a51773693 [now-php][now-rust] Remove deprecated community builders (#871)
* Remove now-rust

* Remove now-php

* Remove now-php-bridge

* Remove unused eslintignore

* Change a comment
2019-08-07 08:25:56 -07:00
Andy Bitz
ddab628034 Publish
- @now/build-utils@0.9.10
2019-08-06 19:33:36 +02:00
Steven
2c10cdcbca [now-build-utils] Fix TS globbing for zero config (#886)
* [now-build-utils] Fix TS globbing for zero config

* Fix tests

* [now-build-utils] Fix TS globbing for zero config
2019-08-06 19:30:13 +02:00
Steven
c30eac53f1 [now-build-utils] Fix TS globbing for zero config (#885)
* [now-build-utils] Fix TS globbing for zero config

* Fix tests
2019-08-06 19:30:07 +02:00
Andy Bitz
2d3e32e95b Publish
- @now/build-utils@0.9.9
2019-08-06 03:01:43 +02:00
Andy
bd8d41cadc [now-build-utils] Fix 404 for index routes (#882) 2019-08-06 03:01:06 +02:00
Andy Bitz
0a6e7d8e23 Publish
- @now/static-build@0.9.4
2019-08-05 16:12:58 +02:00
Andy
e0a8cb5011 [now-static-build] Choose public dir when there is no dist dir (#874)
* [now-static-build] Choose `public` dir when there is no `dist` dir

* Fix `path.join`
2019-08-05 16:12:30 +02:00
Andy Bitz
bcdc27139f Publish
- @now/build-utils@0.9.8
2019-08-03 00:31:02 +02:00
Andy
8cfaef7e6c [now-build-utils] Fix 404 route (#872) 2019-08-03 00:30:33 +02:00
Andy Bitz
79c096b80e Publish
- @now/build-utils@0.9.7
2019-08-02 22:32:16 +02:00
Andy
2cacb95c7d [now-build-utils] Disable directory listing for api routes (#870) 2019-08-02 22:26:37 +02:00
110 changed files with 654 additions and 4851 deletions


@@ -9,10 +9,8 @@
/packages/now-next/dist/*
/packages/now-node-bridge/*
/packages/now-python/dist/*
/packages/now-optipng/dist/*
/packages/now-go/*
/packages/now-rust/dist/*
/packages/now-ruby/dist/*
/packages/now-static-build/dist/*
/packages/now-static-build/test/fixtures/**
/packages/now-routing-utils/dist/*
/packages/now-routing-utils/dist/*

.github/CODEOWNERS vendored

@@ -8,7 +8,6 @@
/packages/now-next @timer
/packages/now-go @styfle @sophearak
/packages/now-python @styfle @sophearak
/packages/now-rust @styfle @mike-engel @anmonteiro
/packages/now-ruby @styfle @coetry @nathancahill
/packages/now-static-build @styfle @AndyBitz
/packages/now-routing-utils @dav-is

DEVELOPING_A_BUILDER.md Normal file

@@ -0,0 +1,447 @@
# Builders Developer Reference
The following page is a reference for creating a Builder using the Builder API.
A Builder is an npm module that exposes a `build` function and, optionally, `analyze` and `prepareCache` functions.
Official Builders are published to [npmjs.com](https://npmjs.com) as packages and referenced in the `use` property of the `now.json` configuration file.
However, the `use` property will work with any [npm install argument](https://docs.npmjs.com/cli/install), such as a Git repository URL, which is useful for testing your Builder.
See the [Builders Documentation](https://zeit.co/docs/v2/advanced/builders) to view example usage.
## Builder Exports
### `version`
A **required** exported constant that decides which version of the Builder API to use.
The latest and suggested version is `2`.
### `analyze`
An **optional** exported function that returns a unique fingerprint used for the purpose of [build de-duplication](https://zeit.co/docs/v2/advanced/concepts/immutability#deduplication-algorithm). If the `analyze` function is not supplied, a random fingerprint is assigned to each build.
```js
analyze({
  files: Files,
  entrypoint: String,
  workPath: String,
  config: Object
}) : String fingerprint
```
If you are using TypeScript, you should use the following types:
```ts
import { AnalyzeOptions } from '@now/build-utils'
export function analyze(options: AnalyzeOptions) {
  return 'fingerprint goes here'
}
```
### `build`
A **required** exported function that returns a [Files](#files) data structure that contains the Build outputs, which can be a [Static File](#file) or a [Serverless Function](#serverless-function).
What's a Serverless Function? Read about [Serverless Function concepts](https://zeit.co/docs/v2/deployments/concepts/lambdas) to learn more.
```js
build({
  files: Files,
  entrypoint: String,
  workPath: String,
  config: Object,
  meta?: {
    isDev?: Boolean,
    requestPath?: String,
    filesChanged?: Array<String>,
    filesRemoved?: Array<String>
  }
}) : {
  watch: Array<String>,
  output: Files output,
  routes: Object
}
```
If you are using TypeScript, you should use the following types:
```ts
import { BuildOptions } from '@now/build-utils'
export function build(options: BuildOptions) {
  // Build the code here
  return {
    output: {
      'path-to-file': File,
      'path-to-lambda': Lambda
    },
    watch: [],
    routes: {}
  }
}
```
### `prepareCache`
An **optional** exported function that is equivalent to [`build`](#build), but it executes the instructions necessary to prepare a cache for the next run.
```js
prepareCache({
  files: Files,
  entrypoint: String,
  workPath: String,
  cachePath: String,
  config: Object
}) : Files cacheOutput
```
If you are using TypeScript, you can import the types for each of these functions by using the following:
```ts
import { PrepareCacheOptions } from '@now/build-utils'
export function prepareCache(options: PrepareCacheOptions) {
  return { 'path-to-file': File }
}
```
### `shouldServe`
An **optional** exported function that is only used by `now dev` in [Now CLI](https://zeit.co/download) and indicates whether a [Builder](https://zeit.co/docs/v2/advanced/builders) wants to be responsible for building a certain request path.
```js
shouldServe({
  entrypoint: String,
  files: Files,
  config: Object,
  requestPath: String,
  workPath: String
}) : Boolean
```
If you are using TypeScript, you can import the types for each of these functions by using the following:
```ts
import { ShouldServeOptions } from '@now/build-utils'
export function shouldServe(options: ShouldServeOptions) {
  return true // or `false` when this Builder should not handle the request
}
```
If this method is not defined, Now CLI will default to [this function](https://github.com/zeit/now-builders/blob/52994bfe26c5f4f179bdb49783ee57ce19334631/packages/now-build-utils/src/should-serve.ts).
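As a minimal sketch, a custom `shouldServe` could claim a request when its path maps directly onto this build's entrypoint (the matching logic below is illustrative, not the CLI's actual default):

```js
// Claim a request when its path maps directly onto this build's entrypoint.
function shouldServe({ entrypoint, requestPath }) {
  // Strip a leading slash so "/api/index.sh" matches the entrypoint "api/index.sh"
  const normalized = requestPath.replace(/^\//, '')
  return normalized === entrypoint
}

// In a real Builder this would be exported: exports.shouldServe = shouldServe
console.log(shouldServe({ entrypoint: 'api/index.sh', requestPath: '/api/index.sh' })) // → true
```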
### Builder Options
The exported functions [`analyze`](#analyze), [`build`](#build), and [`prepareCache`](#preparecache) receive one argument with the following properties.
**Properties:**
- `files`: All source files of the project as a [Files](#files) data structure.
- `entrypoint`: Name of the entrypoint file for this particular build job. The value `files[entrypoint]` is guaranteed to exist and to be a valid [File](#files) reference. `entrypoint` is always a discrete file and never a glob, since globs are expanded into separate builds at deployment time.
- `workPath`: A writable temporary directory where you are encouraged to perform your build process. This directory will be populated with the restored cache from the previous run (if any) for [`analyze`](#analyze) and [`build`](#build).
- `cachePath`: A writable temporary directory where you can build a cache for the next run. This is only passed to `prepareCache`.
- `config`: An arbitrary object passed by the user in the [Build definition](#defining-the-build-step) in `now.json`.
## Example: html-minifier
Let's walk through what it takes to create a simple Builder that takes an HTML source file and yields a minified HTML static file as its build output.
While this is a very simple builder, the approach demonstrated here can be used to return anything: one or more static files and/or one or more lambdas.
## Setting up the module
### Defining the analyze step
The `analyze` hook is optional. Its goal is to give the developer a tool to avoid wasting time _re-computing a build_ that has already occurred.
The return value of `analyze` is a _fingerprint_: a simple string that uniquely identifies the build process.
If `analyze` is not specified, the default fingerprint is the combined checksum of **all the files at the same directory level as the entrypoint**. This default errs on the side of re-executing builds when files _other than the entrypoint_ (like dependencies, manifest files, etc.) have changed.
For our `html-minifier` example, we know that HTML files don't have dependencies. Therefore, our analyze step can just return the `digest` of the entrypoint.
Our `index.js` file looks as follows:
```js
exports.analyze = function({ files, entrypoint }) {
  return files[entrypoint].digest
}
```
This means that we will re-minify and re-create the build output _only if the file contents (and therefore the digest) change._
### Defining the build step
Your module will need some utilities to manipulate the data structures we pass you, create new ones, and alter the filesystem.
To that end, we expose our API as part of the `@now/build-utils` package. This package is always loaded on your behalf, so make sure it's only listed under `peerDependencies` in your `package.json`.
Builders can include dependencies of their liking:
```js
const { minify } = require('html-minifier')
const { FileBlob } = require('@now/build-utils') // peer dependency
exports.version = 2
exports.analyze = ({ files, entrypoint }) => files[entrypoint].digest
exports.build = async ({ files, entrypoint, config }) => {
  const stream = files[entrypoint].toStream()
  const options = Object.assign({}, config || {})
  const { data } = await FileBlob.fromStream({ stream })
  const content = data.toString()
  const minified = minify(content, options)
  const result = new FileBlob({ data: minified })
  return {
    output: {
      [entrypoint]: result
    },
    watch: [],
    routes: {}
  }
}
```
### Defining a `prepareCache` step
If our builder had performed work that could be re-used in the next build invocation, we could define a `prepareCache` step.
In this case, there are no intermediate artifacts that we can cache, and our `analyze` step already takes care of caching the full output based on the fingerprint of the input.
## Technical Details
### Execution Context
A [Serverless Function](https://zeit.co/docs/v2/advanced/concepts/lambdas) is created where the builder logic is executed. The lambda is run using the Node.js 8 runtime. A brand new sandbox is created for each deployment, for security reasons. The sandbox is cleaned up between executions to ensure no lingering temporary files are shared from build to build.
All the APIs you export ([`analyze`](#analyze), [`build`](#build), and [`prepareCache`](#preparecache)) are not guaranteed to run in the same process, but the filesystem we expose (e.g. `workPath` and the results of calling [`getWriteableDirectory`](#getwriteabledirectory)) is retained.
If you need to share state between those steps, use the filesystem.
### Directory and Cache Lifecycle
When a new build is created, we pre-populate the `workPath` supplied to `analyze` with the results of the `prepareCache` step of the previous build.
The `analyze` step can modify that directory, and it will not be re-created when it's supplied to `build` and `prepareCache`.
To learn how the cache key is computed and invalidated, refer to the [overview](https://zeit.co/docs/v2/advanced/builders#technical-details).
### Accessing Environment and Secrets
The env and secrets specified by the user as `build.env` are passed to the builder process. This means you can access user-defined environment variables via `process.env` in Node.js.
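For example (the variable name below is illustrative, and the first line only simulates what the platform does before your exported functions run):

```js
// Simulate a user-supplied `build.env` entry
process.env.MINIFY_LEVEL = 'aggressive'

// Read a user-defined environment variable, with a fallback
function readBuildEnv(name, fallback) {
  return process.env[name] !== undefined ? process.env[name] : fallback
}

console.log(readBuildEnv('MINIFY_LEVEL', 'none')) // → aggressive
```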
### Utilities as peerDependencies
When you publish your Builder to npm, make sure not to specify `@now/build-utils` (as seen below in the API definitions) as a dependency, but rather list it under `peerDependencies`.
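For example, a Builder's `package.json` might look like this (the package name and version ranges are illustrative):

```json
{
  "name": "now-html-minifier",
  "version": "1.0.0",
  "main": "index.js",
  "dependencies": {
    "html-minifier": "^4.0.0"
  },
  "peerDependencies": {
    "@now/build-utils": "^0.9.0"
  }
}
```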
## Types
### `Files`
```ts
import { File } from '@now/build-utils'
type Files = { [filePath: string]: File }
```
This is an abstract type that is implemented as a plain [JavaScript Object](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Object). It's helpful to think of it as a virtual filesystem representation.
When used as an input, the `Files` object will only contain `FileRef` instances. When `Files` is an output, it may contain `Lambda` (Serverless Function) instances as well as `FileRef`s.
An example of a valid output `Files` object is:
```js
{
  "index.html": FileRef,
  "api/index.js": Lambda
}
```
### `File`
This is an abstract type that can be imported if you are using TypeScript.
```ts
import { File } from '@now/build-utils'
```
Valid `File` types include:
- [`FileRef`](#fileref)
- [`FileFsRef`](#filefsref)
- [`FileBlob`](#fileblob)
### `FileRef`
```ts
import { FileRef } from '@now/build-utils'
```
This is a [JavaScript class](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Classes) that represents an abstract file instance stored in our platform, based on the file identifier string (its checksum). When a `Files` object is passed as an input to `analyze` or `build`, all its values will be instances of `FileRef`.
**Properties:**
- `mode : Number` file mode
- `digest : String` a checksum that represents the file
**Methods:**
- `toStream() : Stream` creates a [Stream](https://nodejs.org/api/stream.html) of the file body
### `FileFsRef`
```ts
import { FileFsRef } from '@now/build-utils'
```
This is a [JavaScript class](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Classes) that represents an abstract instance of a file present in the filesystem that the build process is executing in.
**Properties:**
- `mode : Number` file mode
- `fsPath : String` the absolute path of the file in file system
**Methods:**
- `static async fromStream({ mode : Number, stream : Stream, fsPath : String }) : FileFsRef` creates an instance of a `FileFsRef` from a [`Stream`](https://nodejs.org/api/stream.html), placing the file at `fsPath` with `mode`
- `toStream() : Stream` creates a [Stream](https://nodejs.org/api/stream.html) of the file body
### `FileBlob`
```ts
import { FileBlob } from '@now/build-utils'
```
This is a [JavaScript class](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Classes) that represents an abstract instance of a file present in memory.
**Properties:**
- `mode : Number` file mode
- `data : String | Buffer` the body of the file
**Methods:**
- `static async fromStream({ mode : Number, stream : Stream }) : FileBlob` creates an instance of a `FileBlob` from a [`Stream`](https://nodejs.org/api/stream.html) with `mode`
- `toStream() : Stream` creates a [Stream](https://nodejs.org/api/stream.html) of the file body
### `Lambda`
```ts
import { Lambda } from '@now/build-utils'
```
This is a [JavaScript class](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Classes) that represents a Serverless Function. It can be created by supplying `files`, `handler`, `runtime`, and `environment` as an object to the [`createLambda`](#createlambda) helper. Instances of this class should not be created directly; instead, use [`createLambda`](#createlambda).
**Properties:**
- `files : Files` the internal filesystem of the lambda
- `handler : String` path to handler file and (optionally) a function name it exports
- `runtime : LambdaRuntime` the name of the lambda runtime
- `environment : Object` key-value map of handler-related environment variables (aside from those passed by the user)
### `LambdaRuntime`
This is an abstract enumeration type that is implemented by one of the following possible `String` values:
- `nodejs10.x`
- `nodejs8.10`
- `go1.x`
- `java-1.8.0-openjdk`
- `python3.6`
- `python2.7`
- `dotnetcore2.1`
- `dotnetcore2.0`
- `dotnetcore1.0`
## JavaScript API
The following is exposed by `@now/build-utils` to simplify the process of writing Builders, manipulating the file system, using the above types, etc.
### `createLambda`
Signature: `createLambda(Object spec) : Lambda`
```ts
import { createLambda } from '@now/build-utils'
```
Constructor for the [`Lambda`](#lambda) type.
```js
const { createLambda, FileBlob } = require('@now/build-utils')
await createLambda({
  runtime: 'nodejs8.10',
  handler: 'index.main',
  files: {
    'index.js': new FileBlob({ data: 'exports.main = () => {}' })
  }
})
```
### `download`
Signature: `download(Files files, String workPath, Object meta?) : Files`
```ts
import { download } from '@now/build-utils'
```
This utility allows you to download the contents of a [`Files`](#files) data structure, therefore creating the filesystem represented in it.
Since `Files` is an abstract way of representing files, you can think of `download` as a way of making that virtual filesystem _real_.
If the **optional** `meta` argument is passed (the same object received by [`build`](#build)), only the files that have changed are downloaded. This is determined using the `filesRemoved` and `filesChanged` properties of that object.
```js
await download(files, workPath, meta)
```
### `glob`
Signature: `glob(String pattern, String workPath) : Files`
```ts
import { glob } from '@now/build-utils'
```
This utility allows you to _scan_ the filesystem and return a [`Files`](#files) representation of the matched glob search string. It can be thought of as the reverse of [`download`](#download).
The following trivial example downloads everything to the filesystem, only to return it back (therefore just re-creating the passed-in [`Files`](#files)):
```js
const { glob, download } = require('@now/build-utils')
exports.build = async ({ files, workPath }) => {
  await download(files, workPath)
  return glob('**', workPath)
}
```
### `getWriteableDirectory`
Signature: `getWriteableDirectory() : String`
```ts
import { getWriteableDirectory } from '@now/build-utils'
```
On some occasions, you might want to write to a temporary directory; this utility returns the path to a fresh writeable directory.
### `rename`
Signature: `rename(Files, Function) : Files`
```ts
import { rename } from '@now/build-utils'
```
Renames the keys of the [`Files`](#files) object, which represent the paths. For example, to remove the `.go` extension you can use:
```js
const { rename } = require('@now/build-utils')
const originalFiles = { 'one.go': fileFsRef1, 'two.go': fileFsRef2 }
const renamedFiles = rename(originalFiles, path => path.replace(/\.go$/, ''))
```


@@ -1,6 +1,6 @@
# now-builders
This is a monorepo containing the [Official Builders](https://zeit.co/docs/v2/deployments/builders/overview) provided by the ZEIT team.
This is a monorepo containing the [Official Builders](https://zeit.co/docs/v2/advanced/builders) provided by the ZEIT team.
## Channels
@@ -59,3 +59,7 @@ use `npm publish --tag canary` if you are publishing a canary release!
### Contributing
See the [Contribution guidelines for this project](CONTRIBUTING.md); they also contain guidance on interpreting test failures.
### Creating Your Own Builder
To create your own Builder, see [the Builder's Developer Reference](DEVELOPING_A_BUILDER.md).


@@ -1,32 +0,0 @@
root = true
[*]
indent_style = tab
indent_size = 4
tab_width = 4
end_of_line = lf
charset = utf-8
trim_trailing_whitespace = true
insert_final_newline = true
[{*.json,*.json.example,*.gyp,*.yml}]
indent_style = space
indent_size = 2
[*.py]
indent_style = space
indent_size = 4
[*.md]
trim_trailing_whitespace = false
# Ideal settings - some plugins might support these.
[*.js]
quote_type = single
[{*.c,*.cc,*.h,*.hh,*.cpp,*.hpp,*.m,*.mm,*.mpp,*.js,*.java,*.go,*.rs,*.php,*.ng,*.jsx,*.ts,*.d,*.cs,*.swift}]
curly_bracket_next_line = false
spaces_around_operators = true
spaces_around_brackets = outside
# close enough to 1TB
indent_brace_style = K&R


@@ -1,2 +0,0 @@
node_modules
handler


@@ -1,16 +0,0 @@
#!/bin/bash
set -euo pipefail
cd "$LAMBDA_TASK_ROOT"
# Configure `import`
export IMPORT_CACHE="$LAMBDA_TASK_ROOT/.import-cache"
export PATH="$IMPORT_CACHE/bin:$PATH"
# Load `import` and runtime
# shellcheck disable=SC1090
. "$(which import)"
# shellcheck disable=SC1090
. "$IMPORT_CACHE/runtime.sh"
# Load user code and process events in a loop forever
_lambda_runtime_init


@@ -1,40 +0,0 @@
#!/bin/bash
set -euo pipefail
# `import` debug logs are always enabled during build
export IMPORT_DEBUG=1
# Install `import`
IMPORT_BIN="$IMPORT_CACHE/bin/import"
mkdir -p "$(dirname "$IMPORT_BIN")"
curl -sfLS https://import.pw > "$IMPORT_BIN"
chmod +x "$IMPORT_BIN"
# For now only the entrypoint file is copied into the lambda
mkdir -p "$(dirname "$DIST/$ENTRYPOINT")"
cp "$ENTRYPOINT" "$DIST/$ENTRYPOINT"
# Copy in the runtime
cp "$BUILDER/runtime.sh" "$IMPORT_CACHE"
cp "$BUILDER/bootstrap" "$DIST"
# Load `import`
. "$(which import)"
# Cache runtime and user dependencies
echo "Caching imports in \"$ENTRYPOINT\"…"
. "$IMPORT_CACHE/runtime.sh"
. "$DIST/$ENTRYPOINT"
echo "Done caching imports"
# Run user build script
if declare -f build > /dev/null; then
echo "Running \`build\` function in \"$ENTRYPOINT\"…"
build "$@"
fi
# Ensure the entrypoint defined a `handler` function
if ! declare -f handler > /dev/null; then
echo "ERROR: A \`handler\` function must be defined in \"$ENTRYPOINT\"!" >&2
exit 1
fi


@@ -1,79 +0,0 @@
const execa = require('execa');
const { join } = require('path');
const snakeCase = require('snake-case');
const {
glob,
download,
createLambda,
shouldServe,
} = require('@now/build-utils'); // eslint-disable-line import/no-extraneous-dependencies
// From this list: https://import.pw/importpw/import/docs/config.md
const allowedConfigImports = new Set([
'CACHE',
'CURL_OPTS',
'DEBUG',
'RELOAD',
'SERVER',
]);
exports.analyze = ({ files, entrypoint }) => files[entrypoint].digest;
exports.build = async ({
workPath, files, entrypoint, meta, config,
}) => {
console.log('downloading files...');
await download(files, workPath, meta);
const distPath = join(workPath, 'dist');
const configEnv = Object.keys(config).reduce((o, v) => {
const name = snakeCase(v).toUpperCase();
if (allowedConfigImports.has(name)) {
o[`IMPORT_${name}`] = config[v]; // eslint-disable-line no-param-reassign
}
return o;
}, {});
if (config && config.import) {
Object.keys(config.import).forEach((key) => {
const name = snakeCase(key).toUpperCase();
// eslint-disable-next-line no-param-reassign
configEnv[`IMPORT_${name}`] = config.import[key];
});
}
const IMPORT_CACHE = `${distPath}/.import-cache`;
const env = Object.assign({}, process.env, configEnv, {
PATH: `${IMPORT_CACHE}/bin:${process.env.PATH}`,
IMPORT_CACHE,
DIST: distPath,
BUILDER: __dirname,
ENTRYPOINT: entrypoint,
});
const builderPath = join(__dirname, 'builder.sh');
await execa(builderPath, [entrypoint], {
env,
cwd: workPath,
stdio: 'inherit',
});
const lambda = await createLambda({
files: await glob('**', distPath),
handler: entrypoint, // not actually used in `bootstrap`
runtime: 'provided',
environment: Object.assign({}, configEnv, {
SCRIPT_FILENAME: entrypoint,
}),
});
return {
[entrypoint]: lambda,
};
};
exports.shouldServe = shouldServe;


@@ -1,28 +0,0 @@
{
"name": "@now/bash",
"version": "1.0.3",
"description": "Now 2.0 builder for HTTP endpoints written in Bash",
"main": "index.js",
"author": "Nathan Rajlich <nate@zeit.co>",
"license": "MIT",
"homepage": "https://zeit.co/docs/v2/deployments/official-builders/bash-now-bash",
"repository": {
"type": "git",
"url": "https://github.com/zeit/now-builders.git",
"directory": "packages/now-bash"
},
"files": [
"builder.sh",
"runtime.sh",
"bootstrap",
"index.js",
"package.json"
],
"dependencies": {
"execa": "^1.0.0",
"snake-case": "^2.1.0"
},
"scripts": {
"test": "jest"
}
}


@@ -1,119 +0,0 @@
#!/bin/bash
import "static-binaries@1.0.0"
static_binaries jq
# These get reset upon each request
_STATUS_CODE="$(mktemp)"
_HEADERS="$(mktemp)"
_lambda_runtime_api() {
local endpoint="$1"
shift
curl -sfLS "http://$AWS_LAMBDA_RUNTIME_API/2018-06-01/runtime/$endpoint" "$@"
}
_lambda_runtime_init() {
# Initialize user code
# shellcheck disable=SC1090
. "$SCRIPT_FILENAME" || {
local exit_code="$?"
local error_message="Initialization failed for '$SCRIPT_FILENAME' (exit code $exit_code)"
echo "$error_message" >&2
local error='{"errorMessage":"'"$error_message"'"}'
_lambda_runtime_api "init/error" -X POST -d "$error"
exit "$exit_code"
}
# Process events
while true; do _lambda_runtime_next; done
}
_lambda_runtime_next() {
echo 200 > "$_STATUS_CODE"
echo '{"content-type":"text/plain; charset=utf8"}' > "$_HEADERS"
local headers
headers="$(mktemp)"
# Get an event
local event
event="$(mktemp)"
_lambda_runtime_api invocation/next -D "$headers" | jq --raw-output --monochrome-output '.body' > "$event"
local request_id
request_id="$(grep -Fi Lambda-Runtime-Aws-Request-Id "$headers" | tr -d '[:space:]' | cut -d: -f2)"
rm -f "$headers"
# Execute the handler function from the script
local body
body="$(mktemp)"
# Stdin of the `handler` function is the HTTP request body.
# Need to use a fifo here instead of bash <() because Lambda
# errors with "/dev/fd/63 not found" for some reason :/
local stdin
stdin="$(mktemp -u)"
mkfifo "$stdin"
_lambda_runtime_body < "$event" > "$stdin" &
local exit_code=0
handler "$event" < "$stdin" > "$body" || exit_code="$?"
rm -f "$event" "$stdin"
if [ "$exit_code" -eq 0 ]; then
# Send the response
jq --raw-input --raw-output --compact-output --slurp --monochrome-output \
--arg statusCode "$(cat "$_STATUS_CODE")" \
--argjson headers "$(cat "$_HEADERS")" \
'{statusCode:$statusCode|tonumber, headers:$headers, encoding:"base64", body:.|@base64}' < "$body" \
| _lambda_runtime_api "invocation/$request_id/response" -X POST -d @- > /dev/null
rm -f "$body" "$_HEADERS"
else
local error_message="Invocation failed for 'handler' function in '$SCRIPT_FILENAME' (exit code $exit_code)"
echo "$error_message" >&2
_lambda_runtime_api "invocation/$request_id/error" -X POST -d '{"errorMessage":"'"$error_message"'"}' > /dev/null
fi
}
_lambda_runtime_body() {
local event
event="$(cat)"
if [ "$(jq --raw-output '.body | type' <<< "$event")" = "string" ]; then
if [ "$(jq --raw-output '.encoding' <<< "$event")" = "base64" ]; then
jq --raw-output '.body' <<< "$event" | base64 --decode
else
# assume plain-text body
jq --raw-output '.body' <<< "$event"
fi
fi
}
# Set the response status code.
http_response_code() {
echo "$1" > "$_STATUS_CODE"
}
# Sets a response header.
# Overrides existing header if it has already been set.
http_response_header() {
local name="$1"
local value="$2"
local tmp
tmp="$(mktemp)"
jq \
--arg name "$name" \
--arg value "$value" \
'.[$name] = $value' < "$_HEADERS" > "$tmp"
mv -f "$tmp" "$_HEADERS"
}
http_response_redirect() {
http_response_code "${2:-302}"
http_response_header "location" "$1"
}
http_response_json() {
http_response_header "content-type" "application/json; charset=utf8"
}


@@ -1,3 +0,0 @@
handler() {
echo "cow:RANDOMNESS_PLACEHOLDER"
}


@@ -1,11 +0,0 @@
{
"version": 2,
"builds": [
{ "src": "index.sh", "use": "@now/bash" },
{ "src": "subdirectory/index.sh", "use": "@now/bash" }
],
"probes": [
{ "path": "/", "mustContain": "cow:RANDOMNESS_PLACEHOLDER" },
{ "path": "/subdirectory/", "mustContain": "yoda:RANDOMNESS_PLACEHOLDER" }
]
}


@@ -1,3 +0,0 @@
handler() {
echo "yoda:RANDOMNESS_PLACEHOLDER"
}


@@ -1,33 +0,0 @@
/* global beforeAll, expect, it, jest */
const fs = require('fs');
const path = require('path');
const {
packAndDeploy,
testDeployment,
} = require('../../../test/lib/deployment/test-deployment.js');
jest.setTimeout(4 * 60 * 1000);
const buildUtilsUrl = '@canary';
let builderUrl;
beforeAll(async () => {
const builderPath = path.resolve(__dirname, '..');
builderUrl = await packAndDeploy(builderPath);
console.log('builderUrl', builderUrl);
});
const fixturesPath = path.resolve(__dirname, 'fixtures');
// eslint-disable-next-line no-restricted-syntax
for (const fixture of fs.readdirSync(fixturesPath)) {
// eslint-disable-next-line no-loop-func
it(`should build ${fixture}`, async () => {
await expect(
testDeployment(
{ builderUrl, buildUtilsUrl },
path.join(fixturesPath, fixture),
),
).resolves.toBeDefined();
});
}


@@ -1,6 +1,6 @@
{
"name": "@now/build-utils",
"version": "0.9.6",
"version": "0.9.11",
"license": "MIT",
"main": "./dist/index.js",
"types": "./dist/index.d.js",


@@ -66,6 +66,10 @@ export function ignoreApiFilter(file: string) {
return false;
}
if (file.endsWith('.d.ts')) {
return false;
}
// If the file does not match any builder we also
// don't want to create a route e.g. `package.json`
if (API_BUILDERS.every(({ src }) => !minimatch(file, src))) {


@@ -2,6 +2,16 @@ import { Route, Builder } from './types';
import { parse as parsePath } from 'path';
import { ignoreApiFilter, sortFiles } from './detect-builders';
function escapeName(name: string) {
const special = '[]^$.|?*+()'.split('');
for (const char of special) {
name = name.replace(new RegExp(`\\${char}`, 'g'), `\\${char}`);
}
return name;
}
function joinPath(...segments: string[]) {
const joinedPath = segments.join('/');
return joinedPath.replace(/\/{2,}/g, '/');
@@ -46,8 +56,14 @@ function createRouteFromPath(filePath: string): Route {
query.push(`${name}=$${counter++}`);
return `([^\\/]+)`;
} else if (isLast) {
const { name: fileName } = parsePath(segment);
return fileName;
const { name: fileName, ext } = parsePath(segment);
const isIndex = fileName === 'index';
// Either filename with extension, filename without extension
// or nothing when the filename is `index`
return `(${escapeName(fileName)}|${escapeName(fileName)}${escapeName(
ext
)})${isIndex ? '?' : ''}`;
}
return segment;
@@ -232,6 +248,14 @@ async function detectApiRoutes(files: string[]): Promise<RoutesResult> {
defaultRoutes.push(createRouteFromPath(file));
}
// 404 Route to disable directory listing
if (defaultRoutes.length) {
defaultRoutes.push({
status: 404,
src: '/api(\\/.*)?$',
});
}
return { defaultRoutes, error: null };
}


@@ -4,7 +4,7 @@ import path from 'path';
import spawn from 'cross-spawn';
import { SpawnOptions } from 'child_process';
import { deprecate } from 'util';
import { Meta, PackageJson, NodeVersion } from '../types';
import { Meta, PackageJson, NodeVersion, Config } from '../types';
import { getSupportedNodeVersion } from './node-version';
function spawnAsync(
@@ -75,13 +75,23 @@ export function getSpawnOptions(
export async function getNodeVersion(
destPath: string,
minNodeVersion?: string
minNodeVersion?: string,
config?: Config
): Promise<NodeVersion> {
const { packageJson } = await scanParentDirs(destPath, true);
const range =
(packageJson && packageJson.engines && packageJson.engines.node) ||
minNodeVersion;
return getSupportedNodeVersion(range, typeof minNodeVersion !== 'undefined');
let range: string | undefined;
let silent = false;
if (packageJson && packageJson.engines && packageJson.engines.node) {
range = packageJson.engines.node;
} else if (minNodeVersion) {
range = minNodeVersion;
silent = true;
} else if (config && config.zeroConfig) {
// Use latest node version zero config detected
range = '10.x';
silent = true;
}
return getSupportedNodeVersion(range, silent);
}
async function scanParentDirs(destPath: string, readPackageJson = false) {


@@ -4,7 +4,9 @@ const fs = require('fs-extra');
// eslint-disable-next-line import/no-extraneous-dependencies
const execa = require('execa');
const assert = require('assert');
const { glob, download } = require('../');
const {
glob, download, detectBuilders, detectRoutes,
} = require('../');
const { createZip } = require('../dist/lambda');
const {
getSupportedNodeVersion,
@@ -16,8 +18,6 @@ const {
testDeployment,
} = require('../../../test/lib/deployment/test-deployment.js');
const { detectBuilders, detectRoutes } = require('../dist');
jest.setTimeout(4 * 60 * 1000);
const builderUrl = '@canary';
let buildUtilsUrl;
@@ -454,9 +454,11 @@ it('Test `detectRoutes`', async () => {
const { builders } = await detectBuilders(files);
const { defaultRoutes } = await detectRoutes(files, builders);
expect(defaultRoutes.length).toBe(2);
expect(defaultRoutes.length).toBe(3);
expect(defaultRoutes[0].dest).toBe('/api/team.js');
expect(defaultRoutes[1].dest).toBe('/api/user.go');
expect(defaultRoutes[2].dest).not.toBeDefined();
expect(defaultRoutes[2].status).toBe(404);
}
{
@@ -483,12 +485,21 @@ it('Test `detectRoutes`', async () => {
expect(error.code).toBe('conflicting_path_segment');
}
{
const files = ['api/date/index.js', 'api/date/index.go'];
const { builders } = await detectBuilders(files);
const { defaultRoutes, error } = await detectRoutes(files, builders);
expect(defaultRoutes).toBe(null);
expect(error.code).toBe('conflicting_file_path');
}
{
const files = ['api/[endpoint].js', 'api/[endpoint]/[id].js'];
const { builders } = await detectBuilders(files);
const { defaultRoutes } = await detectRoutes(files, builders);
expect(defaultRoutes.length).toBe(2);
expect(defaultRoutes.length).toBe(3);
}
{
@@ -500,9 +511,11 @@ it('Test `detectRoutes`', async () => {
const { builders } = await detectBuilders(files);
const { defaultRoutes } = await detectRoutes(files, builders);
expect(defaultRoutes[2].src).toBe('/(.*)');
expect(defaultRoutes[2].dest).toBe('/public/$1');
expect(defaultRoutes.length).toBe(3);
expect(defaultRoutes[2].status).toBe(404);
expect(defaultRoutes[2].src).toBe('/api(\\/.*)?$');
expect(defaultRoutes[3].src).toBe('/(.*)');
expect(defaultRoutes[3].dest).toBe('/public/$1');
expect(defaultRoutes.length).toBe(4);
}
{
@@ -514,6 +527,63 @@ it('Test `detectRoutes`', async () => {
const { builders } = await detectBuilders(files, pkg);
const { defaultRoutes } = await detectRoutes(files, builders);
expect(defaultRoutes[1].status).toBe(404);
expect(defaultRoutes[1].src).toBe('/api(\\/.*)?$');
expect(defaultRoutes.length).toBe(2);
}
{
const files = ['public/index.html'];
const { builders } = await detectBuilders(files);
const { defaultRoutes } = await detectRoutes(files, builders);
expect(defaultRoutes.length).toBe(1);
}
{
const files = ['api/date/index.js', 'api/date.js'];
const { builders } = await detectBuilders(files);
const { defaultRoutes } = await detectRoutes(files, builders);
expect(defaultRoutes.length).toBe(3);
expect(defaultRoutes[0].src).toBe('^/api/date/(index|index\\.js)?$');
expect(defaultRoutes[0].dest).toBe('/api/date/index.js');
expect(defaultRoutes[1].src).toBe('^/api/(date|date\\.js)$');
expect(defaultRoutes[1].dest).toBe('/api/date.js');
}
{
const files = ['api/date.js', 'api/[date]/index.js'];
const { builders } = await detectBuilders(files);
const { defaultRoutes } = await detectRoutes(files, builders);
expect(defaultRoutes.length).toBe(3);
expect(defaultRoutes[0].src).toBe('^/api/([^\\/]+)/(index|index\\.js)?$');
expect(defaultRoutes[0].dest).toBe('/api/[date]/index.js?date=$1');
expect(defaultRoutes[1].src).toBe('^/api/(date|date\\.js)$');
expect(defaultRoutes[1].dest).toBe('/api/date.js');
}
{
const files = [
'api/index.ts',
'api/index.d.ts',
'api/users/index.ts',
'api/users/index.d.ts',
'api/food.ts',
'api/ts/gold.ts',
];
const { builders } = await detectBuilders(files);
const { defaultRoutes } = await detectRoutes(files, builders);
expect(builders.length).toBe(4);
expect(builders[0].use).toBe('@now/node');
expect(builders[1].use).toBe('@now/node');
expect(builders[2].use).toBe('@now/node');
expect(builders[3].use).toBe('@now/node');
expect(defaultRoutes.length).toBe(5);
}
});
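The expectations above exercise the segment-to-regex conversion: a `[param]` directory becomes the capture group `([^\/]+)` and its value is forwarded as a query parameter (e.g. `?date=$1`). A small illustrative re-implementation of just the bracket handling (the helper name and return shape are assumptions, not from the source):

```typescript
// Illustrative re-implementation of the bracket-segment conversion tested above.
function segmentToRegex(segment: string): { pattern: string; param?: string } {
  const m = segment.match(/^\[([^\]]+)\]$/);
  if (m) {
    // Dynamic segment: capture anything except a slash, remember the name
    // so it can be appended as a query parameter on the destination.
    return { pattern: '([^\\/]+)', param: m[1] };
  }
  // Static segment: used verbatim in the route source.
  return { pattern: segment };
}
```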


@@ -1,4 +1,5 @@
node_modules
dist
*.log
/go
/analyze


@@ -1,5 +0,0 @@
*.ts
test
tsconfig.json
package-lock.json
yarn.lock

packages/now-go/build.sh Executable file

@@ -0,0 +1,2 @@
ncc build index.ts -o dist
ncc build install.ts -o dist/install


@@ -1,7 +1,8 @@
{
"name": "@now/go",
"version": "0.5.7",
"version": "0.5.8",
"license": "MIT",
"main": "./dist/index",
"homepage": "https://zeit.co/docs/v2/deployments/official-builders/go-now-go",
"repository": {
"type": "git",
@@ -9,30 +10,25 @@
"directory": "packages/now-go"
},
"scripts": {
"build": "tsc",
"test": "tsc && jest",
"prepublish": "tsc",
"now-postinstall": "node install.js"
"build": "./build.sh",
"test": "./build.sh && jest",
"prepublish": "./build.sh",
"now-postinstall": "node dist/install/index.js"
},
"files": [
"*.js",
"main.go",
"main__mod__.go",
"util"
"dist"
],
"dependencies": {
"debug": "^4.1.1",
"execa": "^1.0.0",
"fs-extra": "^7.0.0",
"node-fetch": "^2.2.1",
"tar": "4.4.6"
},
"devDependencies": {
"@types/debug": "^4.1.3",
"@types/execa": "^0.9.0",
"@types/fs-extra": "^5.0.5",
"@types/node-fetch": "^2.3.0",
"@types/tar": "^4.0.0",
"debug": "^4.1.1",
"execa": "^1.0.0",
"fs-extra": "^7.0.0",
"node-fetch": "^2.2.1",
"tar": "4.4.6",
"typescript": "3.5.2"
}
}


@@ -1,6 +1,6 @@
{
"name": "@now/next",
"version": "0.5.8",
"version": "0.5.9",
"license": "MIT",
"main": "./dist/index",
"homepage": "https://zeit.co/docs/v2/deployments/official-builders/next-js-now-next",


@@ -156,6 +156,7 @@ export const build = async ({
files,
workPath,
entrypoint,
config,
meta = {} as BuildParamsMeta,
}: BuildParamsType): Promise<{
routes: Route[];
@@ -179,7 +180,7 @@ export const build = async ({
await createServerlessConfig(workPath);
}
const nodeVersion = await getNodeVersion(entryPath);
const nodeVersion = await getNodeVersion(entryPath, undefined, config);
const spawnOpts = getSpawnOptions(meta, nodeVersion);
if (!nextVersion) {


@@ -1,6 +1,6 @@
{
"name": "@now/node",
"version": "0.12.3",
"version": "0.12.4",
"license": "MIT",
"main": "./dist/index",
"homepage": "https://zeit.co/docs/v2/deployments/official-builders/node-js-now-node",


@@ -16,6 +16,7 @@ import {
PrepareCacheOptions,
BuildOptions,
shouldServe,
Config,
} from '@now/build-utils';
export { NowRequest, NowResponse } from './types';
import { makeLauncher } from './launcher';
@@ -32,6 +33,7 @@ interface DownloadOptions {
files: Files;
entrypoint: string;
workPath: string;
config: Config;
meta: Meta;
}
@@ -53,6 +55,7 @@ async function downloadInstallAndBundle({
files,
entrypoint,
workPath,
config,
meta,
}: DownloadOptions) {
console.log('downloading user files...');
@@ -63,7 +66,11 @@ async function downloadInstallAndBundle({
console.log("installing dependencies for user's code...");
const installTime = Date.now();
const entrypointFsDirname = join(workPath, dirname(entrypoint));
const nodeVersion = await getNodeVersion(entrypointFsDirname);
const nodeVersion = await getNodeVersion(
entrypointFsDirname,
undefined,
config
);
const spawnOpts = getSpawnOptions(meta, nodeVersion);
await runNpmInstall(entrypointFsDirname, ['--prefer-offline'], spawnOpts);
console.log(`install complete [${Date.now() - installTime}ms]`);
@@ -297,6 +304,7 @@ export async function build({
files,
entrypoint,
workPath,
config,
meta,
});


@@ -8,7 +8,7 @@ import _ts from 'typescript';
*/
/**
* Debugging `ts-node`.
* Debugging.
*/
const shouldDebug = false;
const debug = shouldDebug


@@ -1 +0,0 @@
/build


@@ -1,20 +0,0 @@
FROM library/centos:6.8
RUN yum -y install wget git
RUN rpm -Uvh https://mirror.webtatic.com/yum/el6/latest.rpm
RUN yum -y install php71w-cli php71w-fpm php71w-mbstring php71w-mysql php71w-opcache
RUN yum -y install epel-release
RUN yum -y install patchelf
RUN mkdir -p /root/app/public
WORKDIR /root/app
COPY ./php.ini /root/app/php.ini
COPY ./php-fpm.ini /root/app/php-fpm.ini
COPY ./test.php /root/app/test.php
COPY ./test.sh /root/app/test.sh
RUN patchelf --set-rpath '$ORIGIN' /usr/bin/php
RUN patchelf --set-rpath '$ORIGIN' /usr/sbin/php-fpm
RUN patchelf --set-rpath '$ORIGIN' /usr/lib64/php/modules/mysqli.so
CMD ["/bin/bash", "test.sh"]


@@ -1,15 +0,0 @@
rm -rf ../native
mkdir -p ../native/modules
docker rmi now-php-docker-image --force
docker build . -t now-php-docker-image
docker run now-php-docker-image
docker run now-php-docker-image /bin/cat /usr/sbin/php-fpm > ../native/php-fpm
docker run now-php-docker-image /bin/cat /root/app/php.ini > ../native/php.ini
docker run now-php-docker-image /bin/cat /root/app/php-fpm.ini > ../native/php-fpm.ini
docker run now-php-docker-image /bin/cat /usr/lib64/php/modules/curl.so > ../native/modules/curl.so
docker run now-php-docker-image /bin/cat /usr/lib64/php/modules/json.so > ../native/modules/json.so
docker run now-php-docker-image /bin/cat /usr/lib64/php/modules/mbstring.so > ../native/modules/mbstring.so
docker run now-php-docker-image /bin/cat /usr/lib64/php/modules/mysqli.so > ../native/modules/mysqli.so
docker run now-php-docker-image /bin/cat /usr/lib64/mysql/libmysqlclient.so.16 > ../native/modules/libmysqlclient.so.16
docker run now-php-docker-image /bin/cat /usr/lib64/php/modules/opcache.so > ../native/modules/opcache.so
chmod +x ../native/php-fpm


@@ -1,9 +0,0 @@
[global]
error_log=/tmp/fpm-error.log
[www]
listen=0.0.0.0:9000
pm=static
pm.max_children=1
catch_workers_output=yes
clear_env=no


@@ -1,8 +0,0 @@
extension=/root/app/modules/curl.so
extension=/root/app/modules/json.so
extension=/root/app/modules/mbstring.so
extension=/root/app/modules/mysqli.so
zend_extension=/root/app/modules/opcache.so
opcache.enable_cli=1
mysqli.max_links=10
mysqli.max_persistent=10


@@ -1,4 +0,0 @@
<?php
mysqli_connect();
print('php_sapi_name=' . php_sapi_name() . PHP_EOL);
print('opcache_enabled=' . opcache_get_status()['opcache_enabled'] . PHP_EOL);


@@ -1,18 +0,0 @@
mkdir -p /root/app/modules
cp /usr/bin/php /root/app/php
cp /usr/sbin/php-fpm /root/app/php-fpm
cp /usr/lib64/php/modules/curl.so /root/app/modules/curl.so
cp /usr/lib64/php/modules/json.so /root/app/modules/json.so
cp /usr/lib64/php/modules/mbstring.so /root/app/modules/mbstring.so
cp /usr/lib64/php/modules/mysqli.so /root/app/modules/mysqli.so
cp /usr/lib64/mysql/libmysqlclient.so.16 /root/app/modules/libmysqlclient.so.16
cp /usr/lib64/php/modules/opcache.so /root/app/modules/opcache.so
rm -rf $(which php)
rm -rf $(which php-fpm)
rm -rf /usr/lib64/php
rm -rf /usr/lib64/mysql
rm -rf /etc/php.d
./php-fpm --help
./php -c php.ini test.php
echo "if you see 'can't connect to local mysql' - it is good - mysql library is found and used"
echo "if you see 'call to undefined function mysqli_connect' - it is bad - something went wrong"


@@ -1,149 +0,0 @@
/* eslint-disable no-bitwise,no-use-before-define */
const assert = require('assert');
const { freeParser } = require('_http_common');
const { spawn } = require('child_process');
const createConnection = require('./connection.js');
const { MSG_TYPE, PROTOCOL_STATUS } = require('./consts.js');
const { whenPortOpens } = require('./port.js');
const { HTTPParser } = process.binding('http_parser');
const BEGIN_REQUEST_DATA_KEEP_CONN = Buffer.from('\0\x01\x01\0\0\0\0\0'); // FCGI_ROLE_RESPONDER && FCGI_KEEP_CONN
const MESSAGE_FCGI_STDOUT = `message-${MSG_TYPE.FCGI_STDOUT}`;
const MESSAGE_FCGI_STDERR = `message-${MSG_TYPE.FCGI_STDERR}`;
const MESSAGE_FCGI_END_REQUEST = `message-${MSG_TYPE.FCGI_END_REQUEST}`;
let curReqId = 0;
let connection;
async function startPhp() {
assert(!connection);
const child = spawn(
'./php-fpm',
['-c', 'php.ini', '--fpm-config', 'php-fpm.ini', '--nodaemonize'],
{
stdio: 'inherit',
cwd: '/var/task/native',
},
);
child.on('exit', () => {
console.error('php exited');
process.exit(1);
});
child.on('error', (error) => {
console.error(error);
process.exit(1);
});
await whenPortOpens(9000, 400);
const newConnection = createConnection({
_host: '127.0.0.1',
_port: 9000,
});
await new Promise((resolve, reject) => {
function onError() {
cleanup();
reject();
}
function onConnect() {
connection = newConnection;
cleanup();
resolve();
}
newConnection.on('error', onError);
newConnection.on('connect', onConnect);
function cleanup() {
newConnection.removeListener('error', onError);
newConnection.removeListener('connect', onConnect);
}
});
}
async function query({ params, stdin }) {
if (!connection) {
await startPhp();
}
return new Promise((resolve) => {
assert(connection);
const chunks = [Buffer.from('HTTP/1.1 200 MAKES-PARSER-WORK\n')];
function onError(error) {
console.error(error);
process.exit(1);
}
function onStdout(reqId, data) {
chunks.push(data);
}
function onStderr(reqId, data) {
console.error(data.toString().trim());
}
function onEndRequest(reqId, data) {
const protocolStatus = data.readUInt8(4, true);
if (protocolStatus !== PROTOCOL_STATUS.FCGI_REQUEST_COMPLETE) {
console.error('protocolStatus', protocolStatus);
process.exit(1);
}
const response = Buffer.concat(chunks);
const parser = new HTTPParser(HTTPParser.RESPONSE);
let tuples = [];
parser[HTTPParser.kOnHeadersComplete | 0] = (major, minor, t) => {
tuples = t;
};
let body;
parser[HTTPParser.kOnBody | 0] = (b, start, len) => {
body = b.slice(start, start + len);
};
parser.execute(response);
freeParser(parser);
cleanup();
resolve({ tuples, body });
}
connection.on('error', onError);
connection.on(MESSAGE_FCGI_STDOUT, onStdout);
connection.on(MESSAGE_FCGI_STDERR, onStderr);
connection.on(MESSAGE_FCGI_END_REQUEST, onEndRequest);
function cleanup() {
connection.removeListener('error', onError);
connection.removeListener(MESSAGE_FCGI_STDOUT, onStdout);
connection.removeListener(MESSAGE_FCGI_STDERR, onStderr);
connection.removeListener(MESSAGE_FCGI_END_REQUEST, onEndRequest);
}
curReqId += 1;
// TODO these things have callbacks. what to do with them?
connection.send(
MSG_TYPE.FCGI_BEGIN_REQUEST,
curReqId,
BEGIN_REQUEST_DATA_KEEP_CONN,
);
connection.send(MSG_TYPE.FCGI_PARAMS, curReqId, params);
connection.send(MSG_TYPE.FCGI_PARAMS, curReqId, null);
if (stdin) connection.send(MSG_TYPE.FCGI_STDIN, curReqId, stdin);
connection.send(MSG_TYPE.FCGI_STDIN, curReqId, null);
});
}
module.exports = {
query,
};
/*
(async function() {
console.log(await query({ params: {
REQUEST_METHOD: 'GET', SCRIPT_FILENAME: '/phpinfo.php'
} }));
})();
*/


@@ -1,41 +0,0 @@
/* eslint-disable consistent-return */
const net = require('net');
function whenPortOpensCallback(port, attempts, cb) {
const client = net.connect(port, '127.0.0.1');
client.on('error', (error) => {
if (!attempts) return cb(error);
setTimeout(() => {
whenPortOpensCallback(port, attempts - 1, cb);
}, 50);
});
client.on('connect', () => {
client.destroy();
cb();
});
}
function isPortOpen(port) {
return new Promise((resolve) => {
whenPortOpensCallback(port, 0, (error) => {
if (error) return resolve(false);
resolve(true);
});
});
}
function whenPortOpens(port, attempts) {
return new Promise((resolve, reject) => {
whenPortOpensCallback(port, attempts, (error) => {
if (error) return reject(error);
resolve();
});
});
}
module.exports = {
isPortOpen,
whenPortOpensCallback,
whenPortOpens,
};


@@ -1,43 +0,0 @@
const FileBlob = require('@now/build-utils/file-blob.js'); // eslint-disable-line import/no-extraneous-dependencies
const FileFsRef = require('@now/build-utils/file-fs-ref.js'); // eslint-disable-line import/no-extraneous-dependencies
const glob = require('@now/build-utils/fs/glob.js'); // eslint-disable-line import/no-extraneous-dependencies
const path = require('path');
async function getFiles() {
const files = await glob('native/**', __dirname);
const phpConfig = await FileBlob.fromStream({
stream: files['native/php.ini'].toStream(),
});
phpConfig.data = phpConfig.data
.toString()
.replace(/\/root\/app\/modules/g, '/var/task/native/modules');
files['native/php.ini'] = phpConfig;
Object.assign(files, {
'fastcgi/connection.js': new FileFsRef({
fsPath: require.resolve('fastcgi-client/lib/connection.js'),
}),
'fastcgi/consts.js': new FileFsRef({
fsPath: require.resolve('fastcgi-client/lib/consts.js'),
}),
'fastcgi/stringifykv.js': new FileFsRef({
fsPath: require.resolve('fastcgi-client/lib/stringifykv.js'),
}),
'fastcgi/index.js': new FileFsRef({
fsPath: path.join(__dirname, 'fastcgi/index.js'),
}),
'fastcgi/port.js': new FileFsRef({
fsPath: path.join(__dirname, 'fastcgi/port.js'),
}),
'launcher.js': new FileFsRef({
fsPath: path.join(__dirname, 'launcher.js'),
}),
});
return files;
}
module.exports = {
getFiles,
};


@@ -1,150 +0,0 @@
/* eslint-disable prefer-template */
const assert = require('assert');
const fs = require('fs');
const { join: pathJoin } = require('path');
const { parse: parseUrl } = require('url');
const { query } = require('./fastcgi/index.js');
function normalizeEvent(event) {
if (event.Action === 'Invoke') {
const invokeEvent = JSON.parse(event.body);
const {
method, path, headers, encoding,
} = invokeEvent;
let { body } = invokeEvent;
if (body) {
if (encoding === 'base64') {
body = Buffer.from(body, encoding);
} else if (encoding === undefined) {
body = Buffer.from(body);
} else {
throw new Error(`Unsupported encoding: ${encoding}`);
}
}
return {
method,
path,
headers,
body,
};
}
const {
httpMethod: method, path, headers, body,
} = event;
return {
method,
path,
headers,
body,
};
}
function isDirectory(p) {
return new Promise((resolve) => {
fs.stat(p, (error, s) => {
if (error) {
resolve(false);
return;
}
if (s.isDirectory()) {
resolve(true);
return;
}
resolve(false);
});
});
}
async function transformFromAwsRequest({
method, path, headers, body,
}) {
const { pathname, search, query: queryString } = parseUrl(path);
let requestUri = pathname + (search || '');
let filename = pathJoin(
'/var/task/user',
process.env.NOW_ENTRYPOINT || pathname,
);
if (await isDirectory(filename)) {
if (!filename.endsWith('/')) {
filename += '/';
requestUri = pathname + '/' + (search || '');
}
filename += 'index.php';
}
const params = {};
params.REQUEST_METHOD = method;
params.REQUEST_URI = requestUri;
params.QUERY_STRING = queryString || ''; // can be null
params.SCRIPT_FILENAME = filename;
params.SERVER_PROTOCOL = 'HTTP/1.1';
params.SERVER_PORT = 443;
params.HTTPS = 'on';
// eslint-disable-next-line no-restricted-syntax
for (const [k, v] of Object.entries(headers)) {
const camel = k.toUpperCase().replace(/-/g, '_');
params[`HTTP_${camel}`] = v;
if (camel === 'HOST') {
params.SERVER_NAME = v;
} else if (['CONTENT_TYPE', 'CONTENT_LENGTH'].includes(camel)) {
params[camel] = v; // without HOST_ prepended
}
}
return { params, stdin: body };
}
function transformToAwsResponse({ tuples, body }) {
let statusCode = 200;
const headers = {};
// eslint-disable-next-line no-param-reassign
if (!body) body = Buffer.alloc(0);
assert(Buffer.isBuffer(body));
for (let i = 0; i < tuples.length; i += 2) {
const k = tuples[i].toLowerCase();
const v = tuples[i + 1];
if (k === 'status') {
statusCode = Number(v.split(' ')[0]); // '408 Request Timeout'
} else {
if (!headers[k]) headers[k] = [];
headers[k].push(v);
}
}
return {
statusCode,
headers,
body: body.toString('base64'),
encoding: 'base64',
};
}
async function launcher(event) {
const awsRequest = normalizeEvent(event);
const input = await transformFromAwsRequest(awsRequest);
const output = await query(input);
return transformToAwsResponse(output);
}
exports.launcher = launcher;
/*
(async function() {
console.log(await launcher({
httpMethod: 'GET',
path: '/phpinfo.php'
}));
})();
*/


@@ -1,9 +0,0 @@
[global]
error_log=/tmp/fpm-error.log
[www]
listen=0.0.0.0:9000
pm=static
pm.max_children=1
catch_workers_output=yes
clear_env=no


@@ -1,8 +0,0 @@
extension=/root/app/modules/curl.so
extension=/root/app/modules/json.so
extension=/root/app/modules/mbstring.so
extension=/root/app/modules/mysqli.so
zend_extension=/root/app/modules/opcache.so
opcache.enable_cli=1
mysqli.max_links=10
mysqli.max_persistent=10


@@ -1,13 +0,0 @@
{
"name": "@now/php-bridge",
"version": "0.5.3",
"license": "MIT",
"repository": {
"type": "git",
"url": "https://github.com/zeit/now-builders.git",
"directory": "packages/now-php-bridge"
},
"dependencies": {
"fastcgi-client": "0.0.1"
}
}


@@ -1 +0,0 @@
/test


@@ -1,55 +0,0 @@
const {
createLambda,
rename,
glob,
download,
shouldServe,
} = require('@now/build-utils'); // eslint-disable-line import/no-extraneous-dependencies
const path = require('path');
const { getFiles } = require('@now/php-bridge');
exports.build = async ({
files, entrypoint, workPath, config, meta,
}) => {
// Download all files to workPath
const downloadedFiles = await download(files, workPath, meta);
let includedFiles = {};
if (config && config.includeFiles) {
// Find files for each glob
// eslint-disable-next-line no-restricted-syntax
for (const pattern of config.includeFiles) {
// eslint-disable-next-line no-await-in-loop
const matchedFiles = await glob(pattern, workPath);
Object.assign(includedFiles, matchedFiles);
}
// explicit and always include the entrypoint
Object.assign(includedFiles, {
[entrypoint]: files[entrypoint],
});
} else {
// Backwards compatibility
includedFiles = downloadedFiles;
}
console.log('Included files:', Object.keys(includedFiles));
const userFiles = rename(includedFiles, name => path.join('user', name));
const bridgeFiles = await getFiles();
// TODO config.extensions. OR php.ini from user
delete bridgeFiles['native/modules/mysqli.so'];
delete bridgeFiles['native/modules/libmysqlclient.so.16'];
const lambda = await createLambda({
files: { ...userFiles, ...bridgeFiles },
handler: 'launcher.launcher',
runtime: 'nodejs8.10',
environment: {
NOW_ENTRYPOINT: entrypoint,
},
});
return { [entrypoint]: lambda };
};
exports.shouldServe = shouldServe;


@@ -1,17 +0,0 @@
{
"name": "@now/php",
"version": "0.5.7",
"license": "MIT",
"homepage": "https://zeit.co/docs/v2/deployments/official-builders/php-now-php",
"repository": {
"type": "git",
"url": "https://github.com/zeit/now-builders.git",
"directory": "packages/now-php"
},
"dependencies": {
"@now/php-bridge": "^0.5.3"
},
"scripts": {
"test": "jest"
}
}


@@ -1,2 +0,0 @@
<?php
print('cow:RANDOMNESS_PLACEHOLDER');


@@ -1,11 +0,0 @@
{
"version": 2,
"builds": [
{ "src": "index.php", "use": "@now/php" },
{ "src": "subdirectory/index.php", "use": "@now/php" }
],
"probes": [
{ "path": "/", "mustContain": "cow:RANDOMNESS_PLACEHOLDER" },
{ "path": "/subdirectory/", "mustContain": "yoda:RANDOMNESS_PLACEHOLDER" }
]
}


@@ -1,2 +0,0 @@
<?php
print('yoda:RANDOMNESS_PLACEHOLDER');


@@ -1,2 +0,0 @@
<?php
print($_ENV['RANDOMNESS_ENV_VAR'] . ':env');


@@ -1,5 +0,0 @@
{
"version": 2,
"builds": [{ "src": "env/index.php", "use": "@now/php" }],
"probes": [{ "path": "/env", "mustContain": "RANDOMNESS_PLACEHOLDER:env" }]
}


@@ -1,3 +0,0 @@
<?php
echo 'Excluded!';


@@ -1,3 +0,0 @@
<?php
echo 'included:RANDOMNESS_PLACEHOLDER';


@@ -1,9 +0,0 @@
<?php
echo 'mainfile:';
if (file_exists('included_file.php') && !file_exists('excluded_file.php')) {
require_once 'included_file.php';
} else {
echo PHP_EOL;
print_r(array_diff(scandir('.'), array('..', '.')));
}


@@ -1,13 +0,0 @@
{
"version": 2,
"builds": [
{
"src": "index.php",
"use": "@now/php",
"config": { "includeFiles": ["included*.php"] }
}
],
"probes": [
{ "path": "/", "mustContain": "mainfile:included:RANDOMNESS_PLACEHOLDER" }
]
}


@@ -1,28 +0,0 @@
<?php
header('Content-Type: text/plain');
print($_SERVER['SCRIPT_FILENAME'] . PHP_EOL);
print($_SERVER['REQUEST_METHOD'] . PHP_EOL);
print($_SERVER['REQUEST_URI'] . PHP_EOL);
print($_SERVER['HTTP_HOST'] . PHP_EOL);
print($_SERVER['HTTP_X_SOME_HEADER'] . PHP_EOL);
print($_SERVER['SERVER_PROTOCOL'] . PHP_EOL);
print($_SERVER['SERVER_NAME'] . PHP_EOL);
print($_SERVER['SERVER_PORT'] . PHP_EOL);
print($_SERVER['HTTPS'] . PHP_EOL);
print($_GET['get1'] . PHP_EOL);
var_dump($_GET['get2']);
print($_POST['post1'] . PHP_EOL);
var_dump($_POST['post2']);
print($_COOKIE['cookie1'] . PHP_EOL);
var_dump($_COOKIE['cookie2']);
print($_REQUEST['get1'] . PHP_EOL);
var_dump($_REQUEST['get2']);
print($_REQUEST['post1'] . PHP_EOL);
var_dump($_REQUEST['post2']);
print($_REQUEST['cookie1'] . PHP_EOL);
var_dump($_REQUEST['cookie2']);
print(file_get_contents('php://input') . PHP_EOL);
print('end' . PHP_EOL);


@@ -1,4 +0,0 @@
{
"version": 2,
"builds": [{ "src": "index.php", "use": "@now/php" }]
}


@@ -1,156 +0,0 @@
const assert = require('assert');
async function test1({ deploymentUrl, fetch }) {
const resp = await fetch(
`https://${deploymentUrl}/index.php?get1=foo&get1=bar&get2[]=bim&get2[]=bom`,
{
headers: {
'X-Some-Header': 'x-some-header-value',
},
},
);
assert(resp.status === 200);
const text = await resp.text();
const lines = text.trim().split('\n');
assert.deepEqual(lines, [
'/var/task/user/index.php',
'GET',
'/index.php?get1=foo&get1=bar&get2%5B%5D=bim&get2%5B%5D=bom', // TODO fake news, must be unescaped
deploymentUrl, // example 'test-19phw91ph.now.sh'
'x-some-header-value',
'HTTP/1.1',
deploymentUrl, // example 'test-19phw91ph.now.sh'
'443',
'on',
'bar',
'array(2) {',
' [0]=>',
' string(3) "bim"',
' [1]=>',
' string(3) "bom"',
'}',
'',
'NULL',
'',
'NULL',
'bar',
'array(2) {',
' [0]=>',
' string(3) "bim"',
' [1]=>',
' string(3) "bom"',
'}',
'',
'NULL',
'',
'NULL',
'',
'end',
]);
}
async function test2({ deploymentUrl, fetch }) {
const resp = await fetch(`https://${deploymentUrl}/index.php`, {
method: 'POST',
body: 'post1=baz&post1=bat&post2[]=pim&post2[]=pom',
headers: {
'Content-Type': 'application/x-www-form-urlencoded',
},
});
assert(resp.status === 200);
const text = await resp.text();
const lines = text.trim().split('\n');
assert.deepEqual(lines, [
'/var/task/user/index.php',
'POST',
'/index.php',
deploymentUrl, // example 'test-19phw91ph.now.sh'
'',
'HTTP/1.1',
deploymentUrl, // example 'test-19phw91ph.now.sh'
'443',
'on',
'',
'NULL',
'bat',
'array(2) {',
' [0]=>',
' string(3) "pim"',
' [1]=>',
' string(3) "pom"',
'}',
'',
'NULL',
'',
'NULL',
'bat',
'array(2) {',
' [0]=>',
' string(3) "pim"',
' [1]=>',
' string(3) "pom"',
'}',
'',
'NULL',
'post1=baz&post1=bat&post2[]=pim&post2[]=pom',
'end',
]);
}
async function test3({ deploymentUrl, fetch }) {
const resp = await fetch(`https://${deploymentUrl}/index.php`, {
method: 'GET',
headers: {
Cookie: `cookie1=foo; cookie1=${escape('bar|bar')}; ${escape(
'cookie2[]',
)}=dim; ${escape('cookie2[]')}=${escape('dom|dom')}`,
},
});
assert(resp.status === 200);
const text = await resp.text();
const lines = text.trim().split('\n');
assert.deepEqual(lines, [
'/var/task/user/index.php',
'GET',
'/index.php',
deploymentUrl, // example 'test-19phw91ph.now.sh'
'',
'HTTP/1.1',
deploymentUrl, // example 'test-19phw91ph.now.sh'
'443',
'on',
'',
'NULL',
'',
'NULL',
'foo',
'array(2) {',
' [0]=>',
' string(3) "dim"',
' [1]=>',
' string(7) "dom|dom"',
'}',
'',
'NULL',
'',
'NULL',
'foo',
'array(2) {',
' [0]=>',
' string(3) "dim"',
' [1]=>',
' string(7) "dom|dom"',
'}',
'',
'end',
]);
}
module.exports = async (opts) => {
await test1(opts);
await test2(opts);
await test3(opts);
};


@@ -1,5 +0,0 @@
<?php
header('Content-Type: text/plain');
header('Content-Type: text/plain; charset=UTF-16');
setcookie('cookie1', 'cookie1value');
setcookie('cookie2', 'cookie2value');


@@ -1,4 +0,0 @@
{
"version": 2,
"builds": [{ "src": "index.php", "use": "@now/php" }]
}


@@ -1,9 +0,0 @@
const assert = require('assert');
module.exports = async ({ deploymentUrl, fetch }) => {
const resp = await fetch(`https://${deploymentUrl}/index.php`);
assert(resp.status === 200);
assert.equal(resp.headers.get('content-type'), 'text/plain; charset=UTF-16');
assert(resp.headers.get('set-cookie').includes('cookie1=cookie1value'));
assert(resp.headers.get('set-cookie').includes('cookie2=cookie2value'));
};


@@ -1,10 +0,0 @@
<?php
// regression test for go-php engine reusage. on failure prints
// Fatal error: Cannot redeclare some_function() (previously declared in /var/task/user/index.php:7)
function some_function() {
print("paskantamasaari");
}
some_function();


@@ -1,4 +0,0 @@
{
"version": 2,
"builds": [{ "src": "index.php", "use": "@now/php" }]
}


@@ -1,8 +0,0 @@
const assert = require('assert');
module.exports = async ({ deploymentUrl, fetch }) => {
const resp1 = await fetch(`https://${deploymentUrl}/index.php`);
assert.equal(await resp1.text(), 'paskantamasaari');
const resp2 = await fetch(`https://${deploymentUrl}/index.php`);
assert.equal(await resp2.text(), 'paskantamasaari');
};


@@ -1,2 +0,0 @@
<?php
var_dump(opcache_compile_file(__FILE__));


@@ -1,4 +0,0 @@
{
"version": 2,
"builds": [{ "src": "index.php", "use": "@now/php" }]
}


@@ -1,6 +0,0 @@
const assert = require('assert');
module.exports = async ({ deploymentUrl, fetch }) => {
const resp = await fetch(`https://${deploymentUrl}/index.php`);
assert.equal(await resp.text(), 'bool(true)\n');
};


@@ -1,2 +0,0 @@
<?php
print('cow:RANDOMNESS_PLACEHOLDER:' . $_SERVER['REQUEST_URI']);


@@ -1,12 +0,0 @@
{
"version": 2,
"builds": [{ "src": "index.php", "use": "@now/php" }],
"routes": [{ "src": "/(.*)", "dest": "index.php" }],
"probes": [
{ "path": "/any", "mustContain": "cow:RANDOMNESS_PLACEHOLDER:/any" },
{
"path": "/any?type=some",
"mustContain": "cow:RANDOMNESS_PLACEHOLDER:/any?type=some"
}
]
}


@@ -1,33 +0,0 @@
/* global beforeAll, expect, it, jest */
const fs = require('fs');
const path = require('path');
const {
packAndDeploy,
testDeployment,
} = require('../../../test/lib/deployment/test-deployment.js');
jest.setTimeout(4 * 60 * 1000);
const buildUtilsUrl = '@canary';
let builderUrl;
beforeAll(async () => {
const builderPath = path.resolve(__dirname, '..');
builderUrl = await packAndDeploy(builderPath);
console.log('builderUrl', builderUrl);
});
const fixturesPath = path.resolve(__dirname, 'fixtures');
// eslint-disable-next-line no-restricted-syntax
for (const fixture of fs.readdirSync(fixturesPath)) {
// eslint-disable-next-line no-loop-func
it(`should build ${fixture}`, async () => {
await expect(
testDeployment(
{ builderUrl, buildUtilsUrl },
path.join(fixturesPath, fixture),
),
).resolves.toBeDefined();
});
}

packages/now-python/build.sh Executable file

@@ -0,0 +1 @@
ncc build src/index.ts -o dist


@@ -1,6 +1,6 @@
{
"name": "@now/python",
"version": "0.2.13",
"version": "0.2.14",
"main": "./dist/index.js",
"license": "MIT",
"homepage": "https://zeit.co/docs/v2/deployments/official-builders/python-now-python",
@@ -14,16 +14,13 @@
"directory": "packages/now-python"
},
"scripts": {
"build": "tsc",
"test": "tsc && jest",
"prepublishOnly": "tsc"
},
"dependencies": {
"execa": "^1.0.0",
"node-fetch": "^2.2.0"
"build": "./build.sh",
"test": "./build.sh && jest",
"prepublishOnly": "./build.sh"
},
"devDependencies": {
"@types/execa": "^0.9.0",
"execa": "^1.0.0",
"typescript": "3.5.2"
}
}

packages/now-ruby/build.sh Executable file

@@ -0,0 +1 @@
ncc build index.ts -o dist


@@ -1,7 +1,7 @@
{
"name": "@now/ruby",
"author": "Nathan Cahill <nathan@nathancahill.com>",
"version": "0.1.3",
"version": "0.1.4",
"license": "MIT",
"main": "./dist/index",
"homepage": "https://zeit.co/docs/v2/deployments/official-builders/ruby-now-ruby",
@@ -15,15 +15,13 @@
"directory": "packages/now-ruby"
},
"scripts": {
"build": "tsc",
"test": "tsc && jest",
"prepublishOnly": "tsc"
},
"dependencies": {
"execa": "^1.0.0",
"fs-extra": "^7.0.1"
"build": "./build.sh",
"test": "./build.sh && jest",
"prepublishOnly": "./build.sh"
},
"devDependencies": {
"execa": "^1.0.0",
"fs-extra": "^7.0.1",
"typescript": "3.5.2"
}
}


@@ -1 +0,0 @@
*.js

File diff suppressed because it is too large.

now_lambda/Cargo.toml (deleted)

@@ -1,25 +0,0 @@
[package]
name = "now_lambda"
version = "0.1.3"
authors = ["Antonio Nuno Monteiro <anmonteiro@gmail.com>"]
edition = "2018"
description = "Rust bindings for Now.sh Lambdas"
keywords = ["AWS", "Lambda", "Zeit", "Now", "Rust"]
license = "MIT"
homepage = "https://github.com/zeit/now-builders"
repository = "https://github.com/zeit/now-builders"
documentation = "https://docs.rs/now_lambda"
include = [
"src/*.rs",
"Cargo.toml"
]
[dependencies]
serde = "^1"
serde_json = "^1"
serde_derive = "^1"
http = "0.1"
tokio = "^0.1"
base64 = "0.10"
log = "^0.4"
lambda_runtime = "0.2.0"

packages/now-rust/launcher.rs (deleted)

@@ -1,8 +0,0 @@
use now_lambda::lambda;
use std::error::Error;
// PLACEHOLDER
fn main() -> Result<(), Box<dyn Error>> {
Ok(lambda!(handler))
}

packages/now-rust/package.json (deleted)

@@ -1,33 +0,0 @@
{
"name": "@now/rust",
"version": "0.2.10",
"license": "MIT",
"main": "./dist/index",
"homepage": "https://zeit.co/docs/v2/deployments/official-builders/now-rust",
"repository": {
"type": "git",
"url": "https://github.com/zeit/now-builders.git",
"directory": "packages/now-rust"
},
"scripts": {
"build": "tsc",
"test": "tsc && jest",
"prepublish": "tsc",
"typecheck": "tsc --noEmit",
"now-postinstall": "node dist/install.js"
},
"files": [
"dist",
"launcher.rs"
],
"dependencies": {
"@iarna/toml": "^2.2.1",
"execa": "^1.0.0",
"fs-extra": "^7.0.1"
},
"devDependencies": {
"@types/execa": "^0.9.0",
"@types/fs-extra": "^7.0.0",
"typescript": "3.5.2"
}
}

src/body.rs (now_lambda, deleted)

@@ -1,196 +0,0 @@
//! Provides a Now Lambda oriented request and response body entity interface
use std::{borrow::Cow, ops::Deref, str};
use base64::display::Base64Display;
use serde::ser::{Error as SerError, Serialize, Serializer};
/// Representation of http request and response bodies as supported
/// by Zeit Now v2.
///
/// These come in three flavors
/// * `Empty` ( no body )
/// * `Text` ( text data )
/// * `Binary` ( binary data )
///
/// Body types can be `Deref` and `AsRef`'d into `[u8]` types much like the `hyper` crate
///
/// # Examples
///
/// Body types are inferred with `From` implementations.
///
/// ## Text
///
/// Types like `String`, `str` whose type reflects
/// text produce `Body::Text` variants
///
/// ```
/// assert!(match now_lambda::Body::from("text") {
/// now_lambda::Body::Text(_) => true,
/// _ => false
/// })
/// ```
///
/// ## Binary
///
/// Types like `Vec<u8>` and `&[u8]` whose types reflect raw bytes produce `Body::Binary` variants
///
/// ```
/// assert!(match now_lambda::Body::from("text".as_bytes()) {
/// now_lambda::Body::Binary(_) => true,
/// _ => false
/// })
/// ```
///
/// `Binary` response bodies will automatically get base64 encoded.
///
/// ## Empty
///
/// The unit type ( `()` ) whose type represents an empty value produces `Body::Empty` variants
///
/// ```
/// assert!(match now_lambda::Body::from(()) {
/// now_lambda::Body::Empty => true,
/// _ => false
/// })
/// ```
#[derive(Debug, PartialEq)]
pub enum Body {
/// An empty body
Empty,
/// A body containing string data
Text(String),
/// A body containing binary data
Binary(Vec<u8>),
}
impl Default for Body {
fn default() -> Self {
Body::Empty
}
}
impl From<()> for Body {
fn from(_: ()) -> Self {
Body::Empty
}
}
impl From<Body> for () {
fn from(_: Body) -> Self {
()
}
}
impl<'a> From<&'a str> for Body {
fn from(s: &'a str) -> Self {
Body::Text(s.into())
}
}
impl From<String> for Body {
fn from(b: String) -> Self {
Body::Text(b)
}
}
impl From<Body> for String {
fn from(b: Body) -> String {
match b {
Body::Empty => String::from(""),
Body::Text(t) => t,
Body::Binary(b) => str::from_utf8(&b).unwrap().to_owned(),
}
}
}
impl From<Cow<'static, str>> for Body {
#[inline]
fn from(cow: Cow<'static, str>) -> Body {
match cow {
Cow::Borrowed(b) => Body::from(b.to_owned()),
Cow::Owned(o) => Body::from(o),
}
}
}
impl From<Body> for Cow<'static, str> {
#[inline]
fn from(b: Body) -> Cow<'static, str> {
Cow::Owned(String::from(b))
}
}
impl From<Cow<'static, [u8]>> for Body {
#[inline]
fn from(cow: Cow<'static, [u8]>) -> Body {
match cow {
Cow::Borrowed(b) => Body::from(b),
Cow::Owned(o) => Body::from(o),
}
}
}
impl From<Body> for Cow<'static, [u8]> {
#[inline]
fn from(b: Body) -> Self {
Cow::Owned(b.as_ref().to_owned())
}
}
impl From<Vec<u8>> for Body {
fn from(b: Vec<u8>) -> Self {
Body::Binary(b)
}
}
impl From<Body> for Vec<u8> {
fn from(b: Body) -> Self {
match b {
Body::Empty => "".as_bytes().to_owned(),
Body::Text(t) => t.into_bytes(),
Body::Binary(b) => b.to_owned(),
}
}
}
impl<'a> From<&'a [u8]> for Body {
fn from(b: &'a [u8]) -> Self {
Body::Binary(b.to_vec())
}
}
impl Deref for Body {
type Target = [u8];
#[inline]
fn deref(&self) -> &Self::Target {
self.as_ref()
}
}
impl AsRef<[u8]> for Body {
#[inline]
fn as_ref(&self) -> &[u8] {
match self {
Body::Empty => &[],
Body::Text(ref bytes) => bytes.as_ref(),
Body::Binary(ref bytes) => bytes.as_ref(),
}
}
}
impl<'a> Serialize for Body {
fn serialize<S>(&self, serializer: S) -> Result<S::Ok, S::Error>
where
S: Serializer,
{
match self {
Body::Text(data) => serializer
.serialize_str(::std::str::from_utf8(data.as_ref()).map_err(S::Error::custom)?),
Body::Binary(data) => {
serializer.collect_str(&Base64Display::with_config(data, base64::STANDARD))
}
Body::Empty => serializer.serialize_unit(),
}
}
}
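The deleted `body.rs` above centers on one pattern: a three-variant enum plus a family of `From` impls, so handler code can hand back plain strings, byte vectors, or unit and have the right variant inferred. A minimal self-contained sketch of that pattern (names mirror the removed source; the `Serialize` impl, `Cow` conversions, and base64 handling are omitted):

```rust
// Minimal sketch of the conversion pattern from the deleted body.rs.
#[derive(Debug, PartialEq)]
pub enum Body {
    Empty,
    Text(String),
    Binary(Vec<u8>),
}

impl From<&str> for Body {
    fn from(s: &str) -> Self {
        Body::Text(s.into())
    }
}

impl From<Vec<u8>> for Body {
    fn from(b: Vec<u8>) -> Self {
        Body::Binary(b)
    }
}

impl From<()> for Body {
    fn from(_: ()) -> Self {
        Body::Empty
    }
}

fn main() {
    // Text, binary, and empty inputs each infer the matching variant.
    assert_eq!(Body::from("text"), Body::Text("text".to_string()));
    assert_eq!(Body::from(vec![1u8, 2]), Body::Binary(vec![1, 2]));
    assert_eq!(Body::from(()), Body::Empty);
}
```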

src/error.rs (now_lambda, deleted)

@@ -1,44 +0,0 @@
use http;
use lambda_runtime::error::LambdaErrorExt;
use std::{error::Error, fmt};
/// This module implements a custom error currently over the AWS Lambda runtime,
/// which can be extended later to support more service providers.
#[derive(Debug)]
pub struct NowError {
msg: String,
}
impl NowError {
pub fn new(message: &str) -> NowError {
NowError {
msg: message.to_owned(),
}
}
}
impl fmt::Display for NowError {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
write!(f, "{}", self.msg)
}
}
impl Error for NowError {}
impl From<std::num::ParseIntError> for NowError {
fn from(i: std::num::ParseIntError) -> Self {
NowError::new(&format!("{}", i))
}
}
impl From<http::Error> for NowError {
fn from(i: http::Error) -> Self {
NowError::new(&format!("{}", i))
}
}
// the value returned by the error_type function is included as the
// `errorType` in the AWS Lambda response
impl LambdaErrorExt for NowError {
fn error_type(&self) -> &str {
"NowError"
}
}
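The point of the `From` impls in the deleted `error.rs` is that they let handler code use the `?` operator and have underlying errors converted into `NowError` automatically. A simplified, self-contained sketch of that flow (`parse_status` is an illustrative helper, not part of the original source):

```rust
use std::{error::Error, fmt};

// Simplified from the deleted error.rs: a string-carrying error type.
#[derive(Debug)]
pub struct NowError {
    msg: String,
}

impl NowError {
    pub fn new(message: &str) -> NowError {
        NowError { msg: message.to_owned() }
    }
}

impl fmt::Display for NowError {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        write!(f, "{}", self.msg)
    }
}

impl Error for NowError {}

impl From<std::num::ParseIntError> for NowError {
    fn from(i: std::num::ParseIntError) -> Self {
        NowError::new(&i.to_string())
    }
}

// `?` converts the ParseIntError into a NowError via the From impl.
fn parse_status(raw: &str) -> Result<u16, NowError> {
    Ok(raw.parse::<u16>()?)
}

fn main() {
    assert_eq!(parse_status("204").unwrap(), 204);
    assert!(parse_status("no").is_err());
}
```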

packages/now-rust/src/index.ts (deleted)

@@ -1,393 +0,0 @@
import fs from 'fs-extra';
import path from 'path';
import execa from 'execa';
import toml from '@iarna/toml';
import {
glob,
createLambda,
download,
FileRef,
FileFsRef,
runShellScript,
BuildOptions,
PrepareCacheOptions,
DownloadedFiles,
Lambda,
} from '@now/build-utils'; // eslint-disable-line import/no-extraneous-dependencies
interface PackageManifest {
targets: { kind: string; name: string }[];
}
interface CargoConfig {
env: Record<string, any>;
cwd: string;
}
interface CargoToml extends toml.JsonMap {
package: toml.JsonMap;
dependencies: toml.JsonMap;
}
const codegenFlags = [
'-C',
'target-cpu=ivybridge',
'-C',
'target-feature=-aes,-avx,+fxsr,-popcnt,+sse,+sse2,-sse3,-sse4.1,-sse4.2,-ssse3,-xsave,-xsaveopt',
];
async function inferCargoBinaries(config: CargoConfig) {
try {
const { stdout: manifestStr } = await execa(
'cargo',
['read-manifest'],
config
);
const { targets } = JSON.parse(manifestStr) as PackageManifest;
return targets
.filter(({ kind }) => kind.includes('bin'))
.map(({ name }) => name);
} catch (err) {
console.error('failed to run `cargo read-manifest`');
throw err;
}
}
async function parseTOMLStream(stream: NodeJS.ReadableStream) {
return toml.parse.stream(stream);
}
async function buildWholeProject(
{ entrypoint, config }: BuildOptions,
downloadedFiles: DownloadedFiles,
extraFiles: DownloadedFiles,
rustEnv: Record<string, string>
) {
const entrypointDirname = path.dirname(downloadedFiles[entrypoint].fsPath);
const { debug } = config;
console.log('running `cargo build`...');
try {
await execa(
'cargo',
['build', '--verbose'].concat(debug ? [] : ['--release']),
{
env: rustEnv,
cwd: entrypointDirname,
stdio: 'inherit',
}
);
} catch (err) {
console.error('failed to `cargo build`');
throw err;
}
const targetPath = path.join(
entrypointDirname,
'target',
debug ? 'debug' : 'release'
);
const binaries = await inferCargoBinaries({
env: rustEnv,
cwd: entrypointDirname,
});
const lambdas: Record<string, Lambda> = {};
const lambdaPath = path.dirname(entrypoint);
await Promise.all(
binaries.map(async binary => {
const fsPath = path.join(targetPath, binary);
const lambda = await createLambda({
files: {
...extraFiles,
bootstrap: new FileFsRef({ mode: 0o755, fsPath }),
},
handler: 'bootstrap',
runtime: 'provided',
});
lambdas[path.join(lambdaPath, binary)] = lambda;
})
);
return lambdas;
}
async function gatherExtraFiles(
globMatcher: string | string[] | undefined,
entrypoint: string
) {
if (!globMatcher) return {};
console.log('gathering extra files for the fs...');
const entryDir = path.dirname(entrypoint);
if (Array.isArray(globMatcher)) {
const allMatches = await Promise.all(
globMatcher.map(pattern => glob(pattern, entryDir))
);
return allMatches.reduce((acc, matches) => ({ ...acc, ...matches }), {});
}
return glob(globMatcher, entryDir);
}
async function runUserScripts(entrypoint: string) {
const entryDir = path.dirname(entrypoint);
const buildScriptPath = path.join(entryDir, 'build.sh');
const buildScriptExists = await fs.pathExists(buildScriptPath);
if (buildScriptExists) {
console.log('running `build.sh`...');
await runShellScript(buildScriptPath);
}
}
async function cargoLocateProject(config: CargoConfig) {
try {
const { stdout: projectDescriptionStr } = await execa(
'cargo',
['locate-project'],
config
);
const projectDescription = JSON.parse(projectDescriptionStr);
if (projectDescription != null && projectDescription.root != null) {
return projectDescription.root;
}
} catch (e) {
if (!/could not find/g.test(e.stderr)) {
console.error("Couldn't run `cargo locate-project`");
throw e;
}
}
return null;
}
async function buildSingleFile(
{ workPath, entrypoint, config }: BuildOptions,
downloadedFiles: DownloadedFiles,
extraFiles: DownloadedFiles,
rustEnv: Record<string, string>
) {
console.log('building single file');
const launcherPath = path.join(__dirname, '..', 'launcher.rs');
let launcherData = await fs.readFile(launcherPath, 'utf8');
const entrypointPath = downloadedFiles[entrypoint].fsPath;
const entrypointDirname = path.dirname(entrypointPath);
launcherData = launcherData.replace(
'// PLACEHOLDER',
await fs.readFile(path.join(workPath, entrypoint), 'utf8')
);
// replace the entrypoint with one that includes the imports + lambda.start
await fs.remove(entrypointPath);
await fs.writeFile(entrypointPath, launcherData);
// Find a Cargo.toml file or TODO: create one
const cargoTomlFile = await cargoLocateProject({
env: rustEnv,
cwd: entrypointDirname,
});
// TODO: we're assuming there's a Cargo.toml file. We need to create one
// otherwise
let cargoToml: CargoToml;
try {
cargoToml = (await parseTOMLStream(
fs.createReadStream(cargoTomlFile)
)) as CargoToml;
} catch (err) {
console.error('Failed to parse TOML from entrypoint:', entrypoint);
throw err;
}
const binName = path
.basename(entrypointPath)
.replace(path.extname(entrypointPath), '');
const { package: pkg, dependencies } = cargoToml;
// default to latest now_lambda
dependencies.now_lambda = '*';
const tomlToWrite = toml.stringify({
package: pkg,
dependencies,
bin: [
{
name: binName,
path: entrypointPath,
},
],
});
console.log('toml to write:', tomlToWrite);
// Overwrite the Cargo.toml file with one that includes the `now_lambda`
// dependency and our binary. `dependencies` is a map so we don't run the
// risk of having 2 `now_lambda`s in there.
await fs.writeFile(cargoTomlFile, tomlToWrite);
const { debug } = config;
console.log('running `cargo build`...');
try {
await execa(
'cargo',
['build', '--bin', binName, '--verbose'].concat(
debug ? [] : ['--release']
),
{
env: rustEnv,
cwd: entrypointDirname,
stdio: 'inherit',
}
);
} catch (err) {
console.error('failed to `cargo build`');
throw err;
}
const bin = path.join(
path.dirname(cargoTomlFile),
'target',
debug ? 'debug' : 'release',
binName
);
const lambda = await createLambda({
files: {
...extraFiles,
bootstrap: new FileFsRef({ mode: 0o755, fsPath: bin }),
},
handler: 'bootstrap',
runtime: 'provided',
});
return {
[entrypoint]: lambda,
};
}
export async function build(opts: BuildOptions) {
const { files, entrypoint, workPath, config, meta = {} } = opts;
console.log('downloading files');
const downloadedFiles = await download(files, workPath, meta);
const entryPath = downloadedFiles[entrypoint].fsPath;
const { PATH, HOME } = process.env;
const rustEnv: Record<string, string> = {
...process.env,
PATH: `${path.join(HOME!, '.cargo/bin')}:${PATH}`,
RUSTFLAGS: [process.env.RUSTFLAGS, ...codegenFlags]
.filter(Boolean)
.join(' '),
};
await runUserScripts(entryPath);
const extraFiles = await gatherExtraFiles(config.includeFiles, entryPath);
if (path.extname(entrypoint) === '.toml') {
return buildWholeProject(opts, downloadedFiles, extraFiles, rustEnv);
}
return buildSingleFile(opts, downloadedFiles, extraFiles, rustEnv);
}
export async function prepareCache({
cachePath,
entrypoint,
workPath,
}: PrepareCacheOptions) {
console.log('preparing cache...');
let targetFolderDir: string;
if (path.extname(entrypoint) === '.toml') {
targetFolderDir = path.dirname(path.join(workPath, entrypoint));
} else {
const { PATH, HOME } = process.env;
const rustEnv: Record<string, string> = {
...process.env,
PATH: `${path.join(HOME!, '.cargo/bin')}:${PATH}`,
RUSTFLAGS: [process.env.RUSTFLAGS, ...codegenFlags]
.filter(Boolean)
.join(' '),
};
const entrypointDirname = path.dirname(path.join(workPath, entrypoint));
const cargoTomlFile = await cargoLocateProject({
env: rustEnv,
cwd: entrypointDirname,
});
if (cargoTomlFile != null) {
targetFolderDir = path.dirname(cargoTomlFile);
} else {
// `Cargo.toml` doesn't exist, in `build` we put it in the same
// path as the entrypoint.
targetFolderDir = path.dirname(path.join(workPath, entrypoint));
}
}
const cacheEntrypointDirname = path.join(
cachePath,
path.relative(workPath, targetFolderDir)
);
// Remove the target folder to avoid 'directory already exists' errors
fs.removeSync(path.join(cacheEntrypointDirname, 'target'));
fs.mkdirpSync(cacheEntrypointDirname);
// Move the target folder to the cache location
fs.renameSync(
path.join(targetFolderDir, 'target'),
path.join(cacheEntrypointDirname, 'target')
);
const cacheFiles = await glob('**/**', cachePath);
// eslint-disable-next-line no-restricted-syntax
for (const f of Object.keys(cacheFiles)) {
const accept =
/(?:^|\/)target\/release\/\.fingerprint\//.test(f) ||
/(?:^|\/)target\/release\/build\//.test(f) ||
/(?:^|\/)target\/release\/deps\//.test(f) ||
/(?:^|\/)target\/debug\/\.fingerprint\//.test(f) ||
/(?:^|\/)target\/debug\/build\//.test(f) ||
/(?:^|\/)target\/debug\/deps\//.test(f);
if (!accept) {
delete cacheFiles[f];
}
}
return cacheFiles;
}
function findCargoToml(
files: BuildOptions['files'],
entrypoint: BuildOptions['entrypoint']
) {
let currentPath = path.dirname(entrypoint);
let cargoTomlPath;
// eslint-disable-next-line no-constant-condition
while (true) {
cargoTomlPath = path.join(currentPath, 'Cargo.toml');
if (files[cargoTomlPath]) break;
const newPath = path.dirname(currentPath);
if (currentPath === newPath) break;
currentPath = newPath;
}
return cargoTomlPath;
}
export const getDefaultCache = ({ files, entrypoint }: BuildOptions) => {
const cargoTomlPath = findCargoToml(files, entrypoint);
if (!cargoTomlPath) return undefined;
const targetFolderDir = path.dirname(cargoTomlPath);
const defaultCacheRef = new FileRef({
digest:
'sha:204e0c840c43473bbd130d7bc704fe5588b4eab43cda9bc940f10b2a0ae14b16',
});
return { [targetFolderDir]: defaultCacheRef };
};
export { shouldServe } from '@now/build-utils';
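The `findCargoToml` helper in the deleted builder walks parent directories from the entrypoint until it finds a `Cargo.toml` in the project's file map or reaches the root. A sketch of that walk, ported to Rust for illustration (the original is TypeScript; the fixture paths below are made up):

```rust
use std::collections::HashSet;
use std::path::{Path, PathBuf};

// Climb parent directories from the entrypoint until Cargo.toml is found
// in the known file set, or the filesystem root is reached.
fn find_cargo_toml(files: &HashSet<PathBuf>, entrypoint: &Path) -> Option<PathBuf> {
    let mut current = entrypoint.parent().unwrap_or(Path::new("")).to_path_buf();
    loop {
        let candidate = current.join("Cargo.toml");
        if files.contains(&candidate) {
            return Some(candidate);
        }
        match current.parent() {
            // Stop once the parent no longer changes (we hit the root).
            Some(p) if p != current => current = p.to_path_buf(),
            _ => return None,
        }
    }
}

fn main() {
    let files: HashSet<PathBuf> = ["api/Cargo.toml", "api/src/main.rs"]
        .iter()
        .map(PathBuf::from)
        .collect();
    // Found one level up from the entrypoint's directory.
    assert_eq!(
        find_cargo_toml(&files, Path::new("api/src/main.rs")),
        Some(PathBuf::from("api/Cargo.toml"))
    );
    // No manifest anywhere on the walk: None.
    assert_eq!(find_cargo_toml(&files, Path::new("other/main.rs")), None);
}
```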

packages/now-rust/src/install-rust.ts (deleted)

@@ -1,40 +0,0 @@
import execa from 'execa';
async function downloadRustToolchain(version: string = 'stable') {
console.log('downloading the rust toolchain');
try {
await execa.shell(
`curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh -s -- -y --default-toolchain ${version}`,
{ stdio: 'inherit' }
);
} catch (err) {
throw new Error(`Failed to install rust via rustup: ${err.message}`);
}
}
async function installOpenSSL() {
console.log('installing openssl-devel...');
try {
// need to downgrade otherwise yum can't resolve the dependencies given
// a later version is already installed in the machine.
await execa(
'yum',
['downgrade', '-y', 'krb5-libs-1.14.1-27.41.amzn1.x86_64'],
{
stdio: 'inherit',
}
);
await execa('yum', ['install', '-y', 'openssl-devel'], {
stdio: 'inherit',
});
} catch (err) {
console.error('failed to `yum install -y openssl-devel`');
throw err;
}
}
export default async (version?: string) => {
await downloadRustToolchain(version);
await installOpenSSL();
};

packages/now-rust/src/install.ts (deleted)

@@ -1,6 +0,0 @@
import installRust from './install-rust';
installRust().catch(err => {
console.error(err);
process.exit(1);
});

src/lib.rs (now_lambda, deleted)

@@ -1,91 +0,0 @@
pub use http::{self, Response};
use lambda_runtime::{self as lambda, Context};
use log::{self, debug, error};
use serde_json::Error;
use tokio::runtime::Runtime as TokioRuntime;
mod body;
pub mod error;
pub mod request;
mod response;
mod strmap;
pub use crate::{body::Body, response::IntoResponse, strmap::StrMap};
use crate::{
error::NowError,
request::{NowEvent, NowRequest},
response::NowResponse,
};
/// Type alias for `http::Request`s with a fixed `now_lambda::Body` body
pub type Request = http::Request<Body>;
/// Functions acting as Now Lambda handlers must conform to this type.
pub trait Handler<R, B, E> {
/// Method to execute the handler function
fn run(&mut self, event: http::Request<B>) -> Result<R, E>;
}
impl<Function, R, B, E> Handler<R, B, E> for Function
where
Function: FnMut(http::Request<B>) -> Result<R, E>,
{
fn run(&mut self, event: http::Request<B>) -> Result<R, E> {
(*self)(event)
}
}
/// Creates a new `lambda_runtime::Runtime` and begins polling for Now Lambda events
///
/// # Arguments
///
/// * `f` A type that conforms to the `Handler` interface.
///
/// # Panics
/// The function panics if the Lambda environment variables are not set.
pub fn start<R, B, E>(f: impl Handler<R, B, E>, runtime: Option<TokioRuntime>)
where
B: From<Body>,
E: Into<NowError>,
R: IntoResponse,
{
// handler requires a mutable ref
let mut func = f;
lambda::start(
|e: NowEvent, _ctx: Context| {
let req_str = e.body;
let parse_result: Result<NowRequest, Error> = serde_json::from_str(&req_str);
match parse_result {
Ok(req) => {
debug!("Deserialized Now proxy request successfully");
let request: http::Request<Body> = req.into();
func.run(request.map(|b| b.into()))
.map(|resp| NowResponse::from(resp.into_response()))
.map_err(|e| e.into())
}
Err(e) => {
error!("Could not deserialize event body to NowRequest {}", e);
panic!("Could not deserialize event body to NowRequest {}", e);
}
}
},
runtime,
)
}
/// A macro for starting a new handler's polling for Now Lambda events
#[macro_export]
macro_rules! lambda {
($handler:expr) => {
$crate::start($handler, None)
};
($handler:expr, $runtime:expr) => {
$crate::start($handler, Some($runtime))
};
($handler:ident) => {
$crate::start($handler, None)
};
($handler:ident, $runtime:expr) => {
$crate::start($handler, Some($runtime))
};
}
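The blanket impl in the deleted `lib.rs` is what lets a bare closure serve as a handler: any `FnMut` from a request to a `Result` satisfies `Handler` with no explicit impl. A standalone sketch of that mechanism, with the `http` types replaced by plain `String`s so it compiles without external crates:

```rust
// Sketch of the Handler blanket impl from the deleted lib.rs,
// simplified to String in place of http::Request.
trait Handler<R, E> {
    fn run(&mut self, event: String) -> Result<R, E>;
}

// Any FnMut(String) -> Result<R, E> is automatically a Handler.
impl<F, R, E> Handler<R, E> for F
where
    F: FnMut(String) -> Result<R, E>,
{
    fn run(&mut self, event: String) -> Result<R, E> {
        (*self)(event)
    }
}

// Stand-in for lambda::start: invoke the handler once on an event.
fn start<R, E>(mut f: impl Handler<R, E>, event: &str) -> Option<R> {
    f.run(event.to_string()).ok()
}

fn main() {
    // A plain closure works as a handler with no explicit impl.
    let handler = |req: String| -> Result<String, ()> { Ok(format!("echo: {}", req)) };
    assert_eq!(start(handler, "hi"), Some("echo: hi".to_string()));
}
```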

src/request.rs (now_lambda, deleted)

@@ -1,122 +0,0 @@
use std::{borrow::Cow, fmt, mem};
use http::{self, header::HeaderValue, HeaderMap, Method, Request as HttpRequest};
use serde::de::{Deserializer, Error as DeError, MapAccess, Visitor};
use serde_derive::Deserialize;
use crate::body::Body;
/// Representation of a Now Lambda proxy event data
#[doc(hidden)]
#[derive(Deserialize, Debug, Default)]
#[serde(rename_all = "camelCase")]
pub(crate) struct NowRequest<'a> {
pub(crate) host: Cow<'a, str>,
pub(crate) path: Cow<'a, str>,
#[serde(deserialize_with = "deserialize_method")]
pub(crate) method: Method,
#[serde(deserialize_with = "deserialize_headers")]
pub(crate) headers: HeaderMap<HeaderValue>,
pub(crate) body: Option<Cow<'a, str>>,
pub(crate) encoding: Option<String>,
}
#[doc(hidden)]
#[derive(Deserialize, Debug, Default)]
pub(crate) struct NowEvent<'a> {
#[serde(rename = "Action")]
action: Cow<'a, str>,
pub(crate) body: Cow<'a, str>,
}
fn deserialize_method<'de, D>(deserializer: D) -> Result<Method, D::Error>
where
D: Deserializer<'de>,
{
struct MethodVisitor;
impl<'de> Visitor<'de> for MethodVisitor {
type Value = Method;
fn expecting(&self, formatter: &mut fmt::Formatter<'_>) -> fmt::Result {
write!(formatter, "a Method")
}
fn visit_str<E>(self, v: &str) -> Result<Self::Value, E>
where
E: DeError,
{
v.parse().map_err(E::custom)
}
}
deserializer.deserialize_str(MethodVisitor)
}
fn deserialize_headers<'de, D>(deserializer: D) -> Result<HeaderMap<HeaderValue>, D::Error>
where
D: Deserializer<'de>,
{
struct HeaderVisitor;
impl<'de> Visitor<'de> for HeaderVisitor {
type Value = HeaderMap<HeaderValue>;
fn expecting(&self, formatter: &mut fmt::Formatter<'_>) -> fmt::Result {
write!(formatter, "a HeaderMap<HeaderValue>")
}
fn visit_map<A>(self, mut map: A) -> Result<Self::Value, A::Error>
where
A: MapAccess<'de>,
{
let mut headers = http::HeaderMap::new();
while let Some((key, value)) = map.next_entry::<Cow<'_, str>, Cow<'_, str>>()? {
let header_name = key
.parse::<http::header::HeaderName>()
.map_err(A::Error::custom)?;
let header_value =
http::header::HeaderValue::from_shared(value.into_owned().into())
.map_err(A::Error::custom)?;
headers.append(header_name, header_value);
}
Ok(headers)
}
}
deserializer.deserialize_map(HeaderVisitor)
}
impl<'a> From<NowRequest<'a>> for HttpRequest<Body> {
fn from(value: NowRequest<'_>) -> Self {
let NowRequest {
host,
path,
method,
headers,
body,
encoding,
} = value;
// build an http::Request<now_lambda::Body> from a now_lambda::NowRequest
let mut builder = HttpRequest::builder();
builder.method(method);
builder.uri({ format!("https://{}{}", host, path) });
let mut req = builder
.body(match (body, encoding) {
(Some(ref b), Some(ref encoding)) if encoding == "base64" => {
// todo: document failure behavior
Body::from(::base64::decode(b.as_ref()).unwrap_or_default())
}
(Some(b), _) => Body::from(b.into_owned()),
_ => Body::from(()),
})
.expect("failed to build request");
// no builder method that sets headers in batch
mem::replace(req.headers_mut(), headers);
req
}
}

src/response.rs (now_lambda, deleted)

@@ -1,122 +0,0 @@
//! Response types
use http::{
header::{HeaderMap, HeaderValue},
Response,
};
use serde::ser::{Error as SerError, SerializeMap, Serializer};
use serde_derive::Serialize;
use crate::body::Body;
/// Representation of a Now Lambda response
#[derive(Serialize, Debug)]
#[serde(rename_all = "camelCase")]
pub(crate) struct NowResponse {
pub status_code: u16,
#[serde(
skip_serializing_if = "HeaderMap::is_empty",
serialize_with = "serialize_headers"
)]
pub headers: HeaderMap<HeaderValue>,
#[serde(skip_serializing_if = "Option::is_none")]
pub body: Option<Body>,
#[serde(skip_serializing_if = "Option::is_none")]
pub encoding: Option<String>,
}
impl Default for NowResponse {
fn default() -> Self {
Self {
status_code: 200,
headers: Default::default(),
body: Default::default(),
encoding: Default::default(),
}
}
}
fn serialize_headers<S>(headers: &HeaderMap<HeaderValue>, serializer: S) -> Result<S::Ok, S::Error>
where
S: Serializer,
{
let mut map = serializer.serialize_map(Some(headers.keys_len()))?;
for key in headers.keys() {
let map_value = headers[key].to_str().map_err(S::Error::custom)?;
map.serialize_entry(key.as_str(), map_value)?;
}
map.end()
}
impl<T> From<Response<T>> for NowResponse
where
T: Into<Body>,
{
fn from(value: Response<T>) -> Self {
let (parts, bod) = value.into_parts();
let (encoding, body) = match bod.into() {
Body::Empty => (None, None),
b @ Body::Text(_) => (None, Some(b)),
b @ Body::Binary(_) => (Some("base64".to_string()), Some(b)),
};
NowResponse {
status_code: parts.status.as_u16(),
body,
headers: parts.headers,
encoding,
}
}
}
/// A conversion of self into a `Response`
///
/// Implementations for `Response<B> where B: Into<Body>`,
/// `B where B: Into<Body>` and `serde_json::Value` are provided
/// by default
///
/// # example
///
/// ```rust
/// use now_lambda::{Body, IntoResponse, Response};
///
/// assert_eq!(
/// "hello".into_response().body(),
/// Response::new(Body::from("hello")).body()
/// );
/// ```
pub trait IntoResponse {
/// Return a translation of `self` into a `Response<Body>`
fn into_response(self) -> Response<Body>;
}
impl<B> IntoResponse for Response<B>
where
B: Into<Body>,
{
fn into_response(self) -> Response<Body> {
let (parts, body) = self.into_parts();
Response::from_parts(parts, body.into())
}
}
impl<B> IntoResponse for B
where
B: Into<Body>,
{
fn into_response(self) -> Response<Body> {
Response::new(self.into())
}
}
impl IntoResponse for serde_json::Value {
fn into_response(self) -> Response<Body> {
Response::builder()
.header(http::header::CONTENT_TYPE, "application/json")
.body(
serde_json::to_string(&self)
.expect("unable to serialize serde_json::Value")
.into(),
)
.expect("unable to build http::Response")
}
}
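The `From<Response<T>>` impl above makes one decision worth isolating: empty bodies serialize to nothing, text passes through untagged, and binary payloads get a `"base64"` encoding marker so the proxy knows to decode them. A self-contained sketch of just that match (the `split` helper is illustrative, not from the original source):

```rust
// Sketch of the body/encoding split from the deleted response.rs.
#[derive(Debug, PartialEq)]
enum Body {
    Empty,
    Text(String),
    Binary(Vec<u8>),
}

// Returns (encoding, body) the way NowResponse would serialize them.
fn split(body: Body) -> (Option<String>, Option<Body>) {
    match body {
        Body::Empty => (None, None),
        b @ Body::Text(_) => (None, Some(b)),
        b @ Body::Binary(_) => (Some("base64".to_string()), Some(b)),
    }
}

fn main() {
    // Empty bodies are skipped entirely.
    assert_eq!(split(Body::Empty), (None, None));
    // Text passes through with no encoding tag.
    assert_eq!(
        split(Body::Text("hi".into())),
        (None, Some(Body::Text("hi".into())))
    );
    // Binary payloads are tagged "base64".
    let (encoding, _) = split(Body::Binary(vec![0xff]));
    assert_eq!(encoding.as_deref(), Some("base64"));
}
```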

src/strmap.rs (now_lambda, deleted)

@@ -1,90 +0,0 @@
use std::{
collections::{hash_map::Keys, HashMap},
fmt,
sync::Arc,
};
use serde::de::{Deserialize, Deserializer, MapAccess, Visitor};
/// A read-only view into a map of string data
#[derive(Default, Debug, PartialEq)]
pub struct StrMap(pub(crate) Arc<HashMap<String, String>>);
impl StrMap {
/// Return a named value where available
pub fn get(&self, key: &str) -> Option<&str> {
self.0.get(key).map(|value| value.as_ref())
}
/// Return true if the underlying map is empty
pub fn is_empty(&self) -> bool {
self.0.is_empty()
}
/// Return an iterator over keys and values
pub fn iter(&self) -> StrMapIter<'_> {
StrMapIter {
data: self,
keys: self.0.keys(),
}
}
}
impl Clone for StrMap {
fn clone(&self) -> Self {
// only clone the inner data
StrMap(self.0.clone())
}
}
impl From<HashMap<String, String>> for StrMap {
fn from(inner: HashMap<String, String>) -> Self {
StrMap(Arc::new(inner))
}
}
/// A read only reference to `StrMap` key and value slice pairings
pub struct StrMapIter<'a> {
data: &'a StrMap,
keys: Keys<'a, String, String>,
}
impl<'a> Iterator for StrMapIter<'a> {
type Item = (&'a str, &'a str);
#[inline]
fn next(&mut self) -> Option<(&'a str, &'a str)> {
self.keys
.next()
.and_then(|k| self.data.get(k).map(|v| (k.as_str(), v)))
}
}
impl<'de> Deserialize<'de> for StrMap {
fn deserialize<D>(deserializer: D) -> Result<StrMap, D::Error>
where
D: Deserializer<'de>,
{
struct StrMapVisitor;
impl<'de> Visitor<'de> for StrMapVisitor {
type Value = StrMap;
fn expecting(&self, formatter: &mut fmt::Formatter<'_>) -> fmt::Result {
write!(formatter, "a StrMap")
}
fn visit_map<A>(self, mut map: A) -> Result<Self::Value, A::Error>
where
A: MapAccess<'de>,
{
let mut inner = HashMap::new();
while let Some((key, value)) = map.next_entry()? {
inner.insert(key, value);
}
Ok(StrMap(Arc::new(inner)))
}
}
deserializer.deserialize_map(StrMapVisitor)
}
}
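The design point of `StrMap` is the `Arc` wrapper: cloning the view only bumps a reference count instead of deep-copying the string data, which is why the source's manual `Clone` impl notes it clones "only the inner data". A minimal sketch of that property:

```rust
use std::collections::HashMap;
use std::sync::Arc;

// Sketch of the StrMap design: an Arc-wrapped map makes clones O(1).
#[derive(Debug, Clone)]
struct StrMap(Arc<HashMap<String, String>>);

impl StrMap {
    fn get(&self, key: &str) -> Option<&str> {
        self.0.get(key).map(|v| v.as_str())
    }
}

fn main() {
    let mut inner = HashMap::new();
    inner.insert("host".to_string(), "example.com".to_string());
    let map = StrMap(Arc::new(inner));

    let view = map.clone(); // bumps the Arc count; no HashMap copy
    assert_eq!(view.get("host"), Some("example.com"));
    // Both handles share the same underlying allocation.
    assert_eq!(Arc::strong_count(&map.0), 2);
}
```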

File diff suppressed because it is too large.

Cargo.toml (test fixture, deleted)

@@ -1,9 +0,0 @@
[package]
name = "index"
version = "1.0.0"
authors = ["Mike Engel <mike@mike-engel.com>"]
edition = "2018"
[dependencies]
http = "0.1"
now_lambda = "0.1"

now.json (test fixture, deleted)

@@ -1,18 +0,0 @@
{
"version": 2,
"builds": [
{
"src": "Cargo.toml",
"use": "@now/rust",
"config": {
"includeFiles": ["static/**/*.txt", "static/**/*.svg"]
}
}
],
"probes": [
{
"path": "/",
"mustContain": "Include me in the lambda fs!"
}
]
}

src/main.rs (test fixture, deleted)

@@ -1,19 +0,0 @@
use http::{StatusCode};
use now_lambda::{error::NowError, lambda, IntoResponse, Request, Response};
use std::error::Error;
use std::fs::read_to_string;
fn handler(_: Request) -> Result<impl IntoResponse, NowError> {
let text = read_to_string("./static/sample.txt").unwrap();
let response = Response::builder()
.status(StatusCode::OK)
.header("Content-Type", "text/plain")
.body(text)
.expect("Internal Server Error");
Ok(response)
}
fn main() -> Result<(), Box<dyn Error>> {
Ok(lambda!(handler))
}

static/sample.txt (test fixture, deleted)

@@ -1 +0,0 @@
this is just an extra file!

Some files were not shown because too many files have changed in this diff.