Mirror of https://github.com/LukeHagar/vercel.git (synced 2025-12-12 04:22:14 +00:00)

Compare commits — 67 commits
@now/php@0 ... @now/pytho
| Author | SHA1 | Date |
|---|---|---|
|  | 183b117152 |  |
|  | 75b3fb4981 |  |
|  | 49e63de5fe |  |
|  | 4742cd32f2 |  |
|  | 377b73105d |  |
|  | a5577efb3d |  |
|  | 2ec46dc5c9 |  |
|  | 42708ed93c |  |
|  | 2fabe95f6e |  |
|  | ac1a3dab22 |  |
|  | ad4011512d |  |
|  | 9ff1a25c8f |  |
|  | 8039b3d377 |  |
|  | dd9017475c |  |
|  | 031499014f |  |
|  | 2a68d2a2ad |  |
|  | 31299fae6e |  |
|  | 4bac0db379 |  |
|  | 95e7d459d3 |  |
|  | dd120b8d20 |  |
|  | b6975676e5 |  |
|  | a7951dae81 |  |
|  | b0c918f7fb |  |
|  | df54dc7dc9 |  |
|  | 0dd801ff6c |  |
|  | 398743ef95 |  |
|  | 337c74b81b |  |
|  | 680bb82ec3 |  |
|  | 17ed5411e3 |  |
|  | d9bbcb6939 |  |
|  | 800e4de76f |  |
|  | 864dd468d9 |  |
|  | ba833871bb |  |
|  | e732bac78e |  |
|  | 28ea4015b4 |  |
|  | a93d97cabd |  |
|  | 67f39f7c9b |  |
|  | acd793b9e9 |  |
|  | f74d61279d |  |
|  | fcb8eacec0 |  |
|  | c8fca2ba72 |  |
|  | 4feffa13eb |  |
|  | 3e330b25f4 |  |
|  | 9b2cae33af |  |
|  | 4b6371530c |  |
|  | 9e1d577fc0 |  |
|  | cf2f542c71 |  |
|  | e608861e4e |  |
|  | a99b999209 |  |
|  | fd9c6e7847 |  |
|  | b2ad3a6147 |  |
|  | 997d3c2a30 |  |
|  | ca575bf0a6 |  |
|  | 4c2e93ccef |  |
|  | 4d6437d235 |  |
|  | 0d8058d062 |  |
|  | 2b5cdfc0a7 |  |
|  | 69a41f78fb |  |
|  | a013d59d62 |  |
|  | 173a29cfdb |  |
|  | 3f73451311 |  |
|  | 2fc706be43 |  |
|  | 0fb7eb6093 |  |
|  | aa43c0bc87 |  |
|  | 3c5925a6e3 |  |
|  | 9fc7b047f5 |  |
|  | ecae29457f |  |
@@ -1,3 +1,4 @@
 /tmp/*
 /node_modules/*
 /**/node_modules/*
+/packages/now-go/go/*
2 .gitignore (vendored)
@@ -1,2 +1,4 @@
 node_modules
 tmp
+target/
+.next
@@ -2,7 +2,7 @@
 
 This is the full list of official Builders provided by the ZEIT team.
 
-More details here: http://zeit.co/docs
+More details here: https://zeit.co/docs/v2/deployments/builders/overview/
 
 ### Publishing to npm
 
73 errors/now-next-legacy-mode.md (new file)
@@ -0,0 +1,73 @@
# `@now/next` Legacy Mode

#### Why This Warning Occurred

`@now/next` has two modes: `legacy` and `serverless`. You will always want to use the `serverless` mode; `legacy` exists only to provide backwards compatibility with previous `@now/next` versions.

The differences:

Legacy:

- Minimal lambda size of roughly `2.2MB`
- Forces `next@v7.0.2-canary.49` and `next-server@v7.0.2-canary.49`
- Forces all `dependencies` to become `devDependencies`
- Loads `next.config.js` on bootup, which sometimes breaks when users didn't use `phases` to load files (see the sketch after these lists)
- Uses `next-server`, which is the full Next.js server with routing etc.
- Runs `npm install`
- Runs `npm run now-build`
- Runs `npm install --production` after the build

Serverless:

- Minimal lambda size of roughly `49KB`
- Uses the Next.js build target `target: 'serverless'` in `next.config.js` ([documentation](https://github.com/zeit/next.js#summary))
- Does not make changes to your application dependencies
- Does not load `next.config.js` ([as per the serverless target documentation](https://github.com/zeit/next.js#summary))
- Runs `npm install`
- Runs `npm run now-build`
- Does not run `npm install --production`, as the output from the build is all that's needed to bundle the lambdas
- No runtime dependencies, meaning smaller lambda functions
- Optimized for fast [cold starts](https://zeit.co/blog/serverless-ssr#cold-start)
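
The `phases` bullet above refers to the function form of `next.config.js`, where the exported function receives the current build phase from Next.js. A minimal sketch of that pattern, for illustration only; the phase constant comes from `next/constants` and the option values are examples, not part of this repository:

```js
// next.config.js — phase-aware configuration (illustrative sketch)
const { PHASE_DEVELOPMENT_SERVER } = require('next/constants');

module.exports = (phase) => {
  if (phase === PHASE_DEVELOPMENT_SERVER) {
    // Development-only settings live behind the phase check, so a
    // production build never depends on files that only exist locally.
    return { distDir: '.next-dev' };
  }
  return { target: 'serverless' };
};
```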

#### Possible Ways to Fix It

In order to create the smallest possible lambdas, Next.js has to be configured to build for the `serverless` target.

1. Serverless Next.js requires Next.js 8 or later. To upgrade, install the `latest` version:

```
npm install next --save
```

2. Add the `now-build` script to your `package.json`:

```json
{
  "scripts": {
    "now-build": "next build"
  }
}
```

3. Add `target: 'serverless'` to `next.config.js`:

```js
module.exports = {
  target: 'serverless'
  // Other options are still valid
}
```

4. Optionally, make sure the `"src"` in `"builds"` points to your application's `package.json`:

```json
{
  "version": 2,
  "builds": [{ "src": "package.json", "use": "@now/next" }]
}
```

### Useful Links

- [Serverless target implementation](https://github.com/zeit/now-builders/pull/150)
43 errors/now-next-no-serverless-pages-built.md (new file)
@@ -0,0 +1,43 @@
# `@now/next` No Serverless Pages Built

#### Why This Error Occurred

This error occurs when your application is not configured for Serverless Next.js build output.

#### Possible Ways to Fix It

In order to create the smallest possible lambdas, Next.js has to be configured to build for the `serverless` target.

1. Serverless Next.js requires Next.js 8 or later. To upgrade, install the `latest` version:

```
npm install next --save
```

2. Add the `now-build` script to your `package.json`:

```json
{
  "scripts": {
    "now-build": "next build"
  }
}
```

3. Add `target: 'serverless'` to `next.config.js`:

```js
module.exports = {
  target: 'serverless'
  // Other options are still valid
}
```

4. Optionally, make sure the `"src"` in `"builds"` points to your application's `package.json`:

```json
{
  "version": 2,
  "builds": [{ "src": "package.json", "use": "@now/next" }]
}
```
19 lerna.json
@@ -2,24 +2,7 @@
   "npmClient": "yarn",
   "useWorkspaces": true,
   "packages": [
-    "packages/now-build-utils",
-    "packages/now-node-bridge",
-    "packages/now-php-bridge",
-    "packages/now-bash",
-    "packages/now-cgi",
-    "packages/now-go",
-    "packages/now-html-minifier",
-    "packages/now-lambda",
-    "packages/now-md",
-    "packages/now-mdx-deck",
-    "packages/now-next",
-    "packages/now-node",
-    "packages/now-node-server",
-    "packages/now-optipng",
-    "packages/now-php",
-    "packages/now-python",
-    "packages/now-static-build",
-    "packages/now-wordpress"
+    "packages/*"
   ],
   "command": {
     "publish": {
@@ -15,7 +15,7 @@
     "publish-stable": "lerna version",
     "publish-canary": "lerna version prerelease --preid canary",
     "lint": "tsc && eslint .",
-    "test": "jest --runInBand",
+    "test": "jest --runInBand --verbose",
     "lint-staged": "lint-staged"
   },
   "pre-commit": "lint-staged",
@@ -1,10 +1,10 @@
 const execa = require('execa');
 const { join } = require('path');
 const snakeCase = require('snake-case');
-const glob = require('@now/build-utils/fs/glob');
-const download = require('@now/build-utils/fs/download');
-const { createLambda } = require('@now/build-utils/lambda');
-const getWritableDirectory = require('@now/build-utils/fs/get-writable-directory');
+const glob = require('@now/build-utils/fs/glob'); // eslint-disable-line import/no-extraneous-dependencies
+const download = require('@now/build-utils/fs/download'); // eslint-disable-line import/no-extraneous-dependencies
+const { createLambda } = require('@now/build-utils/lambda'); // eslint-disable-line import/no-extraneous-dependencies
+const getWritableDirectory = require('@now/build-utils/fs/get-writable-directory'); // eslint-disable-line import/no-extraneous-dependencies
 
 exports.config = {
   maxLambdaSize: '10mb',
@@ -1,10 +1,15 @@
 {
   "name": "@now/bash",
-  "version": "0.1.1",
+  "version": "0.1.2",
   "description": "Now 2.0 builder for HTTP endpoints written in Bash",
   "main": "index.js",
   "author": "Nathan Rajlich <nate@zeit.co>",
   "license": "MIT",
+  "repository": {
+    "type": "git",
+    "url": "https://github.com/zeit/now-builders.git",
+    "directory": "packages/now-bash"
+  },
   "files": [
     "builder.sh",
     "runtime.sh",
@@ -15,8 +20,5 @@
   "dependencies": {
     "execa": "^1.0.0",
     "snake-case": "^2.1.0"
   },
-  "peerDependencies": {
-    "@now/build-utils": ">=0.0.1"
-  }
 }
@@ -52,7 +52,7 @@ _lambda_runtime_next() {
   # Need to use a fifo here instead of bash <() because Lambda
   # errors with "/dev/fd/63 not found" for some reason :/
   local stdin
-  stdin="$(mktemp --dry-run)"
+  stdin="$(mktemp -u)"
   mkfifo "$stdin"
   _lambda_runtime_body "$event" > "$stdin" &
@@ -1 +1,2 @@
 /test
+tmp
@@ -26,6 +26,18 @@ class FileFsRef {
     this.fsPath = fsPath;
   }
 
+  /**
+   * Creates a `FileFsRef` with the correct `mode` from the file system.
+   *
+   * @argument {Object} options
+   * @argument {string} options.fsPath
+   * @returns {Promise<FileFsRef>}
+   */
+  static async fromFsPath({ fsPath }) {
+    const { mode } = await fs.lstat(fsPath);
+    return new FileFsRef({ mode, fsPath });
+  }
+
   /**
    * @argument {Object} options
    * @argument {number} [options.mode=0o100644]
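
For context, a hedged sketch of how a builder could call the new factory added above (the file path is made up for illustration; the require path matches how `@now/build-utils` is referenced elsewhere in this diff):

```js
const FileFsRef = require('@now/build-utils/file-fs-ref.js');

async function example() {
  // fs.lstat supplies the real file mode (e.g. an executable bit) instead of
  // the 0o100644 default a manually constructed FileFsRef would receive.
  const ref = await FileFsRef.fromFsPath({ fsPath: '/tmp/handler' });
  return ref;
}
```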
@@ -49,7 +49,7 @@ class FileRef {
     assert(url);
 
     await semaToDownloadFromS3.acquire();
-    console.time(`downloading ${url}`);
+    // console.time(`downloading ${url}`);
     try {
       return await retry(
         async () => {
@@ -66,7 +66,7 @@ class FileRef {
         { factor: 1, retries: 3 },
       );
     } finally {
-      console.timeEnd(`downloading ${url}`);
+      // console.timeEnd(`downloading ${url}`);
       semaToDownloadFromS3.release();
     }
   }
94
packages/now-build-utils/fs/bootstrap-yarn.js
vendored
94
packages/now-build-utils/fs/bootstrap-yarn.js
vendored
@@ -1,94 +0,0 @@
|
||||
const MemoryFileSystem = require('memory-fs');
|
||||
const fs = require('fs');
|
||||
const path = require('path');
|
||||
const { spawnSync } = require('child_process');
|
||||
|
||||
const yarnPath = spawnSync('which', ['yarn'])
|
||||
.stdout.toString()
|
||||
.trim();
|
||||
|
||||
const cachePath = spawnSync(yarnPath, ['cache', 'dir'])
|
||||
.stdout.toString()
|
||||
.trim();
|
||||
|
||||
spawnSync(yarnPath, ['cache', 'clean']);
|
||||
const vfs = new MemoryFileSystem();
|
||||
|
||||
function isInsideCachePath(filename) {
|
||||
const relative = path.relative(cachePath, filename);
|
||||
return !relative.startsWith('..');
|
||||
}
|
||||
|
||||
const saveCreateWriteStream = fs.createWriteStream;
|
||||
fs.createWriteStream = (...args) => {
|
||||
const filename = args[0];
|
||||
if (!isInsideCachePath(filename)) {
|
||||
return saveCreateWriteStream.call(fs, ...args);
|
||||
}
|
||||
|
||||
vfs.mkdirpSync(path.dirname(filename));
|
||||
fs.writeFileSync(filename, Buffer.alloc(0));
|
||||
const stream = vfs.createWriteStream(...args);
|
||||
|
||||
stream.on('finish', () => {
|
||||
setTimeout(() => {
|
||||
stream.emit('close');
|
||||
});
|
||||
});
|
||||
|
||||
return stream;
|
||||
};
|
||||
|
||||
const saveReadFile = fs.readFile;
|
||||
fs.readFile = (...args) => {
|
||||
const filename = args[0];
|
||||
if (!isInsideCachePath(filename)) {
|
||||
return saveReadFile.call(fs, ...args);
|
||||
}
|
||||
|
||||
const callback = args[args.length - 1];
|
||||
return vfs.readFile(...args.slice(0, -1), (error, result) => {
|
||||
if (error) {
|
||||
saveReadFile.call(fs, ...args);
|
||||
return;
|
||||
}
|
||||
|
||||
callback(error, result);
|
||||
});
|
||||
};
|
||||
|
||||
const saveCopyFile = fs.copyFile;
|
||||
fs.copyFile = (...args) => {
|
||||
const src = args[0];
|
||||
const dest = args[1];
|
||||
const callback = args[args.length - 1];
|
||||
|
||||
if (isInsideCachePath(src) && !isInsideCachePath(dest)) {
|
||||
const buffer = vfs.readFileSync(src);
|
||||
return fs.writeFile(dest, buffer, callback);
|
||||
}
|
||||
|
||||
if (!isInsideCachePath(src) && isInsideCachePath(dest)) {
|
||||
const buffer = fs.readFileSync(src);
|
||||
|
||||
vfs.mkdirpSync(path.dirname(dest));
|
||||
fs.writeFileSync(dest, Buffer.alloc(0));
|
||||
return vfs.writeFile(dest, buffer, callback);
|
||||
}
|
||||
|
||||
return saveCopyFile.call(fs, ...args);
|
||||
};
|
||||
|
||||
const saveWriteFile = fs.writeFile;
|
||||
fs.writeFile = (...args) => {
|
||||
const filename = args[0];
|
||||
if (!isInsideCachePath(filename)) {
|
||||
return saveWriteFile.call(fs, ...args);
|
||||
}
|
||||
|
||||
vfs.mkdirpSync(path.dirname(filename));
|
||||
fs.writeFileSync(filename, Buffer.alloc(0));
|
||||
return vfs.writeFile(...args);
|
||||
};
|
||||
|
||||
require(yarnPath);
|
||||
@@ -3,9 +3,6 @@ const fs = require('fs-extra');
|
||||
const path = require('path');
|
||||
const { spawn } = require('child_process');
|
||||
|
||||
const prod = process.env.AWS_EXECUTION_ENV
|
||||
|| process.env.X_GOOGLE_CODE_LOCATION;
|
||||
|
||||
function spawnAsync(command, args, cwd) {
|
||||
return new Promise((resolve, reject) => {
|
||||
const child = spawn(command, args, { stdio: 'inherit', cwd });
|
||||
@@ -66,15 +63,6 @@ async function installDependencies(destPath, args = []) {
|
||||
commandArgs = args.filter(a => a !== '--prefer-offline');
|
||||
await spawnAsync('npm', ['install'].concat(commandArgs), destPath);
|
||||
await spawnAsync('npm', ['cache', 'clean', '--force'], destPath);
|
||||
} else if (prod) {
|
||||
console.log('using memory-fs for yarn cache');
|
||||
await spawnAsync(
|
||||
'node',
|
||||
[path.join(__dirname, 'bootstrap-yarn.js'), '--cwd', destPath].concat(
|
||||
commandArgs,
|
||||
),
|
||||
destPath,
|
||||
);
|
||||
} else {
|
||||
await spawnAsync('yarn', ['--cwd', destPath].concat(commandArgs), destPath);
|
||||
await spawnAsync('yarn', ['cache', 'clean'], destPath);
|
||||
|
||||
@@ -1,7 +1,12 @@
|
||||
{
|
||||
"name": "@now/build-utils",
|
||||
"version": "0.4.32",
|
||||
"version": "0.4.35-canary.2",
|
||||
"license": "MIT",
|
||||
"repository": {
|
||||
"type": "git",
|
||||
"url": "https://github.com/zeit/now-builders.git",
|
||||
"directory": "packages/now-build-utils"
|
||||
},
|
||||
"dependencies": {
|
||||
"async-retry": "1.2.3",
|
||||
"async-sema": "2.1.4",
|
||||
|
||||
@@ -7,7 +7,7 @@ const {
|
||||
testDeployment,
|
||||
} = require('../../../test/lib/deployment/test-deployment.js');
|
||||
|
||||
jest.setTimeout(2 * 60 * 1000);
|
||||
jest.setTimeout(4 * 60 * 1000);
|
||||
const builderUrl = '@canary';
|
||||
let buildUtilsUrl;
|
||||
|
||||
|
||||
@@ -1,10 +1,10 @@
|
||||
const path = require('path');
|
||||
const { mkdirp, copyFile } = require('fs-extra');
|
||||
|
||||
const glob = require('@now/build-utils/fs/glob');
|
||||
const download = require('@now/build-utils/fs/download');
|
||||
const { createLambda } = require('@now/build-utils/lambda');
|
||||
const getWritableDirectory = require('@now/build-utils/fs/get-writable-directory');
|
||||
const glob = require('@now/build-utils/fs/glob'); // eslint-disable-line import/no-extraneous-dependencies
|
||||
const download = require('@now/build-utils/fs/download'); // eslint-disable-line import/no-extraneous-dependencies
|
||||
const { createLambda } = require('@now/build-utils/lambda'); // eslint-disable-line import/no-extraneous-dependencies
|
||||
const getWritableDirectory = require('@now/build-utils/fs/get-writable-directory'); // eslint-disable-line import/no-extraneous-dependencies
|
||||
|
||||
exports.analyze = ({ files, entrypoint }) => files[entrypoint].digest;
|
||||
|
||||
|
||||
@@ -1,7 +1,12 @@
|
||||
{
|
||||
"name": "@now/cgi",
|
||||
"version": "0.0.15",
|
||||
"version": "0.0.16-canary.0",
|
||||
"license": "MIT",
|
||||
"repository": {
|
||||
"type": "git",
|
||||
"url": "https://github.com/zeit/now-builders.git",
|
||||
"directory": "packages/now-cgi"
|
||||
},
|
||||
"scripts": {
|
||||
"test": "best -I test/*.js",
|
||||
"prepublish": "./build.sh"
|
||||
@@ -16,8 +21,5 @@
|
||||
"devDependencies": {
|
||||
"@zeit/best": "0.4.3",
|
||||
"rmfr": "2.0.0"
|
||||
},
|
||||
"peerDependencies": {
|
||||
"@now/build-utils": ">=0.0.1"
|
||||
}
|
||||
}
|
||||
|
||||
5 packages/now-go/.gitignore (vendored)
@@ -1,4 +1,5 @@
 node_modules
 *.log
-launcher
-bin
+/?.js
+/go
+/get-exported-function-name
@@ -1,7 +0,0 @@
#!/usr/bin/env bash

mkdir -p bin
cd util
GOOS=linux GOARCH=amd64 go build get-exported-function-name.go
mv get-exported-function-name ../bin/
@@ -1,23 +0,0 @@
const path = require('path');

const fetch = require('node-fetch');
const tar = require('tar');
const getWritableDirectory = require('@now/build-utils/fs/get-writable-directory.js');

const url = 'https://dl.google.com/go/go1.11.1.linux-amd64.tar.gz';

module.exports = async () => {
  const res = await fetch(url);
  const dir = await getWritableDirectory();

  if (!res.ok) {
    throw new Error(`Failed to download: ${url}`);
  }

  return new Promise((resolve, reject) => {
    res.body
      .on('error', reject)
      .pipe(tar.extract({ cwd: dir, strip: 1 }))
      .on('finish', () => resolve(path.join(dir, 'bin', 'go')));
  });
};
124 packages/now-go/go-helpers.js (new file)
@@ -0,0 +1,124 @@
const tar = require('tar');
const execa = require('execa');
const fetch = require('node-fetch');
const { mkdirp } = require('fs-extra');
const { dirname, join } = require('path');
const debug = require('debug')('@now/go:go-helpers');

const archMap = new Map([['x64', 'amd64'], ['x86', '386']]);
const platformMap = new Map([['win32', 'windows']]);

// Location where the `go` binary will be installed after `postinstall`
const GO_DIR = join(__dirname, 'go');
const GO_BIN = join(GO_DIR, 'bin/go');

const getPlatform = p => platformMap.get(p) || p;
const getArch = a => archMap.get(a) || a;
const getGoUrl = (version, platform, arch) => {
  const goArch = getArch(arch);
  const goPlatform = getPlatform(platform);
  const ext = platform === 'win32' ? 'zip' : 'tar.gz';
  return `https://dl.google.com/go/go${version}.${goPlatform}-${goArch}.${ext}`;
};

function getExportedFunctionName(filePath) {
  debug('Detecting handler name for %o', filePath);
  const bin = join(__dirname, 'get-exported-function-name');
  const args = [filePath];
  const name = execa.stdout(bin, args);
  debug('Detected exported name %o', filePath);
  return name;
}

// Creates a `$GOPATH` directory tree, as per `go help gopath` instructions.
// Without this, `go` won't recognize the `$GOPATH`.
function createGoPathTree(goPath, platform, arch) {
  const tuple = `${getPlatform(platform)}_${getArch(arch)}`;
  debug('Creating GOPATH directory structure for %o (%s)', goPath, tuple);
  return Promise.all([
    mkdirp(join(goPath, 'bin')),
    mkdirp(join(goPath, 'pkg', tuple)),
  ]);
}

async function get({ src } = {}) {
  const args = ['get'];
  if (src) {
    debug('Fetching `go` dependencies for file %o', src);
    args.push(src);
  } else {
    debug('Fetching `go` dependencies for cwd %o', this.cwd);
  }
  await this(...args);
}

async function build({ src, dest }) {
  debug('Building `go` binary %o -> %o', src, dest);
  let sources;
  if (Array.isArray(src)) {
    sources = src;
  } else {
    sources = [src];
  }
  await this('build', '-o', dest, ...sources);
}

async function createGo(
  goPath,
  platform = process.platform,
  arch = process.arch,
  opts = {},
) {
  const env = {
    ...process.env,
    PATH: `${dirname(GO_BIN)}:${process.env.PATH}`,
    GOPATH: goPath,
    ...opts.env,
  };

  function go(...args) {
    debug('Exec %o', `go ${args.join(' ')}`);
    return execa('go', args, { stdio: 'inherit', ...opts, env });
  }
  go.cwd = opts.cwd || process.cwd();
  go.get = get;
  go.build = build;
  go.goPath = goPath;
  await createGoPathTree(goPath, platform, arch);
  return go;
}

async function downloadGo(
  dir = GO_DIR,
  version = '1.11.5',
  platform = process.platform,
  arch = process.arch,
) {
  debug('Installing `go` v%s to %o for %s %s', version, dir, platform, arch);

  const url = getGoUrl(version, platform, arch);
  debug('Downloading `go` URL: %o', url);
  const res = await fetch(url);

  if (!res.ok) {
    throw new Error(`Failed to download: ${url} (${res.status})`);
  }

  // TODO: use a zip extractor when `ext === "zip"`
  await mkdirp(dir);
  await new Promise((resolve, reject) => {
    res.body
      .on('error', reject)
      .pipe(tar.extract({ cwd: dir, strip: 1 }))
      .on('error', reject)
      .on('finish', resolve);
  });

  return createGo(dir, platform, arch);
}

module.exports = {
  createGo,
  downloadGo,
  getExportedFunctionName,
};
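
A brief sketch of how these helpers compose at install and build time; the paths are illustrative and assume the script runs from `packages/now-go` (the real consumer is `util/install.js`, shown further down in this diff):

```js
const { join } = require('path');
const { downloadGo } = require('./go-helpers');

async function demo() {
  // Fetches the Go toolchain into packages/now-go/go and returns a `go`
  // function preconfigured with GOPATH and a PATH entry for that toolchain.
  const go = await downloadGo();

  // `go.get` and `go.build` shell out to that toolchain via execa.
  await go.get();
  await go.build({ src: join(__dirname, 'main.go'), dest: '/tmp/handler' });
}
```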
@@ -1,79 +1,52 @@
|
||||
const path = require('path');
|
||||
const { mkdirp, readFile, writeFile } = require('fs-extra');
|
||||
const { join, dirname } = require('path');
|
||||
const { readFile, writeFile } = require('fs-extra');
|
||||
|
||||
const execa = require('execa');
|
||||
const { createLambda } = require('@now/build-utils/lambda.js');
|
||||
const getWritableDirectory = require('@now/build-utils/fs/get-writable-directory.js');
|
||||
const download = require('@now/build-utils/fs/download.js');
|
||||
const downloadGit = require('lambda-git');
|
||||
const glob = require('@now/build-utils/fs/glob.js');
|
||||
const downloadGoBin = require('./download-go-bin');
|
||||
const glob = require('@now/build-utils/fs/glob.js'); // eslint-disable-line import/no-extraneous-dependencies
|
||||
const download = require('@now/build-utils/fs/download.js'); // eslint-disable-line import/no-extraneous-dependencies
|
||||
const { createLambda } = require('@now/build-utils/lambda.js'); // eslint-disable-line import/no-extraneous-dependencies
|
||||
const getWritableDirectory = require('@now/build-utils/fs/get-writable-directory.js'); // eslint-disable-line import/no-extraneous-dependencies
|
||||
const { createGo, getExportedFunctionName } = require('./go-helpers');
|
||||
|
||||
// creates a `$GOPATH` directory tree, as per
|
||||
// `go help gopath`'s instructions.
|
||||
// without this, Go won't recognize the `$GOPATH`
|
||||
async function createGoPathTree(goPath) {
|
||||
await mkdirp(path.join(goPath, 'bin'));
|
||||
await mkdirp(path.join(goPath, 'pkg', 'linux_amd64'));
|
||||
}
|
||||
|
||||
exports.config = {
|
||||
const config = {
|
||||
maxLambdaSize: '10mb',
|
||||
};
|
||||
|
||||
exports.build = async ({ files, entrypoint }) => {
|
||||
console.log('downloading files...');
|
||||
async function build({ files, entrypoint }) {
|
||||
console.log('Downloading user files...');
|
||||
|
||||
const gitPath = await getWritableDirectory();
|
||||
const goPath = await getWritableDirectory();
|
||||
const srcPath = path.join(goPath, 'src', 'lambda');
|
||||
const outDir = await getWritableDirectory();
|
||||
|
||||
await createGoPathTree(goPath);
|
||||
const [goPath, outDir] = await Promise.all([
|
||||
getWritableDirectory(),
|
||||
getWritableDirectory(),
|
||||
]);
|
||||
|
||||
const srcPath = join(goPath, 'src', 'lambda');
|
||||
const downloadedFiles = await download(files, srcPath);
|
||||
|
||||
console.log('downloading go binary...');
|
||||
const goBin = await downloadGoBin();
|
||||
|
||||
console.log('downloading git binary...');
|
||||
// downloads a git binary that works on Amazon Linux and sets
|
||||
// `process.env.GIT_EXEC_PATH` so `go(1)` can see it
|
||||
await downloadGit({ targetDirectory: gitPath });
|
||||
|
||||
const goEnv = {
|
||||
...process.env,
|
||||
GOOS: 'linux',
|
||||
GOARCH: 'amd64',
|
||||
GOPATH: goPath,
|
||||
};
|
||||
|
||||
console.log(`parsing AST for "${entrypoint}"`);
|
||||
let handlerFunctionName = '';
|
||||
console.log(`Parsing AST for "${entrypoint}"`);
|
||||
let handlerFunctionName;
|
||||
try {
|
||||
handlerFunctionName = await execa.stdout(
|
||||
path.join(__dirname, 'bin', 'get-exported-function-name'),
|
||||
[downloadedFiles[entrypoint].fsPath],
|
||||
handlerFunctionName = await getExportedFunctionName(
|
||||
downloadedFiles[entrypoint].fsPath,
|
||||
);
|
||||
} catch (err) {
|
||||
console.log(`failed to parse AST for "${entrypoint}"`);
|
||||
console.log(`Failed to parse AST for "${entrypoint}"`);
|
||||
throw err;
|
||||
}
|
||||
|
||||
if (handlerFunctionName === '') {
|
||||
const e = new Error(
|
||||
`Could not find an exported function on "${entrypoint}"`,
|
||||
if (!handlerFunctionName) {
|
||||
const err = new Error(
|
||||
`Could not find an exported function in "${entrypoint}"`,
|
||||
);
|
||||
console.log(e.message);
|
||||
throw e;
|
||||
console.log(err.message);
|
||||
throw err;
|
||||
}
|
||||
|
||||
console.log(
|
||||
`Found exported function "${handlerFunctionName}" on "${entrypoint}"`,
|
||||
`Found exported function "${handlerFunctionName}" in "${entrypoint}"`,
|
||||
);
|
||||
|
||||
const origianlMainGoContents = await readFile(
|
||||
path.join(__dirname, 'main.go'),
|
||||
join(__dirname, 'main.go'),
|
||||
'utf8',
|
||||
);
|
||||
const mainGoContents = origianlMainGoContents.replace(
|
||||
@@ -85,39 +58,33 @@ exports.build = async ({ files, entrypoint }) => {
|
||||
|
||||
// we need `main.go` in the same dir as the entrypoint,
|
||||
// otherwise `go build` will refuse to build
|
||||
const entrypointDirname = path.dirname(downloadedFiles[entrypoint].fsPath);
|
||||
const entrypointDirname = dirname(downloadedFiles[entrypoint].fsPath);
|
||||
|
||||
// Go doesn't like to build files in different directories,
|
||||
// so now we place `main.go` together with the user code
|
||||
await writeFile(path.join(entrypointDirname, mainGoFileName), mainGoContents);
|
||||
await writeFile(join(entrypointDirname, mainGoFileName), mainGoContents);
|
||||
|
||||
console.log('installing dependencies');
|
||||
// `go get` will look at `*.go` (note we set `cwd`), parse
|
||||
// the `import`s and download any packages that aren't part of the stdlib
|
||||
const go = await createGo(goPath, process.platform, process.arch, {
|
||||
cwd: entrypointDirname,
|
||||
});
|
||||
|
||||
// `go get` will look at `*.go` (note we set `cwd`), parse the `import`s
|
||||
// and download any packages that aren't part of the stdlib
|
||||
try {
|
||||
await execa(goBin, ['get'], {
|
||||
env: goEnv,
|
||||
cwd: entrypointDirname,
|
||||
stdio: 'inherit',
|
||||
});
|
||||
await go.get();
|
||||
} catch (err) {
|
||||
console.log('failed to `go get`');
|
||||
throw err;
|
||||
}
|
||||
|
||||
console.log('running go build...');
|
||||
console.log('Running `go build`...');
|
||||
const destPath = join(outDir, 'handler');
|
||||
try {
|
||||
await execa(
|
||||
goBin,
|
||||
[
|
||||
'build',
|
||||
'-o',
|
||||
path.join(outDir, 'handler'),
|
||||
path.join(entrypointDirname, mainGoFileName),
|
||||
downloadedFiles[entrypoint].fsPath,
|
||||
],
|
||||
{ env: goEnv, cwd: entrypointDirname, stdio: 'inherit' },
|
||||
);
|
||||
const src = [
|
||||
join(entrypointDirname, mainGoFileName),
|
||||
downloadedFiles[entrypoint].fsPath,
|
||||
];
|
||||
await go.build({ src, dest: destPath });
|
||||
} catch (err) {
|
||||
console.log('failed to `go build`');
|
||||
throw err;
|
||||
@@ -133,4 +100,6 @@ exports.build = async ({ files, entrypoint }) => {
|
||||
return {
|
||||
[entrypoint]: lambda,
|
||||
};
|
||||
};
|
||||
}
|
||||
|
||||
module.exports = { config, build };
|
||||
|
||||
@@ -1,8 +1,8 @@
|
||||
package main
|
||||
|
||||
import (
|
||||
now "../../utils/go/bridge"
|
||||
"net/http"
|
||||
now "github.com/zeit/now-builders/utils/go/bridge"
|
||||
)
|
||||
|
||||
func main() {
|
||||
|
||||
@@ -1,30 +1,26 @@
|
||||
{
|
||||
"name": "@now/go",
|
||||
"version": "0.2.12",
|
||||
"version": "0.2.13-canary.1",
|
||||
"license": "MIT",
|
||||
"repository": {
|
||||
"type": "git",
|
||||
"url": "https://github.com/zeit/now-builders.git",
|
||||
"directory": "packages/now-go"
|
||||
},
|
||||
"scripts": {
|
||||
"test": "best -I test/*.js",
|
||||
"prepublish": "./build.sh"
|
||||
"postinstall": "node ./util/install"
|
||||
},
|
||||
"files": [
|
||||
"bin",
|
||||
"download-go-bin.js",
|
||||
"index.js",
|
||||
"main.go"
|
||||
"*.js",
|
||||
"main.go",
|
||||
"util"
|
||||
],
|
||||
"dependencies": {
|
||||
"debug": "^4.1.1",
|
||||
"execa": "^1.0.0",
|
||||
"fs-extra": "^7.0.0",
|
||||
"lambda-git": "^0.1.2",
|
||||
"mkdirp-promise": "5.0.1",
|
||||
"node-fetch": "^2.2.1",
|
||||
"tar": "4.4.6"
|
||||
},
|
||||
"devDependencies": {
|
||||
"@zeit/best": "0.4.3",
|
||||
"rmfr": "2.0.0"
|
||||
},
|
||||
"peerDependencies": {
|
||||
"@now/build-utils": ">=0.0.1"
|
||||
}
|
||||
}
|
||||
|
||||
18 packages/now-go/util/install.js (new file)
@@ -0,0 +1,18 @@
const { join } = require('path');
const { downloadGo } = require('../go-helpers');

async function main() {
  // First download the `go` binary for this platform/arch.
  const go = await downloadGo();

  // Build the `get-exported-function-name` helper program.
  // `go get` is not necessary because the program has no external deps.
  const src = join(__dirname, 'get-exported-function-name.go');
  const dest = join(__dirname, '../get-exported-function-name');
  await go.build({ src, dest });
}

main().catch((err) => {
  console.error(err);
  process.exit(1);
});
(File diff suppressed because it is too large.)
@@ -1,4 +1,4 @@
|
||||
const FileBlob = require('@now/build-utils/file-blob.js');
|
||||
const FileBlob = require('@now/build-utils/file-blob.js'); // eslint-disable-line import/no-extraneous-dependencies
|
||||
const { minify } = require('html-minifier');
|
||||
|
||||
const defaultOptions = {
|
||||
|
||||
@@ -1,11 +1,13 @@
|
||||
{
|
||||
"name": "@now/html-minifier",
|
||||
"version": "1.0.7",
|
||||
"version": "1.0.8-canary.0",
|
||||
"license": "MIT",
|
||||
"repository": {
|
||||
"type": "git",
|
||||
"url": "https://github.com/zeit/now-builders.git",
|
||||
"directory": "packages/now-html-minifier"
|
||||
},
|
||||
"dependencies": {
|
||||
"html-minifier": "3.5.21"
|
||||
},
|
||||
"peerDependencies": {
|
||||
"@now/build-utils": ">=0.0.1"
|
||||
}
|
||||
}
|
||||
|
||||
@@ -1,5 +1,5 @@
|
||||
const { Lambda } = require('@now/build-utils/lambda.js');
|
||||
const streamToBuffer = require('@now/build-utils/fs/stream-to-buffer.js');
|
||||
const { Lambda } = require('@now/build-utils/lambda.js'); // eslint-disable-line import/no-extraneous-dependencies
|
||||
const streamToBuffer = require('@now/build-utils/fs/stream-to-buffer.js'); // eslint-disable-line import/no-extraneous-dependencies
|
||||
|
||||
exports.build = async ({ files, entrypoint, config }) => {
|
||||
if (!files[entrypoint]) throw new Error('Entrypoint not found in files');
|
||||
|
||||
@@ -1,9 +1,11 @@
|
||||
{
|
||||
"name": "@now/lambda",
|
||||
"version": "0.4.9",
|
||||
"version": "0.4.10-canary.1",
|
||||
"license": "MIT",
|
||||
"peerDependencies": {
|
||||
"@now/build-utils": ">=0.0.1"
|
||||
"repository": {
|
||||
"type": "git",
|
||||
"url": "https://github.com/zeit/now-builders.git",
|
||||
"directory": "packages/now-lambda"
|
||||
},
|
||||
"scripts": {
|
||||
"test": "jest"
|
||||
|
||||
@@ -7,7 +7,7 @@ const {
|
||||
testDeployment,
|
||||
} = require('../../../test/lib/deployment/test-deployment.js');
|
||||
|
||||
jest.setTimeout(2 * 60 * 1000);
|
||||
jest.setTimeout(4 * 60 * 1000);
|
||||
const buildUtilsUrl = '@canary';
|
||||
let builderUrl;
|
||||
|
||||
|
||||
@@ -1,4 +1,4 @@
|
||||
const FileBlob = require('@now/build-utils/file-blob.js');
|
||||
const FileBlob = require('@now/build-utils/file-blob.js'); // eslint-disable-line import/no-extraneous-dependencies
|
||||
const unified = require('unified');
|
||||
const unifiedStream = require('unified-stream');
|
||||
const markdown = require('remark-parse');
|
||||
|
||||
@@ -1,7 +1,12 @@
|
||||
{
|
||||
"name": "@now/md",
|
||||
"version": "0.4.9",
|
||||
"version": "0.4.10-canary.1",
|
||||
"license": "MIT",
|
||||
"repository": {
|
||||
"type": "git",
|
||||
"url": "https://github.com/zeit/now-builders.git",
|
||||
"directory": "packages/now-md"
|
||||
},
|
||||
"dependencies": {
|
||||
"rehype-document": "^2.2.0",
|
||||
"rehype-format": "^2.3.0",
|
||||
@@ -11,9 +16,6 @@
|
||||
"unified": "^7.0.0",
|
||||
"unified-stream": "^1.0.2"
|
||||
},
|
||||
"peerDependencies": {
|
||||
"@now/build-utils": ">=0.0.1"
|
||||
},
|
||||
"scripts": {
|
||||
"test": "jest"
|
||||
}
|
||||
|
||||
@@ -7,7 +7,7 @@ const {
|
||||
testDeployment,
|
||||
} = require('../../../test/lib/deployment/test-deployment.js');
|
||||
|
||||
jest.setTimeout(2 * 60 * 1000);
|
||||
jest.setTimeout(4 * 60 * 1000);
|
||||
const buildUtilsUrl = '@canary';
|
||||
let builderUrl;
|
||||
|
||||
|
||||
@@ -1,10 +1,10 @@
|
||||
const download = require('@now/build-utils/fs/download.js');
|
||||
const download = require('@now/build-utils/fs/download.js'); // eslint-disable-line import/no-extraneous-dependencies
|
||||
const fs = require('fs');
|
||||
const { promisify } = require('util');
|
||||
const getWritableDirectory = require('@now/build-utils/fs/get-writable-directory.js');
|
||||
const glob = require('@now/build-utils/fs/glob.js');
|
||||
const getWritableDirectory = require('@now/build-utils/fs/get-writable-directory.js'); // eslint-disable-line import/no-extraneous-dependencies
|
||||
const glob = require('@now/build-utils/fs/glob.js'); // eslint-disable-line import/no-extraneous-dependencies
|
||||
const path = require('path');
|
||||
const { runNpmInstall } = require('@now/build-utils/fs/run-user-scripts.js');
|
||||
const { runNpmInstall } = require('@now/build-utils/fs/run-user-scripts.js'); // eslint-disable-line import/no-extraneous-dependencies
|
||||
|
||||
const writeFile = promisify(fs.writeFile);
|
||||
|
||||
|
||||
@@ -1,9 +1,11 @@
|
||||
{
|
||||
"name": "@now/mdx-deck",
|
||||
"version": "0.4.18",
|
||||
"version": "0.4.19-canary.1",
|
||||
"license": "MIT",
|
||||
"peerDependencies": {
|
||||
"@now/build-utils": ">=0.0.1"
|
||||
"repository": {
|
||||
"type": "git",
|
||||
"url": "https://github.com/zeit/now-builders.git",
|
||||
"directory": "packages/now-mdx-deck"
|
||||
},
|
||||
"scripts": {
|
||||
"test": "jest"
|
||||
|
||||
@@ -7,7 +7,7 @@ const {
|
||||
testDeployment,
|
||||
} = require('../../../test/lib/deployment/test-deployment.js');
|
||||
|
||||
jest.setTimeout(2 * 60 * 1000);
|
||||
jest.setTimeout(4 * 60 * 1000);
|
||||
const buildUtilsUrl = '@canary';
|
||||
let builderUrl;
|
||||
|
||||
|
||||
@@ -1,23 +1,21 @@
|
||||
const { createLambda } = require('@now/build-utils/lambda.js');
|
||||
const download = require('@now/build-utils/fs/download.js');
|
||||
const FileFsRef = require('@now/build-utils/file-fs-ref.js');
|
||||
const FileBlob = require('@now/build-utils/file-blob');
|
||||
const { createLambda } = require('@now/build-utils/lambda.js'); // eslint-disable-line import/no-extraneous-dependencies
|
||||
const download = require('@now/build-utils/fs/download.js'); // eslint-disable-line import/no-extraneous-dependencies
|
||||
const FileFsRef = require('@now/build-utils/file-fs-ref.js'); // eslint-disable-line import/no-extraneous-dependencies
|
||||
const FileBlob = require('@now/build-utils/file-blob'); // eslint-disable-line import/no-extraneous-dependencies
|
||||
const path = require('path');
|
||||
const { readFile, writeFile, unlink } = require('fs.promised');
|
||||
const {
|
||||
runNpmInstall,
|
||||
runPackageJsonScript,
|
||||
} = require('@now/build-utils/fs/run-user-scripts.js');
|
||||
const glob = require('@now/build-utils/fs/glob.js');
|
||||
} = require('@now/build-utils/fs/run-user-scripts.js'); // eslint-disable-line import/no-extraneous-dependencies
|
||||
const glob = require('@now/build-utils/fs/glob.js'); // eslint-disable-line import/no-extraneous-dependencies
|
||||
const semver = require('semver');
|
||||
const nextLegacyVersions = require('./legacy-versions');
|
||||
const {
|
||||
excludeFiles,
|
||||
validateEntrypoint,
|
||||
includeOnlyEntryDirectory,
|
||||
moveEntryDirectoryToRoot,
|
||||
normalizePackageJson,
|
||||
excludeStaticDirectory,
|
||||
onlyStaticDirectory,
|
||||
} = require('./utils');
|
||||
|
||||
@@ -33,15 +31,17 @@ const {
|
||||
|
||||
/**
|
||||
* Read package.json from files
|
||||
* @param {DownloadedFiles} files
|
||||
* @param {string} entryPath
|
||||
*/
|
||||
async function readPackageJson(files) {
|
||||
if (!files['package.json']) {
|
||||
async function readPackageJson(entryPath) {
|
||||
const packagePath = path.join(entryPath, 'package.json');
|
||||
|
||||
try {
|
||||
return JSON.parse(await readFile(packagePath, 'utf8'));
|
||||
} catch (err) {
|
||||
console.log('package.json not found in entry');
|
||||
return {};
|
||||
}
|
||||
|
||||
const packageJsonPath = files['package.json'].fsPath;
|
||||
return JSON.parse(await readFile(packageJsonPath, 'utf8'));
|
||||
}
|
||||
|
||||
/**
|
||||
@@ -81,20 +81,10 @@ exports.build = async ({ files, workPath, entrypoint }) => {
|
||||
|
||||
console.log('downloading user files...');
|
||||
const entryDirectory = path.dirname(entrypoint);
|
||||
const filesOnlyEntryDirectory = includeOnlyEntryDirectory(
|
||||
files,
|
||||
entryDirectory,
|
||||
);
|
||||
const filesWithEntryDirectoryRoot = moveEntryDirectoryToRoot(
|
||||
filesOnlyEntryDirectory,
|
||||
entryDirectory,
|
||||
);
|
||||
const filesWithoutStaticDirectory = excludeStaticDirectory(
|
||||
filesWithEntryDirectoryRoot,
|
||||
);
|
||||
const downloadedFiles = await download(filesWithoutStaticDirectory, workPath);
|
||||
await download(files, workPath);
|
||||
const entryPath = path.join(workPath, entryDirectory);
|
||||
|
||||
const pkg = await readPackageJson(downloadedFiles);
|
||||
const pkg = await readPackageJson(entryPath);
|
||||
|
||||
let nextVersion;
|
||||
if (pkg.dependencies && pkg.dependencies.next) {
|
||||
@@ -133,24 +123,25 @@ exports.build = async ({ files, workPath, entrypoint }) => {
|
||||
|
||||
if (isLegacy) {
|
||||
try {
|
||||
await unlink(path.join(workPath, 'yarn.lock'));
|
||||
await unlink(path.join(entryPath, 'yarn.lock'));
|
||||
} catch (err) {
|
||||
console.log('no yarn.lock removed');
|
||||
}
|
||||
|
||||
try {
|
||||
await unlink(path.join(workPath, 'package-lock.json'));
|
||||
await unlink(path.join(entryPath, 'package-lock.json'));
|
||||
} catch (err) {
|
||||
console.log('no package-lock.json removed');
|
||||
}
|
||||
|
||||
console.warn(
|
||||
"WARNING: your application is being deployed in @now/next's legacy mode.",
|
||||
"WARNING: your application is being deployed in @now/next's legacy mode. http://err.sh/zeit/now-builders/now-next-legacy-mode",
|
||||
);
|
||||
|
||||
console.log('normalizing package.json');
|
||||
const packageJson = normalizePackageJson(pkg);
|
||||
console.log('normalized package.json result: ', packageJson);
|
||||
await writePackageJson(workPath, packageJson);
|
||||
await writePackageJson(entryPath, packageJson);
|
||||
} else if (!pkg.scripts || !pkg.scripts['now-build']) {
|
||||
console.warn(
|
||||
'WARNING: "now-build" script not found. Adding \'"now-build": "next build"\' to "package.json" automatically',
|
||||
@@ -160,38 +151,38 @@ exports.build = async ({ files, workPath, entrypoint }) => {
|
||||
...(pkg.scripts || {}),
|
||||
};
|
||||
console.log('normalized package.json result: ', pkg);
|
||||
await writePackageJson(workPath, pkg);
|
||||
await writePackageJson(entryPath, pkg);
|
||||
}
|
||||
|
||||
if (process.env.NPM_AUTH_TOKEN) {
|
||||
console.log('found NPM_AUTH_TOKEN in environment, creating .npmrc');
|
||||
await writeNpmRc(workPath, process.env.NPM_AUTH_TOKEN);
|
||||
await writeNpmRc(entryPath, process.env.NPM_AUTH_TOKEN);
|
||||
}
|
||||
|
||||
console.log('installing dependencies...');
|
||||
await runNpmInstall(workPath, ['--prefer-offline']);
|
||||
await runNpmInstall(entryPath, ['--prefer-offline']);
|
||||
console.log('running user script...');
|
||||
await runPackageJsonScript(workPath, 'now-build');
|
||||
await runPackageJsonScript(entryPath, 'now-build');
|
||||
|
||||
if (isLegacy) {
|
||||
console.log('running npm install --production...');
|
||||
await runNpmInstall(workPath, ['--prefer-offline', '--production']);
|
||||
await runNpmInstall(entryPath, ['--prefer-offline', '--production']);
|
||||
}
|
||||
|
||||
if (process.env.NPM_AUTH_TOKEN) {
|
||||
await unlink(path.join(workPath, '.npmrc'));
|
||||
await unlink(path.join(entryPath, '.npmrc'));
|
||||
}
|
||||
|
||||
const lambdas = {};
|
||||
|
||||
if (isLegacy) {
|
||||
const filesAfterBuild = await glob('**', workPath);
|
||||
const filesAfterBuild = await glob('**', entryPath);
|
||||
|
||||
console.log('preparing lambda files...');
|
||||
let buildId;
|
||||
try {
|
||||
buildId = await readFile(
|
||||
path.join(workPath, '.next', 'BUILD_ID'),
|
||||
path.join(entryPath, '.next', 'BUILD_ID'),
|
||||
'utf8',
|
||||
);
|
||||
} catch (err) {
|
||||
@@ -200,10 +191,10 @@ exports.build = async ({ files, workPath, entrypoint }) => {
|
||||
);
|
||||
throw new Error('Missing BUILD_ID');
|
||||
}
|
||||
const dotNextRootFiles = await glob('.next/*', workPath);
|
||||
const dotNextServerRootFiles = await glob('.next/server/*', workPath);
|
||||
const dotNextRootFiles = await glob('.next/*', entryPath);
|
||||
const dotNextServerRootFiles = await glob('.next/server/*', entryPath);
|
||||
const nodeModules = excludeFiles(
|
||||
await glob('node_modules/**', workPath),
|
||||
await glob('node_modules/**', entryPath),
|
||||
file => file.startsWith('node_modules/.cache'),
|
||||
);
|
||||
const launcherFiles = {
|
||||
@@ -220,7 +211,7 @@ exports.build = async ({ files, workPath, entrypoint }) => {
|
||||
}
|
||||
const pages = await glob(
|
||||
'**/*.js',
|
||||
path.join(workPath, '.next', 'server', 'static', buildId, 'pages'),
|
||||
path.join(entryPath, '.next', 'server', 'static', buildId, 'pages'),
|
||||
);
|
||||
const launcherPath = path.join(__dirname, 'legacy-launcher.js');
|
||||
const launcherData = await readFile(launcherPath, 'utf8');
|
||||
@@ -276,7 +267,7 @@ exports.build = async ({ files, workPath, entrypoint }) => {
|
||||
};
|
||||
const pages = await glob(
|
||||
'**/*.js',
|
||||
path.join(workPath, '.next', 'serverless', 'pages'),
|
||||
path.join(entryPath, '.next', 'serverless', 'pages'),
|
||||
);
|
||||
|
||||
const pageKeys = Object.keys(pages);
|
||||
@@ -312,7 +303,7 @@ exports.build = async ({ files, workPath, entrypoint }) => {
|
||||
|
||||
const nextStaticFiles = await glob(
|
||||
'**',
|
||||
path.join(workPath, '.next', 'static'),
|
||||
path.join(entryPath, '.next', 'static'),
|
||||
);
|
||||
const staticFiles = Object.keys(nextStaticFiles).reduce(
|
||||
(mappedFiles, file) => ({
|
||||
@@ -322,7 +313,9 @@ exports.build = async ({ files, workPath, entrypoint }) => {
|
||||
{},
|
||||
);
|
||||
|
||||
const nextStaticDirectory = onlyStaticDirectory(filesWithEntryDirectoryRoot);
|
||||
const nextStaticDirectory = onlyStaticDirectory(
|
||||
includeOnlyEntryDirectory(files, entryDirectory),
|
||||
);
|
||||
const staticDirectoryFiles = Object.keys(nextStaticDirectory).reduce(
|
||||
(mappedFiles, file) => ({
|
||||
...mappedFiles,
|
||||
|
||||
@@ -1,14 +1,16 @@
|
||||
{
|
||||
"name": "@now/next",
|
||||
"version": "0.0.84",
|
||||
"version": "0.0.85-canary.5",
|
||||
"license": "MIT",
|
||||
"repository": {
|
||||
"type": "git",
|
||||
"url": "https://github.com/zeit/now-builders.git",
|
||||
"directory": "packages/now-next"
|
||||
},
|
||||
"dependencies": {
|
||||
"@now/node-bridge": "0.1.4",
|
||||
"execa": "^1.0.0",
|
||||
"fs.promised": "^3.0.0",
|
||||
"semver": "^5.6.0"
|
||||
},
|
||||
"peerDependencies": {
|
||||
"@now/build-utils": ">=0.0.1"
|
||||
}
|
||||
}
|
||||
|
||||
@@ -1,5 +1,3 @@
|
||||
const rename = require('@now/build-utils/fs/rename.js');
|
||||
|
||||
/** @typedef { import('@now/build-utils/file-ref') } FileRef */
|
||||
/** @typedef { import('@now/build-utils/file-fs-ref') } FileFsRef */
|
||||
/** @typedef {{[filePath: string]: FileRef|FileFsRef}} Files */
|
||||
@@ -64,24 +62,6 @@ function includeOnlyEntryDirectory(files, entryDirectory) {
|
||||
return excludeFiles(files, matcher);
|
||||
}
|
||||
|
||||
/**
|
||||
* Moves all files under the entry directory to the root directory
|
||||
* @param {Files} files
|
||||
* @param {string} entryDirectory
|
||||
* @returns {Files}
|
||||
*/
|
||||
function moveEntryDirectoryToRoot(files, entryDirectory) {
|
||||
if (entryDirectory === '.') {
|
||||
return files;
|
||||
}
|
||||
|
||||
function delegate(filePath) {
|
||||
return filePath.replace(new RegExp(`^${entryDirectory}/`), '');
|
||||
}
|
||||
|
||||
return rename(files, delegate);
|
||||
}
|
||||
|
||||
/**
|
||||
* Exclude package manager lockfiles from files
|
||||
* @param {Files} files
|
||||
@@ -98,19 +78,6 @@ function excludeLockFiles(files) {
|
||||
return files;
|
||||
}
|
||||
|
||||
/**
|
||||
* Exclude the static directory from files
|
||||
* @param {Files} files
|
||||
* @returns {Files}
|
||||
*/
|
||||
function excludeStaticDirectory(files) {
|
||||
function matcher(filePath) {
|
||||
return filePath.startsWith('static');
|
||||
}
|
||||
|
||||
return excludeFiles(files, matcher);
|
||||
}
|
||||
|
||||
/**
|
||||
* Exclude the static directory from files
|
||||
* @param {Files} files
|
||||
@@ -173,9 +140,7 @@ module.exports = {
|
||||
excludeFiles,
|
||||
validateEntrypoint,
|
||||
includeOnlyEntryDirectory,
|
||||
moveEntryDirectoryToRoot,
|
||||
excludeLockFiles,
|
||||
normalizePackageJson,
|
||||
excludeStaticDirectory,
|
||||
onlyStaticDirectory,
|
||||
};
|
||||
|
||||
@@ -1,8 +1,10 @@
|
||||
{
|
||||
"name": "@now/node-bridge",
|
||||
"version": "0.1.10",
|
||||
"version": "0.1.11-canary.0",
|
||||
"license": "MIT",
|
||||
"peerDependencies": {
|
||||
"@now/build-utils": ">=0.0.1"
|
||||
"repository": {
|
||||
"type": "git",
|
||||
"url": "https://github.com/zeit/now-builders.git",
|
||||
"directory": "packages/now-node-bridge"
|
||||
}
|
||||
}
|
||||
|
||||
@@ -1,15 +1,15 @@
|
||||
const { createLambda } = require('@now/build-utils/lambda.js');
|
||||
const download = require('@now/build-utils/fs/download.js');
|
||||
const FileBlob = require('@now/build-utils/file-blob.js');
|
||||
const FileFsRef = require('@now/build-utils/file-fs-ref.js');
|
||||
const { createLambda } = require('@now/build-utils/lambda.js'); // eslint-disable-line import/no-extraneous-dependencies
|
||||
const download = require('@now/build-utils/fs/download.js'); // eslint-disable-line import/no-extraneous-dependencies
|
||||
const FileBlob = require('@now/build-utils/file-blob.js'); // eslint-disable-line import/no-extraneous-dependencies
|
||||
const FileFsRef = require('@now/build-utils/file-fs-ref.js'); // eslint-disable-line import/no-extraneous-dependencies
|
||||
const fs = require('fs-extra');
|
||||
const glob = require('@now/build-utils/fs/glob.js');
|
||||
const glob = require('@now/build-utils/fs/glob.js'); // eslint-disable-line import/no-extraneous-dependencies
|
||||
const path = require('path');
|
||||
const rename = require('@now/build-utils/fs/rename.js');
|
||||
const rename = require('@now/build-utils/fs/rename.js'); // eslint-disable-line import/no-extraneous-dependencies
|
||||
const {
|
||||
runNpmInstall,
|
||||
runPackageJsonScript,
|
||||
} = require('@now/build-utils/fs/run-user-scripts.js');
|
||||
} = require('@now/build-utils/fs/run-user-scripts.js'); // eslint-disable-line import/no-extraneous-dependencies
|
||||
|
||||
/** @typedef { import('@now/build-utils/file-ref') } FileRef */
|
||||
/** @typedef {{[filePath: string]: FileRef}} Files */
|
||||
@@ -37,7 +37,7 @@ async function downloadInstallAndBundle(
|
||||
console.log('downloading user files...');
|
||||
const downloadedFiles = await download(files, userPath);
|
||||
|
||||
console.log('installing dependencies for user\'s code...');
|
||||
console.log("installing dependencies for user's code...");
|
||||
const entrypointFsDirname = path.join(userPath, path.dirname(entrypoint));
|
||||
await runNpmInstall(entrypointFsDirname, npmArguments);
|
||||
|
||||
@@ -46,8 +46,9 @@ async function downloadInstallAndBundle(
|
||||
{
|
||||
'package.json': new FileBlob({
|
||||
data: JSON.stringify({
|
||||
license: 'UNLICENSED',
|
||||
dependencies: {
|
||||
'@zeit/ncc': '0.11.0',
|
||||
'@zeit/ncc': '0.15.2',
|
||||
},
|
||||
}),
|
||||
}),
|
||||
@@ -63,7 +64,7 @@ async function downloadInstallAndBundle(
|
||||
async function compile(workNccPath, downloadedFiles, entrypoint) {
|
||||
const input = downloadedFiles[entrypoint].fsPath;
|
||||
const ncc = require(path.join(workNccPath, 'node_modules/@zeit/ncc'));
|
||||
const { code, assets } = await ncc(input);
|
||||
const { code, assets } = await ncc(input, { sourceMap: true });
|
||||
|
||||
const preparedFiles = {};
|
||||
const blob = new FileBlob({ data: code });
|
||||
|
||||
@@ -1,13 +1,15 @@
|
||||
{
|
||||
"name": "@now/node-server",
|
||||
"version": "0.4.26",
|
||||
"version": "0.4.27-canary.5",
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"@now/node-bridge": "^0.1.10",
|
||||
"fs-extra": "7.0.1"
|
||||
"repository": {
|
||||
"type": "git",
|
||||
"url": "https://github.com/zeit/now-builders.git",
|
||||
"directory": "packages/now-node-server"
|
||||
},
|
||||
"peerDependencies": {
|
||||
"@now/build-utils": ">=0.0.1"
|
||||
"dependencies": {
|
||||
"@now/node-bridge": "^0.1.11-canary.0",
|
||||
"fs-extra": "7.0.1"
|
||||
},
|
||||
"scripts": {
|
||||
"test": "jest"
|
||||
|
||||
@@ -7,7 +7,7 @@ const {
|
||||
testDeployment,
|
||||
} = require('../../../test/lib/deployment/test-deployment.js');
|
||||
|
||||
jest.setTimeout(2 * 60 * 1000);
|
||||
jest.setTimeout(4 * 60 * 1000);
|
||||
const buildUtilsUrl = '@canary';
|
||||
let builderUrl;
|
||||
|
||||
|
||||
@@ -1,14 +1,14 @@
|
||||
const { createLambda } = require('@now/build-utils/lambda.js');
|
||||
const download = require('@now/build-utils/fs/download.js');
|
||||
const FileBlob = require('@now/build-utils/file-blob.js');
|
||||
const FileFsRef = require('@now/build-utils/file-fs-ref.js');
|
||||
const { createLambda } = require('@now/build-utils/lambda.js'); // eslint-disable-line import/no-extraneous-dependencies
|
||||
const download = require('@now/build-utils/fs/download.js'); // eslint-disable-line import/no-extraneous-dependencies
|
||||
const FileBlob = require('@now/build-utils/file-blob.js'); // eslint-disable-line import/no-extraneous-dependencies
|
||||
const FileFsRef = require('@now/build-utils/file-fs-ref.js'); // eslint-disable-line import/no-extraneous-dependencies
|
||||
const fs = require('fs-extra');
|
||||
const glob = require('@now/build-utils/fs/glob.js');
|
||||
const glob = require('@now/build-utils/fs/glob.js'); // eslint-disable-line import/no-extraneous-dependencies
|
||||
const path = require('path');
|
||||
const {
|
||||
runNpmInstall,
|
||||
runPackageJsonScript,
|
||||
} = require('@now/build-utils/fs/run-user-scripts.js');
|
||||
} = require('@now/build-utils/fs/run-user-scripts.js'); // eslint-disable-line import/no-extraneous-dependencies
|
||||
|
||||
/** @typedef { import('@now/build-utils/file-ref') } FileRef */
|
||||
/** @typedef {{[filePath: string]: FileRef}} Files */
|
||||
@@ -35,7 +35,7 @@ async function downloadInstallAndBundle(
|
||||
console.log('downloading user files...');
|
||||
const downloadedFiles = await download(files, userPath);
|
||||
|
||||
console.log('installing dependencies for user\'s code...');
|
||||
console.log("installing dependencies for user's code...");
|
||||
const entrypointFsDirname = path.join(userPath, path.dirname(entrypoint));
|
||||
await runNpmInstall(entrypointFsDirname, npmArguments);
|
||||
|
||||
@@ -44,8 +44,9 @@ async function downloadInstallAndBundle(
|
||||
{
|
||||
'package.json': new FileBlob({
|
||||
data: JSON.stringify({
|
||||
license: 'UNLICENSED',
|
||||
dependencies: {
|
||||
'@zeit/ncc': '0.11.0',
|
||||
'@zeit/ncc': '0.15.2',
|
||||
},
|
||||
}),
|
||||
}),
|
||||
@@ -61,7 +62,7 @@ async function downloadInstallAndBundle(
|
||||
async function compile(workNccPath, downloadedFiles, entrypoint) {
|
||||
const input = downloadedFiles[entrypoint].fsPath;
|
||||
const ncc = require(path.join(workNccPath, 'node_modules/@zeit/ncc'));
|
||||
const { code, assets } = await ncc(input);
|
||||
const { code, assets } = await ncc(input, { sourceMap: true });
|
||||
|
||||
const preparedFiles = {};
|
||||
const blob = new FileBlob({ data: code });
|
||||
|
||||
@@ -1,13 +1,15 @@
|
||||
{
|
||||
"name": "@now/node",
|
||||
"version": "0.4.28",
|
||||
"version": "0.4.29-canary.5",
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"@now/node-bridge": "^0.1.10",
|
||||
"fs-extra": "7.0.1"
|
||||
"repository": {
|
||||
"type": "git",
|
||||
"url": "https://github.com/zeit/now-builders.git",
|
||||
"directory": "packages/now-node"
|
||||
},
|
||||
"peerDependencies": {
|
||||
"@now/build-utils": ">=0.0.1"
|
||||
"dependencies": {
|
||||
"@now/node-bridge": "^0.1.11-canary.0",
|
||||
"fs-extra": "7.0.1"
|
||||
},
|
||||
"scripts": {
|
||||
"test": "jest"
|
||||
|
||||
@@ -7,7 +7,7 @@ const {
|
||||
testDeployment,
|
||||
} = require('../../../test/lib/deployment/test-deployment.js');
|
||||
|
||||
jest.setTimeout(2 * 60 * 1000);
|
||||
jest.setTimeout(4 * 60 * 1000);
|
||||
const buildUtilsUrl = '@canary';
|
||||
let builderUrl;
|
||||
|
||||
|
||||
@@ -1,4 +1,4 @@
|
||||
const FileBlob = require('@now/build-utils/file-blob.js');
|
||||
const FileBlob = require('@now/build-utils/file-blob.js'); // eslint-disable-line import/no-extraneous-dependencies
|
||||
const OptiPng = require('optipng');
|
||||
const pipe = require('multipipe');
|
||||
|
||||
|
||||
@@ -1,12 +1,14 @@
|
||||
{
|
||||
"name": "@now/optipng",
|
||||
"version": "0.4.8",
|
||||
"version": "0.4.9-canary.0",
|
||||
"license": "MIT",
|
||||
"repository": {
|
||||
"type": "git",
|
||||
"url": "https://github.com/zeit/now-builders.git",
|
||||
"directory": "packages/now-optipng"
|
||||
},
|
||||
"dependencies": {
|
||||
"multipipe": "2.0.3",
|
||||
"optipng": "1.1.0"
|
||||
},
|
||||
"peerDependencies": {
|
||||
"@now/build-utils": ">=0.0.1"
|
||||
}
|
||||
}
|
||||
|
||||
@@ -1,23 +1,38 @@
|
||||
const FileBlob = require('@now/build-utils/file-blob.js');
|
||||
const FileFsRef = require('@now/build-utils/file-fs-ref.js');
|
||||
const glob = require('@now/build-utils/fs/glob.js');
|
||||
const FileBlob = require('@now/build-utils/file-blob.js'); // eslint-disable-line import/no-extraneous-dependencies
|
||||
const FileFsRef = require('@now/build-utils/file-fs-ref.js'); // eslint-disable-line import/no-extraneous-dependencies
|
||||
const glob = require('@now/build-utils/fs/glob.js'); // eslint-disable-line import/no-extraneous-dependencies
|
||||
const path = require('path');
|
||||
|
||||
async function getFiles() {
|
||||
const files = await glob('native/**', __dirname);
|
||||
|
||||
const phpConfig = await FileBlob.fromStream({ stream: files['native/php.ini'].toStream() });
|
||||
phpConfig.data = phpConfig.data.toString()
|
||||
const phpConfig = await FileBlob.fromStream({
|
||||
stream: files['native/php.ini'].toStream(),
|
||||
});
|
||||
phpConfig.data = phpConfig.data
|
||||
.toString()
|
||||
.replace(/\/root\/app\/modules/g, '/var/task/native/modules');
|
||||
files['native/php.ini'] = phpConfig;
|
||||
|
||||
Object.assign(files, {
|
||||
'fastcgi/connection.js': new FileFsRef({ fsPath: require.resolve('fastcgi-client/lib/connection.js') }),
|
||||
'fastcgi/consts.js': new FileFsRef({ fsPath: require.resolve('fastcgi-client/lib/consts.js') }),
|
||||
'fastcgi/stringifykv.js': new FileFsRef({ fsPath: require.resolve('fastcgi-client/lib/stringifykv.js') }),
|
||||
'fastcgi/index.js': new FileFsRef({ fsPath: path.join(__dirname, 'fastcgi/index.js') }),
|
||||
'fastcgi/port.js': new FileFsRef({ fsPath: path.join(__dirname, 'fastcgi/port.js') }),
|
||||
'launcher.js': new FileFsRef({ fsPath: path.join(__dirname, 'launcher.js') }),
|
||||
'fastcgi/connection.js': new FileFsRef({
|
||||
fsPath: require.resolve('fastcgi-client/lib/connection.js'),
|
||||
}),
|
||||
'fastcgi/consts.js': new FileFsRef({
|
||||
fsPath: require.resolve('fastcgi-client/lib/consts.js'),
|
||||
}),
|
||||
'fastcgi/stringifykv.js': new FileFsRef({
|
||||
fsPath: require.resolve('fastcgi-client/lib/stringifykv.js'),
|
||||
}),
|
||||
'fastcgi/index.js': new FileFsRef({
|
||||
fsPath: path.join(__dirname, 'fastcgi/index.js'),
|
||||
}),
|
||||
'fastcgi/port.js': new FileFsRef({
|
||||
fsPath: path.join(__dirname, 'fastcgi/port.js'),
|
||||
}),
|
||||
'launcher.js': new FileFsRef({
|
||||
fsPath: path.join(__dirname, 'launcher.js'),
|
||||
}),
|
||||
});
|
||||
|
||||
return files;
|
||||
|
||||
@@ -1,11 +1,13 @@
{
  "name": "@now/php-bridge",
  "version": "0.4.13",
  "version": "0.4.14-canary.0",
  "license": "MIT",
  "repository": {
    "type": "git",
    "url": "https://github.com/zeit/now-builders.git",
    "directory": "packages/now-php-bridge"
  },
  "dependencies": {
    "fastcgi-client": "0.0.1"
  },
  "peerDependencies": {
    "@now/build-utils": ">=0.0.1"
  }
}
@@ -1,6 +1,6 @@
const { createLambda } = require('@now/build-utils/lambda.js');
const { createLambda } = require('@now/build-utils/lambda.js'); // eslint-disable-line import/no-extraneous-dependencies
const path = require('path');
const rename = require('@now/build-utils/fs/rename.js');
const rename = require('@now/build-utils/fs/rename.js'); // eslint-disable-line import/no-extraneous-dependencies
const { getFiles } = require('@now/php-bridge');

exports.config = {
@@ -1,12 +1,14 @@
{
  "name": "@now/php",
  "version": "0.4.13",
  "version": "0.4.14-canary.1",
  "license": "MIT",
  "dependencies": {
    "@now/php-bridge": "^0.4.13"
  "repository": {
    "type": "git",
    "url": "https://github.com/zeit/now-builders.git",
    "directory": "packages/now-php"
  },
  "peerDependencies": {
    "@now/build-utils": ">=0.0.1"
  "dependencies": {
    "@now/php-bridge": "^0.4.14-canary.0"
  },
  "scripts": {
    "test": "jest"
@@ -7,7 +7,7 @@ const {
  testDeployment,
} = require('../../../test/lib/deployment/test-deployment.js');

jest.setTimeout(2 * 60 * 1000);
jest.setTimeout(4 * 60 * 1000);
const buildUtilsUrl = '@canary';
let builderUrl;
@@ -3,7 +3,7 @@ const fetch = require('node-fetch');
const execa = require('execa');
const { createWriteStream } = require('fs');

const getWritableDirectory = require('@now/build-utils/fs/get-writable-directory.js');
const getWritableDirectory = require('@now/build-utils/fs/get-writable-directory.js'); // eslint-disable-line import/no-extraneous-dependencies

const url = 'https://bootstrap.pypa.io/get-pip.py';
@@ -1,10 +1,10 @@
const path = require('path');
const execa = require('execa');
const { readFile, writeFile } = require('fs.promised');
const getWritableDirectory = require('@now/build-utils/fs/get-writable-directory.js');
const download = require('@now/build-utils/fs/download.js');
const glob = require('@now/build-utils/fs/glob.js');
const { createLambda } = require('@now/build-utils/lambda.js');
const getWritableDirectory = require('@now/build-utils/fs/get-writable-directory.js'); // eslint-disable-line import/no-extraneous-dependencies
const download = require('@now/build-utils/fs/download.js'); // eslint-disable-line import/no-extraneous-dependencies
const glob = require('@now/build-utils/fs/glob.js'); // eslint-disable-line import/no-extraneous-dependencies
const { createLambda } = require('@now/build-utils/lambda.js'); // eslint-disable-line import/no-extraneous-dependencies
const downloadAndInstallPip = require('./download-and-install-pip');

async function pipInstall(pipPath, srcDir, ...args) {
@@ -40,8 +40,16 @@ exports.build = async ({ files, entrypoint }) => {

  await pipInstall(pipPath, srcDir, 'requests');

  if (files['requirements.txt']) {
    console.log('found "requirements.txt"');
  const entryDirectory = path.dirname(entrypoint);
  const requirementsTxt = path.join(entryDirectory, 'requirements.txt');

  if (files[requirementsTxt]) {
    console.log('found local "requirements.txt"');

    const requirementsTxtPath = files[requirementsTxt].fsPath;
    await pipInstall(pipPath, srcDir, '-r', requirementsTxtPath);
  } else if (files['requirements.txt']) {
    console.log('found global "requirements.txt"');

    const requirementsTxtPath = files['requirements.txt'].fsPath;
    await pipInstall(pipPath, srcDir, '-r', requirementsTxtPath);
@@ -12,12 +12,20 @@ def now_handler(event, context):
    path = payload['path']
    headers = payload['headers']
    method = payload['method']

    res = requests.request(method, 'http://0.0.0.0:3000' + path, headers=headers)
    encoding = payload.get('encoding')
    body = payload.get('body')

    if (
        (body is not None and len(body) > 0) and
        (encoding is not None and encoding == 'base64')
    ):
        body = base64.b64decode(body)

    res = requests.request(method, 'http://0.0.0.0:3000' + path,
                           headers=headers, data=body, allow_redirects=False)

    return {
        'statusCode': res.status_code,
        'headers': dict(res.headers),
        'body': res.text
    }
@@ -1,14 +1,16 @@
{
  "name": "@now/python",
  "version": "0.0.40",
  "version": "0.0.41-canary.1",
  "main": "index.js",
  "license": "MIT",
  "repository": {
    "type": "git",
    "url": "https://github.com/zeit/now-builders.git",
    "directory": "packages/now-python"
  },
  "dependencies": {
    "execa": "^1.0.0",
    "fs.promised": "^3.0.0",
    "node-fetch": "^2.2.0"
  },
  "peerDependencies": {
    "@now/build-utils": ">=0.0.1"
  }
}
packages/now-rust/Cargo.lock (generated, 1101 lines): file diff suppressed because it is too large

packages/now-rust/Cargo.toml (new file, 25 lines)
@@ -0,0 +1,25 @@
[package]
name = "now_lambda"
version = "0.1.2"
authors = ["Antonio Nuno Monteiro <anmonteiro@gmail.com>"]
edition = "2018"
description = "Rust bindings for Now.sh Lambdas"
keywords = ["AWS", "Lambda", "Zeit", "Now", "Rust"]
license = "MIT"
homepage = "https://github.com/zeit/now-builders"
repository = "https://github.com/zeit/now-builders"
documentation = "https://docs.rs/now_lambda"
include = [
  "src/*.rs",
  "Cargo.toml"
]

[dependencies]
serde = "^1"
serde_json = "^1"
serde_derive = "^1"
http = "0.1"
tokio = "^0.1"
base64 = "0.10"
log = "^0.4"
lambda_runtime = "0.2.0"
packages/now-rust/download-install-rust-toolchain.js (new file, 84 lines)
@@ -0,0 +1,84 @@
|
||||
const tar = require('tar');
|
||||
const fetch = require('node-fetch');
|
||||
const execa = require('execa');
|
||||
|
||||
const rustUrl = 'https://dmmcy0pwk6bqi.cloudfront.net/rust.tar.gz';
|
||||
const ccUrl = 'https://dmmcy0pwk6bqi.cloudfront.net/gcc-4.8.5.tgz';
|
||||
|
||||
async function downloadRustToolchain() {
|
||||
console.log('downloading the rust toolchain');
|
||||
const res = await fetch(rustUrl);
|
||||
|
||||
if (!res.ok) {
|
||||
throw new Error(`Failed to download: ${rustUrl}`);
|
||||
}
|
||||
|
||||
const { HOME } = process.env;
|
||||
return new Promise((resolve, reject) => {
|
||||
res.body
|
||||
.on('error', reject)
|
||||
.pipe(tar.extract({ gzip: true, cwd: HOME }))
|
||||
.on('finish', () => resolve());
|
||||
});
|
||||
}
|
||||
|
||||
async function downloadGCC() {
|
||||
console.log('downloading GCC');
|
||||
const res = await fetch(ccUrl);
|
||||
|
||||
if (!res.ok) {
|
||||
throw new Error(`Failed to download: ${ccUrl}`);
|
||||
}
|
||||
|
||||
return new Promise((resolve, reject) => {
|
||||
res.body
|
||||
.on('error', reject)
|
||||
// NOTE(anmonteiro): We pipe GCC into `/tmp` instead of getting a writable
|
||||
// directory from `@now/build-utils` because the GCC distribution that we
|
||||
// use is specifically packaged for AWS Lambda (where `/tmp` is writable)
|
||||
// and contains several hardcoded symlinks to paths in `/tmp`.
|
||||
.pipe(tar.extract({ gzip: true, cwd: '/tmp' }))
|
||||
.on('finish', async () => {
|
||||
const { LD_LIBRARY_PATH } = process.env;
|
||||
// Set the environment variables as per
|
||||
// https://github.com/lambci/lambci/blob/e6c9c7/home/init/gcc#L14-L17
|
||||
const newEnv = {
|
||||
PATH: '/tmp/bin:/tmp/sbin',
|
||||
LD_LIBRARY_PATH: `/tmp/lib:/tmp/lib64:${LD_LIBRARY_PATH}`,
|
||||
CPATH: '/tmp/include',
|
||||
LIBRARY_PATH: '/tmp/lib',
|
||||
};
|
||||
|
||||
return resolve(newEnv);
|
||||
});
|
||||
});
|
||||
}
|
||||
|
||||
async function installOpenSSL() {
|
||||
console.log('installing openssl-devel...');
|
||||
try {
|
||||
// need to downgrade otherwise yum can't resolve the dependencies given
|
||||
// a later version is already installed in the machine.
|
||||
await execa(
|
||||
'yum',
|
||||
['downgrade', '-y', 'krb5-libs-1.14.1-27.41.amzn1.x86_64'],
|
||||
{
|
||||
stdio: 'inherit',
|
||||
},
|
||||
);
|
||||
await execa('yum', ['install', '-y', 'openssl-devel'], {
|
||||
stdio: 'inherit',
|
||||
});
|
||||
} catch (err) {
|
||||
console.error('failed to `yum install -y openssl-devel`');
|
||||
throw err;
|
||||
}
|
||||
}
|
||||
|
||||
module.exports = async () => {
|
||||
await downloadRustToolchain();
|
||||
const newEnv = await downloadGCC();
|
||||
await installOpenSSL();
|
||||
|
||||
return newEnv;
|
||||
};
|
||||
packages/now-rust/index.js (new file, 270 lines)
@@ -0,0 +1,270 @@
|
||||
const fs = require('fs-extra');
|
||||
const path = require('path');
|
||||
const execa = require('execa');
|
||||
const toml = require('@iarna/toml');
|
||||
const { createLambda } = require('@now/build-utils/lambda.js'); // eslint-disable-line import/no-extraneous-dependencies
|
||||
const download = require('@now/build-utils/fs/download.js'); // eslint-disable-line import/no-extraneous-dependencies
|
||||
const glob = require('@now/build-utils/fs/glob.js'); // eslint-disable-line import/no-extraneous-dependencies
|
||||
const FileFsRef = require('@now/build-utils/file-fs-ref.js'); // eslint-disable-line import/no-extraneous-dependencies
|
||||
const installRustAndGCC = require('./download-install-rust-toolchain.js');
|
||||
|
||||
exports.config = {
|
||||
maxLambdaSize: '25mb',
|
||||
};
|
||||
|
||||
async function inferCargoBinaries(config) {
|
||||
try {
|
||||
const { stdout: manifestStr } = await execa(
|
||||
'cargo',
|
||||
['read-manifest'],
|
||||
config,
|
||||
);
|
||||
|
||||
const { targets } = JSON.parse(manifestStr);
|
||||
|
||||
return targets
|
||||
.filter(({ kind }) => kind.includes('bin'))
|
||||
.map(({ name }) => name);
|
||||
} catch (err) {
|
||||
console.error('failed to run `cargo read-manifest`');
|
||||
throw err;
|
||||
}
|
||||
}
|
||||
|
||||
async function parseTOMLStream(stream) {
|
||||
return toml.parse.stream(stream);
|
||||
}
|
||||
|
||||
async function buildWholeProject({
|
||||
entrypoint,
|
||||
downloadedFiles,
|
||||
rustEnv,
|
||||
config,
|
||||
}) {
|
||||
const entrypointDirname = path.dirname(downloadedFiles[entrypoint].fsPath);
|
||||
const { debug } = config;
|
||||
console.log('running `cargo build`...');
|
||||
try {
|
||||
await execa('cargo', ['build'].concat(debug ? [] : ['--release']), {
|
||||
env: rustEnv,
|
||||
cwd: entrypointDirname,
|
||||
stdio: 'inherit',
|
||||
});
|
||||
} catch (err) {
|
||||
console.error('failed to `cargo build`');
|
||||
throw err;
|
||||
}
|
||||
|
||||
const targetPath = path.join(
|
||||
entrypointDirname,
|
||||
'target',
|
||||
debug ? 'debug' : 'release',
|
||||
);
|
||||
const binaries = await inferCargoBinaries({
|
||||
env: rustEnv,
|
||||
cwd: entrypointDirname,
|
||||
});
|
||||
|
||||
const lambdas = {};
|
||||
const lambdaPath = path.dirname(entrypoint);
|
||||
await Promise.all(
|
||||
binaries.map(async (binary) => {
|
||||
const fsPath = path.join(targetPath, binary);
|
||||
const lambda = await createLambda({
|
||||
files: {
|
||||
bootstrap: new FileFsRef({ mode: 0o755, fsPath }),
|
||||
},
|
||||
handler: 'bootstrap',
|
||||
runtime: 'provided',
|
||||
});
|
||||
|
||||
lambdas[path.join(lambdaPath, binary)] = lambda;
|
||||
}),
|
||||
);
|
||||
|
||||
return lambdas;
|
||||
}
|
||||
|
||||
async function cargoLocateProject(config) {
|
||||
try {
|
||||
const { stdout: projectDescriptionStr } = await execa(
|
||||
'cargo',
|
||||
['locate-project'],
|
||||
config,
|
||||
);
|
||||
const projectDescription = JSON.parse(projectDescriptionStr);
|
||||
if (projectDescription != null && projectDescription.root != null) {
|
||||
return projectDescription.root;
|
||||
}
|
||||
} catch (e) {
|
||||
if (!/could not find/g.test(e.stderr)) {
|
||||
console.error("Couldn't run `cargo locate-project`");
|
||||
throw e;
|
||||
}
|
||||
}
|
||||
|
||||
return null;
|
||||
}
|
||||
|
||||
async function buildSingleFile({
|
||||
workPath,
|
||||
entrypoint,
|
||||
downloadedFiles,
|
||||
rustEnv,
|
||||
config,
|
||||
}) {
|
||||
console.log('building single file');
|
||||
const launcherPath = path.join(__dirname, 'launcher.rs');
|
||||
let launcherData = await fs.readFile(launcherPath, 'utf8');
|
||||
|
||||
const entrypointPath = downloadedFiles[entrypoint].fsPath;
|
||||
const entrypointDirname = path.dirname(entrypointPath);
|
||||
launcherData = launcherData.replace(
|
||||
'// PLACEHOLDER',
|
||||
await fs.readFile(path.join(workPath, entrypoint)),
|
||||
);
|
||||
// replace the entrypoint with one that includes the imports + lambda.start
|
||||
await fs.remove(entrypointPath);
|
||||
await fs.writeFile(entrypointPath, launcherData);
|
||||
|
||||
// Find a Cargo.toml file or TODO: create one
|
||||
const cargoTomlFile = await cargoLocateProject({
|
||||
env: rustEnv,
|
||||
cwd: entrypointDirname,
|
||||
});
|
||||
|
||||
// TODO: we're assuming there's a Cargo.toml file. We need to create one
|
||||
// otherwise
|
||||
let cargoToml;
|
||||
try {
|
||||
cargoToml = await parseTOMLStream(fs.createReadStream(cargoTomlFile));
|
||||
} catch (err) {
|
||||
console.error('Failed to parse TOML from entrypoint:', entrypoint);
|
||||
throw err;
|
||||
}
|
||||
|
||||
const binName = path
|
||||
.basename(entrypointPath)
|
||||
.replace(path.extname(entrypointPath), '');
|
||||
const { package: pkg, dependencies } = cargoToml;
|
||||
// default to latest now_lambda
|
||||
dependencies.now_lambda = '*';
|
||||
const tomlToWrite = toml.stringify({
|
||||
package: pkg,
|
||||
dependencies,
|
||||
bin: [
|
||||
{
|
||||
name: binName,
|
||||
path: entrypointPath,
|
||||
},
|
||||
],
|
||||
});
|
||||
console.log('toml to write:', tomlToWrite);
|
||||
|
||||
// Overwrite the Cargo.toml file with one that includes the `now_lambda`
|
||||
// dependency and our binary. `dependencies` is a map so we don't run the
|
||||
// risk of having 2 `now_lambda`s in there.
|
||||
await fs.writeFile(cargoTomlFile, tomlToWrite);
|
||||
|
||||
const { debug } = config;
|
||||
console.log('running `cargo build`...');
|
||||
try {
|
||||
await execa(
|
||||
'cargo',
|
||||
['build', '--bin', binName].concat(debug ? [] : ['--release']),
|
||||
{
|
||||
env: rustEnv,
|
||||
cwd: entrypointDirname,
|
||||
stdio: 'inherit',
|
||||
},
|
||||
);
|
||||
} catch (err) {
|
||||
console.error('failed to `cargo build`');
|
||||
throw err;
|
||||
}
|
||||
|
||||
const bin = path.join(
|
||||
path.dirname(cargoTomlFile),
|
||||
'target',
|
||||
debug ? 'debug' : 'release',
|
||||
binName,
|
||||
);
|
||||
|
||||
const lambda = await createLambda({
|
||||
files: {
|
||||
bootstrap: new FileFsRef({ mode: 0o755, fsPath: bin }),
|
||||
},
|
||||
handler: 'bootstrap',
|
||||
runtime: 'provided',
|
||||
});
|
||||
|
||||
return {
|
||||
[entrypoint]: lambda,
|
||||
};
|
||||
}
|
||||
|
||||
exports.build = async (m) => {
|
||||
const { files, entrypoint, workPath } = m;
|
||||
console.log('downloading files');
|
||||
const downloadedFiles = await download(files, workPath);
|
||||
|
||||
const { PATH: toolchainPath, ...otherEnv } = await installRustAndGCC();
|
||||
const { PATH, HOME } = process.env;
|
||||
const rustEnv = {
|
||||
...process.env,
|
||||
...otherEnv,
|
||||
PATH: `${path.join(HOME, '.cargo/bin')}:${toolchainPath}:${PATH}`,
|
||||
};
|
||||
|
||||
const newM = Object.assign(m, { downloadedFiles, rustEnv });
|
||||
if (path.extname(entrypoint) === '.toml') {
|
||||
return buildWholeProject(newM);
|
||||
}
|
||||
return buildSingleFile(newM);
|
||||
};
|
||||
|
||||
exports.prepareCache = async ({ cachePath, entrypoint, workPath }) => {
|
||||
console.log('preparing cache...');
|
||||
|
||||
let targetFolderDir;
|
||||
if (path.extname(entrypoint) === '.toml') {
|
||||
targetFolderDir = path.dirname(path.join(workPath, entrypoint));
|
||||
} else {
|
||||
const { PATH, HOME } = process.env;
|
||||
const rustEnv = {
|
||||
...process.env,
|
||||
PATH: `${path.join(HOME, '.cargo/bin')}:${PATH}`,
|
||||
};
|
||||
const entrypointDirname = path.dirname(path.join(workPath, entrypoint));
|
||||
const cargoTomlFile = await cargoLocateProject({
|
||||
env: rustEnv,
|
||||
cwd: entrypointDirname,
|
||||
});
|
||||
|
||||
if (cargoTomlFile != null) {
|
||||
targetFolderDir = path.dirname(cargoTomlFile);
|
||||
} else {
|
||||
// `Cargo.toml` doesn't exist, in `build` we put it in the same
|
||||
// path as the entrypoint.
|
||||
targetFolderDir = path.dirname(path.join(workPath, entrypoint));
|
||||
}
|
||||
}
|
||||
|
||||
const cacheEntrypointDirname = path.join(
|
||||
cachePath,
|
||||
path.relative(workPath, targetFolderDir),
|
||||
);
|
||||
|
||||
// Remove the target folder to avoid 'directory already exists' errors
|
||||
fs.removeSync(path.join(cacheEntrypointDirname, 'target'));
|
||||
fs.mkdirpSync(cacheEntrypointDirname);
|
||||
// Move the target folder to the cache location
|
||||
fs.renameSync(
|
||||
path.join(targetFolderDir, 'target'),
|
||||
path.join(cacheEntrypointDirname, 'target'),
|
||||
);
|
||||
|
||||
return {
|
||||
...(await glob('**/**', path.join(cachePath))),
|
||||
};
|
||||
};
|
||||
packages/now-rust/launcher.rs (new file, 8 lines)
@@ -0,0 +1,8 @@
use now_lambda::lambda;
use std::error::Error;

// PLACEHOLDER

fn main() -> Result<(), Box<dyn Error>> {
    Ok(lambda!(handler))
}
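Note: the // PLACEHOLDER above is replaced at build time with the contents of the user's entrypoint (see buildSingleFile in packages/now-rust/index.js). An entrypoint that fits this launcher might look roughly like the following sketch; the handler name and response body are illustrative only and not part of this diff:

use now_lambda::{error::NowError, Request, Response};

// Echo the requested path back to the caller as plain text.
fn handler(request: Request) -> Result<Response<String>, NowError> {
    let response = Response::builder()
        .status(200)
        .header("Content-Type", "text/plain")
        .body(format!("You asked for {}", request.uri().path()))
        .expect("failed to build response");

    Ok(response)
}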
packages/now-rust/package.json (new file, 22 lines)
@@ -0,0 +1,22 @@
{
  "name": "@now/rust",
  "version": "0.0.3-canary.2",
  "license": "MIT",
  "repository": {
    "type": "git",
    "url": "https://github.com/zeit/now-builders.git",
    "directory": "packages/now-rust"
  },
  "files": [
    "index.js",
    "download-install-rust-toolchain.js",
    "launcher.rs"
  ],
  "dependencies": {
    "@iarna/toml": "^2.2.1",
    "execa": "^1.0.0",
    "fs-extra": "^7.0.1",
    "node-fetch": "^2.3.0",
    "tar": "^4.4.8"
  }
}
packages/now-rust/src/body.rs (new file, 196 lines)
@@ -0,0 +1,196 @@
|
||||
//! Provides a Now Lambda oriented request and response body entity interface
|
||||
|
||||
use std::{borrow::Cow, ops::Deref, str};
|
||||
|
||||
use base64::display::Base64Display;
|
||||
use serde::ser::{Error as SerError, Serialize, Serializer};
|
||||
|
||||
/// Representation of http request and response bodies as supported
|
||||
/// by Zeit Now v2.
|
||||
///
|
||||
/// These come in three flavors
|
||||
/// * `Empty` ( no body )
|
||||
/// * `Text` ( text data )
|
||||
/// * `Binary` ( binary data )
|
||||
///
|
||||
/// Body types can be `Deref` and `AsRef`'d into `[u8]` types much like the `hyper` crate
|
||||
///
|
||||
/// # Examples
|
||||
///
|
||||
/// Body types are inferred with `From` implementations.
|
||||
///
|
||||
/// ## Text
|
||||
///
|
||||
/// Types like `String`, `str` whose type reflects
|
||||
/// text produce `Body::Text` variants
|
||||
///
|
||||
/// ```
|
||||
/// assert!(match now_lambda::Body::from("text") {
|
||||
/// now_lambda::Body::Text(_) => true,
|
||||
/// _ => false
|
||||
/// })
|
||||
/// ```
|
||||
///
|
||||
/// ## Binary
|
||||
///
|
||||
/// Types like `Vec<u8>` and `&[u8]` whose types reflect raw bytes produce `Body::Binary` variants
|
||||
///
|
||||
/// ```
|
||||
/// assert!(match now_lambda::Body::from("text".as_bytes()) {
|
||||
/// now_lambda::Body::Binary(_) => true,
|
||||
/// _ => false
|
||||
/// })
|
||||
/// ```
|
||||
///
|
||||
/// `Binary` responses bodies will automatically get base64 encoded.
|
||||
///
|
||||
/// ## Empty
|
||||
///
|
||||
/// The unit type ( `()` ) whose type represents an empty value produces `Body::Empty` variants
|
||||
///
|
||||
/// ```
|
||||
/// assert!(match now_lambda::Body::from(()) {
|
||||
/// now_lambda::Body::Empty => true,
|
||||
/// _ => false
|
||||
/// })
|
||||
/// ```
|
||||
#[derive(Debug, PartialEq)]
|
||||
pub enum Body {
|
||||
/// An empty body
|
||||
Empty,
|
||||
/// A body containing string data
|
||||
Text(String),
|
||||
/// A body containing binary data
|
||||
Binary(Vec<u8>),
|
||||
}
|
||||
|
||||
impl Default for Body {
|
||||
fn default() -> Self {
|
||||
Body::Empty
|
||||
}
|
||||
}
|
||||
|
||||
impl From<()> for Body {
|
||||
fn from(_: ()) -> Self {
|
||||
Body::Empty
|
||||
}
|
||||
}
|
||||
|
||||
impl From<Body> for () {
|
||||
fn from(_: Body) -> Self {
|
||||
()
|
||||
}
|
||||
}
|
||||
|
||||
impl<'a> From<&'a str> for Body {
|
||||
fn from(s: &'a str) -> Self {
|
||||
Body::Text(s.into())
|
||||
}
|
||||
}
|
||||
|
||||
impl From<String> for Body {
|
||||
fn from(b: String) -> Self {
|
||||
Body::Text(b)
|
||||
}
|
||||
}
|
||||
impl From<Body> for String {
|
||||
fn from(b: Body) -> String {
|
||||
match b {
|
||||
Body::Empty => String::from(""),
|
||||
Body::Text(t) => t,
|
||||
Body::Binary(b) => str::from_utf8(&b).unwrap().to_owned(),
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
impl From<Cow<'static, str>> for Body {
|
||||
#[inline]
|
||||
fn from(cow: Cow<'static, str>) -> Body {
|
||||
match cow {
|
||||
Cow::Borrowed(b) => Body::from(b.to_owned()),
|
||||
Cow::Owned(o) => Body::from(o),
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
impl From<Body> for Cow<'static, str> {
|
||||
#[inline]
|
||||
fn from(b: Body) -> Cow<'static, str> {
|
||||
Cow::Owned(String::from(b))
|
||||
}
|
||||
}
|
||||
|
||||
impl From<Cow<'static, [u8]>> for Body {
|
||||
#[inline]
|
||||
fn from(cow: Cow<'static, [u8]>) -> Body {
|
||||
match cow {
|
||||
Cow::Borrowed(b) => Body::from(b),
|
||||
Cow::Owned(o) => Body::from(o),
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
impl From<Body> for Cow<'static, [u8]> {
|
||||
#[inline]
|
||||
fn from(b: Body) -> Self {
|
||||
Cow::Owned(b.as_ref().to_owned())
|
||||
}
|
||||
}
|
||||
|
||||
impl From<Vec<u8>> for Body {
|
||||
fn from(b: Vec<u8>) -> Self {
|
||||
Body::Binary(b)
|
||||
}
|
||||
}
|
||||
|
||||
impl From<Body> for Vec<u8> {
|
||||
fn from(b: Body) -> Self {
|
||||
match b {
|
||||
Body::Empty => "".as_bytes().to_owned(),
|
||||
Body::Text(t) => t.into_bytes(),
|
||||
Body::Binary(b) => b.to_owned(),
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
impl<'a> From<&'a [u8]> for Body {
|
||||
fn from(b: &'a [u8]) -> Self {
|
||||
Body::Binary(b.to_vec())
|
||||
}
|
||||
}
|
||||
|
||||
impl Deref for Body {
|
||||
type Target = [u8];
|
||||
|
||||
#[inline]
|
||||
fn deref(&self) -> &Self::Target {
|
||||
self.as_ref()
|
||||
}
|
||||
}
|
||||
|
||||
impl AsRef<[u8]> for Body {
|
||||
#[inline]
|
||||
fn as_ref(&self) -> &[u8] {
|
||||
match self {
|
||||
Body::Empty => &[],
|
||||
Body::Text(ref bytes) => bytes.as_ref(),
|
||||
Body::Binary(ref bytes) => bytes.as_ref(),
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
impl<'a> Serialize for Body {
|
||||
fn serialize<S>(&self, serializer: S) -> Result<S::Ok, S::Error>
|
||||
where
|
||||
S: Serializer,
|
||||
{
|
||||
match self {
|
||||
Body::Text(data) => serializer
|
||||
.serialize_str(::std::str::from_utf8(data.as_ref()).map_err(S::Error::custom)?),
|
||||
Body::Binary(data) => {
|
||||
serializer.collect_str(&Base64Display::with_config(data, base64::STANDARD))
|
||||
}
|
||||
Body::Empty => serializer.serialize_unit(),
|
||||
}
|
||||
}
|
||||
}
|
||||
packages/now-rust/src/error.rs (new file, 44 lines)
@@ -0,0 +1,44 @@
|
||||
use http;
|
||||
use lambda_runtime::error::LambdaErrorExt;
|
||||
use std::{error::Error, fmt};
|
||||
|
||||
/// This module implements a custom error currently over the AWS Lambda runtime,
|
||||
/// which can be extended later to support more service providers.
|
||||
#[derive(Debug)]
|
||||
pub struct NowError {
|
||||
msg: String,
|
||||
}
|
||||
impl NowError {
|
||||
pub fn new(message: &str) -> NowError {
|
||||
NowError {
|
||||
msg: message.to_owned(),
|
||||
}
|
||||
}
|
||||
}
|
||||
impl fmt::Display for NowError {
|
||||
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
|
||||
write!(f, "{}", self.msg)
|
||||
}
|
||||
}
|
||||
|
||||
impl Error for NowError {}
|
||||
|
||||
impl From<std::num::ParseIntError> for NowError {
|
||||
fn from(i: std::num::ParseIntError) -> Self {
|
||||
NowError::new(&format!("{}", i))
|
||||
}
|
||||
}
|
||||
|
||||
impl From<http::Error> for NowError {
|
||||
fn from(i: http::Error) -> Self {
|
||||
NowError::new(&format!("{}", i))
|
||||
}
|
||||
}
|
||||
|
||||
// the value returned by the error_type function is included as the
|
||||
// `errorType` in the AWS Lambda response
|
||||
impl LambdaErrorExt for NowError {
|
||||
fn error_type(&self) -> &str {
|
||||
"NowError"
|
||||
}
|
||||
}
|
||||
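Note: since the handler's error type only needs to convert into NowError, handlers can also use NowError directly. A minimal illustrative sketch (not part of this diff; the handler name and message are assumptions):

use now_lambda::{error::NowError, Request, Response};

// Reject requests without a query string, surfacing the failure as a NowError,
// which becomes the Lambda errorType "NowError" via LambdaErrorExt above.
fn handler(request: Request) -> Result<Response<String>, NowError> {
    let query = request
        .uri()
        .query()
        .ok_or_else(|| NowError::new("missing query string"))?;

    Ok(Response::new(query.to_string()))
}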
packages/now-rust/src/lib.rs (new file, 91 lines)
@@ -0,0 +1,91 @@
|
||||
pub use http::{self, Response};
|
||||
use lambda_runtime::{self as lambda, Context};
|
||||
use log::{self, debug, error};
|
||||
use serde_json::Error;
|
||||
use tokio::runtime::Runtime as TokioRuntime;
|
||||
|
||||
mod body;
|
||||
pub mod error;
|
||||
pub mod request;
|
||||
mod response;
|
||||
mod strmap;
|
||||
|
||||
pub use crate::{body::Body, response::IntoResponse, strmap::StrMap};
|
||||
use crate::{
|
||||
error::NowError,
|
||||
request::{NowEvent, NowRequest},
|
||||
response::NowResponse,
|
||||
};
|
||||
|
||||
/// Type alias for `http::Request`s with a fixed `now_lambda::Body` body
|
||||
pub type Request = http::Request<Body>;
|
||||
|
||||
/// Functions acting as Now Lambda handlers must conform to this type.
|
||||
pub trait Handler<R, B, E> {
|
||||
/// Method to execute the handler function
|
||||
fn run(&mut self, event: http::Request<B>) -> Result<R, E>;
|
||||
}
|
||||
|
||||
impl<Function, R, B, E> Handler<R, B, E> for Function
|
||||
where
|
||||
Function: FnMut(http::Request<B>) -> Result<R, E>,
|
||||
{
|
||||
fn run(&mut self, event: http::Request<B>) -> Result<R, E> {
|
||||
(*self)(event)
|
||||
}
|
||||
}
|
||||
|
||||
/// Creates a new `lambda_runtime::Runtime` and begins polling for Now Lambda events
|
||||
///
|
||||
/// # Arguments
|
||||
///
|
||||
/// * `f` A type that conforms to the `Handler` interface.
|
||||
///
|
||||
/// # Panics
|
||||
/// The function panics if the Lambda environment variables are not set.
|
||||
pub fn start<R, B, E>(f: impl Handler<R, B, E>, runtime: Option<TokioRuntime>)
|
||||
where
|
||||
B: From<Body>,
|
||||
E: Into<NowError>,
|
||||
R: IntoResponse,
|
||||
{
|
||||
// handler requires a mutable ref
|
||||
let mut func = f;
|
||||
lambda::start(
|
||||
|e: NowEvent, _ctx: Context| {
|
||||
let req_str = e.body;
|
||||
let parse_result: Result<NowRequest, Error> = serde_json::from_str(&req_str);
|
||||
match parse_result {
|
||||
Ok(req) => {
|
||||
debug!("Deserialized Now proxy request successfully");
|
||||
let request: http::Request<Body> = req.into();
|
||||
func.run(request.map(|b| b.into()))
|
||||
.map(|resp| NowResponse::from(resp.into_response()))
|
||||
.map_err(|e| e.into())
|
||||
}
|
||||
Err(e) => {
|
||||
error!("Could not deserialize event body to NowRequest {}", e);
|
||||
panic!("Could not deserialize event body to NowRequest {}", e);
|
||||
}
|
||||
}
|
||||
},
|
||||
runtime,
|
||||
)
|
||||
}
|
||||
|
||||
/// A macro for starting new handler's poll for Now Lambda events
|
||||
#[macro_export]
|
||||
macro_rules! lambda {
|
||||
($handler:expr) => {
|
||||
$crate::start($handler, None)
|
||||
};
|
||||
($handler:expr, $runtime:expr) => {
|
||||
$crate::start($handler, Some($runtime))
|
||||
};
|
||||
($handler:ident) => {
|
||||
$crate::start($handler, None)
|
||||
};
|
||||
($handler:ident, $runtime:expr) => {
|
||||
$crate::start($handler, Some($runtime))
|
||||
};
|
||||
}
|
||||
packages/now-rust/src/request.rs (new file, 122 lines)
@@ -0,0 +1,122 @@
|
||||
use std::{borrow::Cow, fmt, mem};
|
||||
|
||||
use http::{self, header::HeaderValue, HeaderMap, Method, Request as HttpRequest};
|
||||
use serde::de::{Deserializer, Error as DeError, MapAccess, Visitor};
|
||||
use serde_derive::Deserialize;
|
||||
|
||||
use crate::body::Body;
|
||||
|
||||
/// Representation of a Now Lambda proxy event data
|
||||
#[doc(hidden)]
|
||||
#[derive(Deserialize, Debug, Default)]
|
||||
#[serde(rename_all = "camelCase")]
|
||||
pub(crate) struct NowRequest<'a> {
|
||||
pub(crate) host: Cow<'a, str>,
|
||||
pub(crate) path: Cow<'a, str>,
|
||||
#[serde(deserialize_with = "deserialize_method")]
|
||||
pub(crate) method: Method,
|
||||
#[serde(deserialize_with = "deserialize_headers")]
|
||||
pub(crate) headers: HeaderMap<HeaderValue>,
|
||||
pub(crate) body: Option<Cow<'a, str>>,
|
||||
pub(crate) encoding: Option<String>,
|
||||
}
|
||||
|
||||
#[doc(hidden)]
|
||||
#[derive(Deserialize, Debug, Default)]
|
||||
pub(crate) struct NowEvent<'a> {
|
||||
#[serde(rename = "Action")]
|
||||
action: Cow<'a, str>,
|
||||
pub(crate) body: Cow<'a, str>,
|
||||
}
|
||||
|
||||
fn deserialize_method<'de, D>(deserializer: D) -> Result<Method, D::Error>
|
||||
where
|
||||
D: Deserializer<'de>,
|
||||
{
|
||||
struct MethodVisitor;
|
||||
|
||||
impl<'de> Visitor<'de> for MethodVisitor {
|
||||
type Value = Method;
|
||||
|
||||
fn expecting(&self, formatter: &mut fmt::Formatter<'_>) -> fmt::Result {
|
||||
write!(formatter, "a Method")
|
||||
}
|
||||
|
||||
fn visit_str<E>(self, v: &str) -> Result<Self::Value, E>
|
||||
where
|
||||
E: DeError,
|
||||
{
|
||||
v.parse().map_err(E::custom)
|
||||
}
|
||||
}
|
||||
|
||||
deserializer.deserialize_str(MethodVisitor)
|
||||
}
|
||||
|
||||
fn deserialize_headers<'de, D>(deserializer: D) -> Result<HeaderMap<HeaderValue>, D::Error>
|
||||
where
|
||||
D: Deserializer<'de>,
|
||||
{
|
||||
struct HeaderVisitor;
|
||||
|
||||
impl<'de> Visitor<'de> for HeaderVisitor {
|
||||
type Value = HeaderMap<HeaderValue>;
|
||||
|
||||
fn expecting(&self, formatter: &mut fmt::Formatter<'_>) -> fmt::Result {
|
||||
write!(formatter, "a HeaderMap<HeaderValue>")
|
||||
}
|
||||
|
||||
fn visit_map<A>(self, mut map: A) -> Result<Self::Value, A::Error>
|
||||
where
|
||||
A: MapAccess<'de>,
|
||||
{
|
||||
let mut headers = http::HeaderMap::new();
|
||||
while let Some((key, value)) = map.next_entry::<Cow<'_, str>, Cow<'_, str>>()? {
|
||||
let header_name = key
|
||||
.parse::<http::header::HeaderName>()
|
||||
.map_err(A::Error::custom)?;
|
||||
let header_value =
|
||||
http::header::HeaderValue::from_shared(value.into_owned().into())
|
||||
.map_err(A::Error::custom)?;
|
||||
headers.append(header_name, header_value);
|
||||
}
|
||||
Ok(headers)
|
||||
}
|
||||
}
|
||||
|
||||
deserializer.deserialize_map(HeaderVisitor)
|
||||
}
|
||||
|
||||
impl<'a> From<NowRequest<'a>> for HttpRequest<Body> {
|
||||
fn from(value: NowRequest<'_>) -> Self {
|
||||
let NowRequest {
|
||||
host,
|
||||
path,
|
||||
method,
|
||||
headers,
|
||||
body,
|
||||
encoding,
|
||||
} = value;
|
||||
|
||||
// build an http::Request<now_lambda::Body> from a now_lambda::NowRequest
|
||||
let mut builder = HttpRequest::builder();
|
||||
builder.method(method);
|
||||
builder.uri({ format!("https://{}{}", host, path) });
|
||||
|
||||
let mut req = builder
|
||||
.body(match (body, encoding) {
|
||||
(Some(ref b), Some(ref encoding)) if encoding == "base64" => {
|
||||
// todo: document failure behavior
|
||||
Body::from(::base64::decode(b.as_ref()).unwrap_or_default())
|
||||
}
|
||||
(Some(b), Some(_)) => Body::from(b.into_owned()),
|
||||
_ => Body::from(()),
|
||||
})
|
||||
.expect("failed to build request");
|
||||
|
||||
// no builder method that sets headers in batch
|
||||
mem::replace(req.headers_mut(), headers);
|
||||
|
||||
req
|
||||
}
|
||||
}
|
||||
packages/now-rust/src/response.rs (new file, 122 lines)
@@ -0,0 +1,122 @@
|
||||
//! Response types
|
||||
|
||||
use http::{
|
||||
header::{HeaderMap, HeaderValue},
|
||||
Response,
|
||||
};
|
||||
use serde::ser::{Error as SerError, SerializeMap, Serializer};
|
||||
use serde_derive::Serialize;
|
||||
|
||||
use crate::body::Body;
|
||||
|
||||
/// Representation of a Now Lambda response
|
||||
#[derive(Serialize, Debug)]
|
||||
#[serde(rename_all = "camelCase")]
|
||||
pub(crate) struct NowResponse {
|
||||
pub status_code: u16,
|
||||
#[serde(
|
||||
skip_serializing_if = "HeaderMap::is_empty",
|
||||
serialize_with = "serialize_headers"
|
||||
)]
|
||||
pub headers: HeaderMap<HeaderValue>,
|
||||
#[serde(skip_serializing_if = "Option::is_none")]
|
||||
pub body: Option<Body>,
|
||||
#[serde(skip_serializing_if = "Option::is_none")]
|
||||
pub encoding: Option<String>,
|
||||
}
|
||||
|
||||
impl Default for NowResponse {
|
||||
fn default() -> Self {
|
||||
Self {
|
||||
status_code: 200,
|
||||
headers: Default::default(),
|
||||
body: Default::default(),
|
||||
encoding: Default::default(),
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
fn serialize_headers<S>(headers: &HeaderMap<HeaderValue>, serializer: S) -> Result<S::Ok, S::Error>
|
||||
where
|
||||
S: Serializer,
|
||||
{
|
||||
let mut map = serializer.serialize_map(Some(headers.keys_len()))?;
|
||||
for key in headers.keys() {
|
||||
let map_value = headers[key].to_str().map_err(S::Error::custom)?;
|
||||
map.serialize_entry(key.as_str(), map_value)?;
|
||||
}
|
||||
map.end()
|
||||
}
|
||||
|
||||
impl<T> From<Response<T>> for NowResponse
|
||||
where
|
||||
T: Into<Body>,
|
||||
{
|
||||
fn from(value: Response<T>) -> Self {
|
||||
let (parts, bod) = value.into_parts();
|
||||
let (encoding, body) = match bod.into() {
|
||||
Body::Empty => (None, None),
|
||||
b @ Body::Text(_) => (None, Some(b)),
|
||||
b @ Body::Binary(_) => (Some("base64".to_string()), Some(b)),
|
||||
};
|
||||
NowResponse {
|
||||
status_code: parts.status.as_u16(),
|
||||
body,
|
||||
headers: parts.headers,
|
||||
encoding,
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
/// A conversion of self into a `Response`
|
||||
///
|
||||
/// Implementations for `Response<B> where B: Into<Body>`,
|
||||
/// `B where B: Into<Body>` and `serde_json::Value` are provided
|
||||
/// by default
|
||||
///
|
||||
/// # example
|
||||
///
|
||||
/// ```rust
|
||||
/// use now_lambda::{Body, IntoResponse, Response};
|
||||
///
|
||||
/// assert_eq!(
|
||||
/// "hello".into_response().body(),
|
||||
/// Response::new(Body::from("hello")).body()
|
||||
/// );
|
||||
/// ```
|
||||
pub trait IntoResponse {
|
||||
/// Return a translation of `self` into a `Response<Body>`
|
||||
fn into_response(self) -> Response<Body>;
|
||||
}
|
||||
|
||||
impl<B> IntoResponse for Response<B>
|
||||
where
|
||||
B: Into<Body>,
|
||||
{
|
||||
fn into_response(self) -> Response<Body> {
|
||||
let (parts, body) = self.into_parts();
|
||||
Response::from_parts(parts, body.into())
|
||||
}
|
||||
}
|
||||
|
||||
impl<B> IntoResponse for B
|
||||
where
|
||||
B: Into<Body>,
|
||||
{
|
||||
fn into_response(self) -> Response<Body> {
|
||||
Response::new(self.into())
|
||||
}
|
||||
}
|
||||
|
||||
impl IntoResponse for serde_json::Value {
|
||||
fn into_response(self) -> Response<Body> {
|
||||
Response::builder()
|
||||
.header(http::header::CONTENT_TYPE, "application/json")
|
||||
.body(
|
||||
serde_json::to_string(&self)
|
||||
.expect("unable to serialize serde_json::Value")
|
||||
.into(),
|
||||
)
|
||||
.expect("unable to build http::Response")
|
||||
}
|
||||
}
|
||||
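Note: as the From<Response<T>> implementation above shows, a Body::Binary response is paired with encoding "base64", so raw bytes survive the JSON proxy protocol. A handler returning binary data might look roughly like this sketch (the handler name and payload bytes are illustrative, not part of this diff):

use now_lambda::{error::NowError, Body, Request, Response};

// Returning a Vec<u8> produces Body::Binary, which is base64-encoded on the wire.
fn handler(_request: Request) -> Result<Response<Body>, NowError> {
    let bytes: Vec<u8> = vec![0x89, 0x50, 0x4E, 0x47]; // illustrative payload
    let response = Response::builder()
        .status(200)
        .header("Content-Type", "application/octet-stream")
        .body(Body::from(bytes))
        .expect("failed to build response");

    Ok(response)
}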
packages/now-rust/src/strmap.rs (new file, 90 lines)
@@ -0,0 +1,90 @@
|
||||
use std::{
|
||||
collections::{hash_map::Keys, HashMap},
|
||||
fmt,
|
||||
sync::Arc,
|
||||
};
|
||||
|
||||
use serde::de::{Deserialize, Deserializer, MapAccess, Visitor};
|
||||
|
||||
/// A read-only view into a map of string data
|
||||
#[derive(Default, Debug, PartialEq)]
|
||||
pub struct StrMap(pub(crate) Arc<HashMap<String, String>>);
|
||||
|
||||
impl StrMap {
|
||||
/// Return a named value where available
|
||||
pub fn get(&self, key: &str) -> Option<&str> {
|
||||
self.0.get(key).map(|value| value.as_ref())
|
||||
}
|
||||
|
||||
/// Return true if the underlying map is empty
|
||||
pub fn is_empty(&self) -> bool {
|
||||
self.0.is_empty()
|
||||
}
|
||||
|
||||
/// Return an iterator over keys and values
|
||||
pub fn iter(&self) -> StrMapIter<'_> {
|
||||
StrMapIter {
|
||||
data: self,
|
||||
keys: self.0.keys(),
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
impl Clone for StrMap {
|
||||
fn clone(&self) -> Self {
|
||||
// only clone the inner data
|
||||
StrMap(self.0.clone())
|
||||
}
|
||||
}
|
||||
impl From<HashMap<String, String>> for StrMap {
|
||||
fn from(inner: HashMap<String, String>) -> Self {
|
||||
StrMap(Arc::new(inner))
|
||||
}
|
||||
}
|
||||
|
||||
/// A read only reference to `StrMap` key and value slice pairings
|
||||
pub struct StrMapIter<'a> {
|
||||
data: &'a StrMap,
|
||||
keys: Keys<'a, String, String>,
|
||||
}
|
||||
|
||||
impl<'a> Iterator for StrMapIter<'a> {
|
||||
type Item = (&'a str, &'a str);
|
||||
|
||||
#[inline]
|
||||
fn next(&mut self) -> Option<(&'a str, &'a str)> {
|
||||
self.keys
|
||||
.next()
|
||||
.and_then(|k| self.data.get(k).map(|v| (k.as_str(), v)))
|
||||
}
|
||||
}
|
||||
|
||||
impl<'de> Deserialize<'de> for StrMap {
|
||||
fn deserialize<D>(deserializer: D) -> Result<StrMap, D::Error>
|
||||
where
|
||||
D: Deserializer<'de>,
|
||||
{
|
||||
struct StrMapVisitor;
|
||||
|
||||
impl<'de> Visitor<'de> for StrMapVisitor {
|
||||
type Value = StrMap;
|
||||
|
||||
fn expecting(&self, formatter: &mut fmt::Formatter<'_>) -> fmt::Result {
|
||||
write!(formatter, "a StrMap")
|
||||
}
|
||||
|
||||
fn visit_map<A>(self, mut map: A) -> Result<Self::Value, A::Error>
|
||||
where
|
||||
A: MapAccess<'de>,
|
||||
{
|
||||
let mut inner = HashMap::new();
|
||||
while let Some((key, value)) = map.next_entry()? {
|
||||
inner.insert(key, value);
|
||||
}
|
||||
Ok(StrMap(Arc::new(inner)))
|
||||
}
|
||||
}
|
||||
|
||||
deserializer.deserialize_map(StrMapVisitor)
|
||||
}
|
||||
}
|
||||
@@ -1,11 +1,21 @@
const download = require('@now/build-utils/fs/download.js');
const glob = require('@now/build-utils/fs/glob.js');
const download = require('@now/build-utils/fs/download.js'); // eslint-disable-line import/no-extraneous-dependencies
const glob = require('@now/build-utils/fs/glob.js'); // eslint-disable-line import/no-extraneous-dependencies
const path = require('path');
const { existsSync } = require('fs');
const {
  runNpmInstall,
  runPackageJsonScript,
  runShellScript,
} = require('@now/build-utils/fs/run-user-scripts.js');
} = require('@now/build-utils/fs/run-user-scripts.js'); // eslint-disable-line import/no-extraneous-dependencies

function validateDistDir(distDir) {
  const distDirName = path.basename(distDir);
  if (!existsSync(distDir)) {
    const message = `Build was unable to create the distDir: ${distDirName}.`
      + '\nMake sure you mentioned the correct dist directory: https://zeit.co/docs/v2/deployments/official-builders/static-build-now-static-build/#configuring-the-dist-directory';
    throw new Error(message);
  }
}

exports.build = async ({
  files, entrypoint, workPath, config,
@@ -24,6 +34,7 @@ exports.build = async ({
  if (path.basename(entrypoint) === 'package.json') {
    await runNpmInstall(entrypointFsDirname, ['--prefer-offline']);
    if (await runPackageJsonScript(entrypointFsDirname, 'now-build')) {
      validateDistDir(distPath);
      return glob('**', distPath, mountpoint);
    }
    throw new Error(`An error running "now-build" script in "${entrypoint}"`);
@@ -31,6 +42,7 @@ exports.build = async ({

  if (path.extname(entrypoint) === '.sh') {
    await runShellScript(path.join(workPath, entrypoint));
    validateDistDir(distPath);
    return glob('**', distPath, mountpoint);
  }
@@ -1,9 +1,11 @@
{
  "name": "@now/static-build",
  "version": "0.4.17",
  "version": "0.4.19-canary.1",
  "license": "MIT",
  "peerDependencies": {
    "@now/build-utils": ">=0.0.1"
  "repository": {
    "type": "git",
    "url": "https://github.com/zeit/now-builders.git",
    "directory": "packages/now-static-build"
  },
  "scripts": {
    "test": "jest"
packages/now-static-build/test/fixtures/04-wrong-dist-dir/now.json (new vendored file, 6 lines)
@@ -0,0 +1,6 @@
|
||||
{
|
||||
"version": 2,
|
||||
"builds": [
|
||||
{ "src": "package.json", "use": "@now/static-build", "config": {"distDir": "out"} }
|
||||
]
|
||||
}
|
||||
packages/now-static-build/test/fixtures/04-wrong-dist-dir/package.json (new vendored file, 8 lines)
@@ -0,0 +1,8 @@
|
||||
{
|
||||
"dependencies": {
|
||||
"cowsay": "^1.3.1"
|
||||
},
|
||||
"scripts": {
|
||||
"now-build": "mkdir dist && cowsay cow:RANDOMNESS_PLACEHOLDER > dist/index.txt"
|
||||
}
|
||||
}
|
||||
packages/now-static-build/test/fixtures/04-wrong-dist-dir/subdirectory/package.json (new vendored file, 8 lines)
@@ -0,0 +1,8 @@
|
||||
{
|
||||
"dependencies": {
|
||||
"yodasay": "^1.1.6"
|
||||
},
|
||||
"scripts": {
|
||||
"now-build": "mkdir dist && yodasay yoda:RANDOMNESS_PLACEHOLDER > dist/index.txt"
|
||||
}
|
||||
}
|
||||
@@ -7,7 +7,7 @@ const {
|
||||
testDeployment,
|
||||
} = require('../../../test/lib/deployment/test-deployment.js');
|
||||
|
||||
jest.setTimeout(2 * 60 * 1000);
|
||||
jest.setTimeout(4 * 60 * 1000);
|
||||
const buildUtilsUrl = '@canary';
|
||||
let builderUrl;
|
||||
|
||||
@@ -21,6 +21,21 @@ const fixturesPath = path.resolve(__dirname, 'fixtures');
|
||||
|
||||
// eslint-disable-next-line no-restricted-syntax
|
||||
for (const fixture of fs.readdirSync(fixturesPath)) {
|
||||
if (fixture === '04-wrong-dist-dir') {
|
||||
// eslint-disable-next-line no-loop-func
|
||||
it(`should not build ${fixture}`, async () => {
|
||||
try {
|
||||
await testDeployment(
|
||||
{ builderUrl, buildUtilsUrl },
|
||||
path.join(fixturesPath, fixture),
|
||||
);
|
||||
} catch (err) {
|
||||
expect(err.message).toMatch(/is ERROR/);
|
||||
}
|
||||
});
|
||||
continue; //eslint-disable-line
|
||||
}
|
||||
|
||||
// eslint-disable-next-line no-loop-func
|
||||
it(`should build ${fixture}`, async () => {
|
||||
await expect(
|
||||
|
||||
@@ -1,11 +1,11 @@
|
||||
const assert = require('assert');
|
||||
const { createLambda } = require('@now/build-utils/lambda.js');
|
||||
const { createLambda } = require('@now/build-utils/lambda.js'); // eslint-disable-line import/no-extraneous-dependencies
|
||||
const fetch = require('node-fetch');
|
||||
const FileBlob = require('@now/build-utils/file-blob.js');
|
||||
const FileBlob = require('@now/build-utils/file-blob.js'); // eslint-disable-line import/no-extraneous-dependencies
|
||||
const { getFiles } = require('@now/php-bridge');
|
||||
const path = require('path');
|
||||
const rename = require('@now/build-utils/fs/rename.js');
|
||||
const streamToBuffer = require('@now/build-utils/fs/stream-to-buffer.js');
|
||||
const rename = require('@now/build-utils/fs/rename.js'); // eslint-disable-line import/no-extraneous-dependencies
|
||||
const streamToBuffer = require('@now/build-utils/fs/stream-to-buffer.js'); // eslint-disable-line import/no-extraneous-dependencies
|
||||
const yauzl = require('yauzl');
|
||||
|
||||
exports.config = {
|
||||
@@ -16,7 +16,9 @@ async function readReleaseUrl(releaseUrl) {
|
||||
const resp = await fetch(releaseUrl);
|
||||
|
||||
if (!resp.ok) {
|
||||
throw new Error(`Failed to download ${releaseUrl}. Status code is ${resp.status}`);
|
||||
throw new Error(
|
||||
`Failed to download ${releaseUrl}. Status code is ${resp.status}`,
|
||||
);
|
||||
}
|
||||
|
||||
return resp.buffer();
|
||||
@@ -49,13 +51,15 @@ function decompressBuffer(buffer, mountpoint) {
|
||||
return;
|
||||
}
|
||||
|
||||
streamToBuffer(readStream).then((data) => {
|
||||
assert(prefixRegexp.test(fileName), fileName);
|
||||
const fileName2 = fileName.replace(prefixRegexp, '');
|
||||
const fileName3 = path.join(mountpoint, fileName2);
|
||||
files[fileName3] = new FileBlob({ data });
|
||||
zipfile.readEntry();
|
||||
}).catch(reject);
|
||||
streamToBuffer(readStream)
|
||||
.then((data) => {
|
||||
assert(prefixRegexp.test(fileName), fileName);
|
||||
const fileName2 = fileName.replace(prefixRegexp, '');
|
||||
const fileName3 = path.join(mountpoint, fileName2);
|
||||
files[fileName3] = new FileBlob({ data });
|
||||
zipfile.readEntry();
|
||||
})
|
||||
.catch(reject);
|
||||
});
|
||||
});
|
||||
|
||||
@@ -65,12 +69,22 @@ function decompressBuffer(buffer, mountpoint) {
|
||||
}
|
||||
|
||||
const staticRegexps = [
|
||||
/\.css$/, /\.gif$/, /\.ico$/, /\.js$/, /\.jpg$/, /\.png$/, /\.svg$/, /\.woff$/, /\.woff2$/,
|
||||
/\.css$/,
|
||||
/\.gif$/,
|
||||
/\.ico$/,
|
||||
/\.js$/,
|
||||
/\.jpg$/,
|
||||
/\.png$/,
|
||||
/\.svg$/,
|
||||
/\.woff$/,
|
||||
/\.woff2$/,
|
||||
];
|
||||
|
||||
exports.build = async ({ files, entrypoint, config }) => {
|
||||
if (path.basename(entrypoint) !== 'wp-config.php') {
|
||||
throw new Error(`Entrypoint file name must be "wp-config.php". Currently it is ${entrypoint}`);
|
||||
throw new Error(
|
||||
`Entrypoint file name must be "wp-config.php". Currently it is ${entrypoint}`,
|
||||
);
|
||||
}
|
||||
|
||||
const { releaseUrl = 'https://wordpress.org/latest.zip' } = config;
|
||||
@@ -84,9 +98,12 @@ exports.build = async ({ files, entrypoint, config }) => {
|
||||
if (config.patchForPersistentConnections) {
|
||||
const wpDbPhp = path.join(mountpoint, 'wp-includes/wp-db.php');
|
||||
const wpDbPhpBlob = mergedFiles[wpDbPhp];
|
||||
wpDbPhpBlob.data = wpDbPhpBlob.data.toString()
|
||||
.replace(/mysqli_real_connect\( \$this->dbh, \$host,/g,
|
||||
'mysqli_real_connect( $this->dbh, \'p:\' . $host,');
|
||||
wpDbPhpBlob.data = wpDbPhpBlob.data
|
||||
.toString()
|
||||
.replace(
|
||||
/mysqli_real_connect\( \$this->dbh, \$host,/g,
|
||||
"mysqli_real_connect( $this->dbh, 'p:' . $host,",
|
||||
);
|
||||
}
|
||||
|
||||
const staticFiles = {};
|
||||
|
||||
@@ -1,15 +1,17 @@
|
||||
{
|
||||
"name": "@now/wordpress",
|
||||
"version": "0.4.14",
|
||||
"version": "0.4.15-canary.0",
|
||||
"license": "MIT",
|
||||
"repository": {
|
||||
"type": "git",
|
||||
"url": "https://github.com/zeit/now-builders.git",
|
||||
"directory": "packages/now-wordpress"
|
||||
},
|
||||
"dependencies": {
|
||||
"@now/php-bridge": "^0.4.13",
|
||||
"@now/php-bridge": "^0.4.14-canary.0",
|
||||
"node-fetch": "2.3.0",
|
||||
"yauzl": "2.10.0"
|
||||
},
|
||||
"peerDependencies": {
|
||||
"@now/build-utils": ">=0.0.1"
|
||||
},
|
||||
"scripts": {
|
||||
"test": "jest"
|
||||
}
|
||||
|
||||
test/integration/now-next/monorepo/shared/hello.js (new file, 1 line)
@@ -0,0 +1 @@
module.exports = () => 'Hello!';
@@ -1,9 +1,9 @@
{
  "name": "monorepo",
  "dependencies": {
    "next": "^8.0.0-canary.2",
    "react": "^16.7.0",
    "react-dom": "^16.7.0"
    "next": "^8.0.0",
    "react": "^16.8.0",
    "react-dom": "^16.8.0"
  },
  "scripts": {
    "now-build": "next build"
@@ -1 +1,3 @@
export default () => 'Index page';
import hello from '../../shared/hello';

export default () => `${hello()} Welcome to the index page`;
@@ -18,8 +18,9 @@ async function nowDeploy (bodies, randomness) {
|
||||
|
||||
const nowDeployPayload = {
|
||||
version: 2,
|
||||
env: Object.assign({}, nowJson.env, { RANDOMNESS_ENV_VAR: randomness }),
|
||||
build: { env: { RANDOMNESS_BUILD_ENV_VAR: randomness } },
|
||||
public: true,
|
||||
env: { ...nowJson.env, RANDOMNESS_ENV_VAR: randomness },
|
||||
build: { env: { ...(nowJson.build || {}).env, RANDOMNESS_BUILD_ENV_VAR: randomness } },
|
||||
name: 'test',
|
||||
files,
|
||||
builds: nowJson.builds,
|
||||
|
||||
@@ -22,7 +22,7 @@ async function packAndDeploy (builderPath) {
|
||||
|
||||
const RANDOMNESS_PLACEHOLDER_STRING = 'RANDOMNESS_PLACEHOLDER';
|
||||
|
||||
async function testDeployment ({ builderUrl, buildUtilsUrl }, fixturePath) {
|
||||
async function testDeployment ({ builderUrl, buildUtilsUrl }, fixturePath, buildDelegate) {
|
||||
console.log('testDeployment', fixturePath);
|
||||
const globResult = await glob(`${fixturePath}/**`, { nodir: true });
|
||||
const bodies = globResult.reduce((b, f) => {
|
||||
@@ -63,6 +63,10 @@ async function testDeployment ({ builderUrl, buildUtilsUrl }, fixturePath) {
|
||||
config.useBuildUtils = `https://${buildUtilsUrl}`;
|
||||
}
|
||||
}
|
||||
|
||||
if (buildDelegate) {
|
||||
buildDelegate(build);
|
||||
}
|
||||
}
|
||||
|
||||
bodies['now.json'] = Buffer.from(JSON.stringify(nowJson));
|
||||
@@ -84,7 +88,8 @@ async function testDeployment ({ builderUrl, buildUtilsUrl }, fixturePath) {
|
||||
if (!text.includes(probe.mustContain)) {
|
||||
await fs.writeFile(path.join(__dirname, 'failed-page.txt'), text);
|
||||
throw new Error(
|
||||
`Fetched page ${probeUrl} does not contain ${probe.mustContain}`
|
||||
`Fetched page ${probeUrl} does not contain ${probe.mustContain}.`
|
||||
+ ` Instead it contains ${text.slice(0, 60)}`
|
||||
);
|
||||
}
|
||||
} else {
|
||||
@@ -113,7 +118,8 @@ async function fetchDeploymentUrl (url, opts) {
|
||||
for (let i = 0; i < 500; i += 1) {
|
||||
const resp = await fetch(url, opts);
|
||||
const text = await resp.text();
|
||||
if (text && !text.includes('Join Free')) {
|
||||
if (text && !text.includes('Join Free')
|
||||
&& !text.includes('The page could not be found')) {
|
||||
return text;
|
||||
}
|
||||
|
||||
|
||||
@@ -2,13 +2,9 @@ const {
|
||||
excludeFiles,
|
||||
validateEntrypoint,
|
||||
includeOnlyEntryDirectory,
|
||||
moveEntryDirectoryToRoot,
|
||||
excludeLockFiles,
|
||||
normalizePackageJson,
|
||||
excludeStaticDirectory,
|
||||
onlyStaticDirectory,
|
||||
} = require('@now/next/utils');
|
||||
const FileRef = require('@now/build-utils/file-ref');
|
||||
const FileRef = require('@now/build-utils/file-ref'); // eslint-disable-line import/no-extraneous-dependencies
|
||||
|
||||
describe('excludeFiles', () => {
|
||||
it('should exclude files', () => {
|
||||
@@ -46,7 +42,7 @@ describe('validateEntrypoint', () => {
});

describe('includeOnlyEntryDirectory', () => {
it('should exclude files outside entry directory', () => {
it('should include files outside entry directory', () => {
const entryDirectory = 'frontend';
const files = {
'frontend/pages/index.js': new FileRef({ digest: 'index' }),
@@ -58,149 +54,6 @@ describe('includeOnlyEntryDirectory', () => {
expect(result['package.json']).toBeUndefined();
expect(result['package-lock.json']).toBeUndefined();
});

it('should handle entry directory being dot', () => {
const entryDirectory = '.';
const files = {
'frontend/pages/index.js': new FileRef({ digest: 'index' }),
'package.json': new FileRef({ digest: 'package' }),
'package-lock.json': new FileRef({ digest: 'package-lock' }),
};
const result = includeOnlyEntryDirectory(files, entryDirectory);
expect(result['frontend/pages/index.js']).toBeDefined();
expect(result['package.json']).toBeDefined();
expect(result['package-lock.json']).toBeDefined();
});
});

describe('moveEntryDirectoryToRoot', () => {
it('should move entrydirectory files to the root', () => {
const entryDirectory = 'frontend';
const files = {
'frontend/pages/index.js': new FileRef({ digest: 'index' }),
};
const result = moveEntryDirectoryToRoot(files, entryDirectory);
expect(result['pages/index.js']).toBeDefined();
});

it('should work with deep nested subdirectories', () => {
const entryDirectory = 'frontend/my/app';
const files = {
'frontend/my/app/pages/index.js': new FileRef({ digest: 'index' }),
};
const result = moveEntryDirectoryToRoot(files, entryDirectory);
expect(result['pages/index.js']).toBeDefined();
});

it('should do nothing when entry directory is dot', () => {
const entryDirectory = '.';
const files = {
'frontend/pages/index.js': new FileRef({ digest: 'index' }),
};
const result = moveEntryDirectoryToRoot(files, entryDirectory);
expect(result['frontend/pages/index.js']).toBeDefined();
});
});

describe('excludeLockFiles', () => {
it('should remove package-lock.json', () => {
const files = {
'frontend/pages/index.js': new FileRef({ digest: 'index' }),
'package.json': new FileRef({ digest: 'package' }),
'package-lock.json': new FileRef({ digest: 'package-lock' }),
};
const result = excludeLockFiles(files);
expect(result['frontend/pages/index.js']).toBeDefined();
expect(result['package-lock.json']).toBeUndefined();
});

it('should remove yarn.lock', () => {
const files = {
'frontend/pages/index.js': new FileRef({ digest: 'index' }),
'package.json': new FileRef({ digest: 'package' }),
'yarn.lock': new FileRef({ digest: 'yarn-lock' }),
};
const result = excludeLockFiles(files);
expect(result['frontend/pages/index.js']).toBeDefined();
expect(result['yarn.lock']).toBeUndefined();
});

it('should remove both package-lock.json and yarn.lock', () => {
const files = {
'frontend/pages/index.js': new FileRef({ digest: 'index' }),
'package.json': new FileRef({ digest: 'package' }),
'yarn.lock': new FileRef({ digest: 'yarn-lock' }),
'package-lock.json': new FileRef({ digest: 'package-lock' }),
};
const result = excludeLockFiles(files);
expect(result['frontend/pages/index.js']).toBeDefined();
expect(result['yarn.lock']).toBeUndefined();
expect(result['package-lock.json']).toBeUndefined();
});
});

describe('excludeStaticDirectory', () => {
it('should remove the /static directory files', () => {
const files = {
'frontend/pages/index.js': new FileRef({ digest: 'index' }),
'package.json': new FileRef({ digest: 'package' }),
'yarn.lock': new FileRef({ digest: 'yarn-lock' }),
'package-lock.json': new FileRef({ digest: 'package-lock' }),
'static/image.png': new FileRef({ digest: 'image' }),
};
const result = excludeStaticDirectory(files);
expect(result['frontend/pages/index.js']).toBeDefined();
expect(result['yarn.lock']).toBeDefined();
expect(result['package-lock.json']).toBeDefined();
expect(result['static/image.png']).toBeUndefined();
});

it('should remove the nested /static directory files', () => {
const files = {
'frontend/pages/index.js': new FileRef({ digest: 'index' }),
'package.json': new FileRef({ digest: 'package' }),
'yarn.lock': new FileRef({ digest: 'yarn-lock' }),
'package-lock.json': new FileRef({ digest: 'package-lock' }),
'static/images/png/image.png': new FileRef({ digest: 'image' }),
};
const result = excludeStaticDirectory(files);
expect(result['frontend/pages/index.js']).toBeDefined();
expect(result['yarn.lock']).toBeDefined();
expect(result['package-lock.json']).toBeDefined();
expect(result['static/images/png/image.png']).toBeUndefined();
});
});

describe('onlyStaticDirectory', () => {
it('should keep only /static directory files', () => {
const files = {
'frontend/pages/index.js': new FileRef({ digest: 'index' }),
'package.json': new FileRef({ digest: 'package' }),
'yarn.lock': new FileRef({ digest: 'yarn-lock' }),
'package-lock.json': new FileRef({ digest: 'package-lock' }),
'static/image.png': new FileRef({ digest: 'image' }),
};
const result = onlyStaticDirectory(files);
expect(result['frontend/pages/index.js']).toBeUndefined();
expect(result['yarn.lock']).toBeUndefined();
expect(result['package-lock.json']).toBeUndefined();
expect(result['static/image.png']).toBeDefined();
});

it('should keep nested /static directory files', () => {
const files = {
'frontend/pages/index.js': new FileRef({ digest: 'index' }),
'package.json': new FileRef({ digest: 'package' }),
'yarn.lock': new FileRef({ digest: 'yarn-lock' }),
'package-lock.json': new FileRef({ digest: 'package-lock' }),
'static/images/png/image.png': new FileRef({ digest: 'image' }),
};
const result = onlyStaticDirectory(files);
expect(result['frontend/pages/index.js']).toBeUndefined();
expect(result['yarn.lock']).toBeUndefined();
expect(result['package-lock.json']).toBeUndefined();
expect(result['static/images/png/image.png']).toBeDefined();
});
});

describe('normalizePackageJson', () => {
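For context, here is a minimal sketch of what the file-map helpers exercised by the tests in this diff might look like. The function names and the shape of the `files` map (paths mapped to `FileRef`-like values) come from the tests themselves; the bodies below are an assumption for illustration, not the builder's actual `utils` implementation.

```js
const path = require('path');

// Keep only files that live inside the entry directory ('.' keeps everything).
function includeOnlyEntryDirectory(files, entryDirectory) {
  if (entryDirectory === '.') return files;
  const prefix = `${entryDirectory}/`;
  const result = {};
  for (const name of Object.keys(files)) {
    if (name.startsWith(prefix)) result[name] = files[name];
  }
  return result;
}

// Strip the entry directory prefix so its contents sit at the project root.
function moveEntryDirectoryToRoot(files, entryDirectory) {
  if (entryDirectory === '.') return files;
  const result = {};
  for (const name of Object.keys(files)) {
    result[path.relative(entryDirectory, name)] = files[name];
  }
  return result;
}

// Drop lockfiles so the build step controls dependency installation.
function excludeLockFiles(files) {
  const result = { ...files };
  delete result['package-lock.json'];
  delete result['yarn.lock'];
  return result;
}

// Remove everything under static/ (those assets are served separately).
function excludeStaticDirectory(files) {
  const result = {};
  for (const name of Object.keys(files)) {
    if (!name.startsWith('static/')) result[name] = files[name];
  }
  return result;
}

// Keep only the static/ assets.
function onlyStaticDirectory(files) {
  const result = {};
  for (const name of Object.keys(files)) {
    if (name.startsWith('static/')) result[name] = files[name];
  }
  return result;
}
```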
41
yarn.lock
@@ -32,6 +32,11 @@
log-update "^2.3.0"
strip-ansi "^3.0.1"

"@iarna/toml@^2.2.1":
version "2.2.1"
resolved "https://registry.yarnpkg.com/@iarna/toml/-/toml-2.2.1.tgz#82d0993d8882bb05e0645fbb4731d9e939e895b3"
integrity sha512-I2EjI9TbEFJNLziNPFfpo64PNanOaK17iL2kTW/jGlGOa4bvHw4VEied83kOEB7NJjXf1KfvmsQ2aEjy3xjiGg==

"@lerna/add@^3.5.0":
version "3.5.0"
resolved "https://registry.yarnpkg.com/@lerna/add/-/add-3.5.0.tgz#3518b3d4afc3743b7227b1ee3534114eb9575888"
@@ -2056,6 +2061,13 @@ debug@^4.0.1:
dependencies:
ms "^2.1.1"

debug@^4.1.1:
version "4.1.1"
resolved "https://registry.yarnpkg.com/debug/-/debug-4.1.1.tgz#3b72260255109c6b589cee050f1d516139664791"
integrity sha512-pYAIzeRo8J6KPEaJ0VWOh5Pzkbw/RetuzehGM7QRRX5he4fPHx2rdKMB256ehJCkX+XRQm16eZLqLNS8RSZXZw==
dependencies:
ms "^2.1.1"

debuglog@^1.0.1:
version "1.0.1"
resolved "https://registry.yarnpkg.com/debuglog/-/debuglog-1.0.1.tgz#aa24ffb9ac3df9a2351837cfb2d279360cd78492"
@@ -4847,13 +4859,6 @@ kleur@^2.0.1:
resolved "https://registry.yarnpkg.com/kleur/-/kleur-2.0.2.tgz#b704f4944d95e255d038f0cb05fb8a602c55a300"
integrity sha512-77XF9iTllATmG9lSlIv0qdQ2BQ/h9t0bJllHlbvsQ0zUWfU7Yi0S8L5JXzPZgkefIiajLmBJJ4BsMJmqcf7oxQ==

lambda-git@^0.1.2:
version "0.1.2"
resolved "https://registry.yarnpkg.com/lambda-git/-/lambda-git-0.1.2.tgz#32ee82a47d7fc4e9c05ab03ae22b5572d8e642e4"
integrity sha1-Mu6CpH1/xOnAWrA64itVctjmQuQ=
dependencies:
tar-fs "^1.14.0"

lazy-req@^1.0.0:
version "1.1.0"
resolved "https://registry.yarnpkg.com/lazy-req/-/lazy-req-1.1.0.tgz#bdaebead30f8d824039ce0ce149d4daa07ba1fac"
@@ -6545,14 +6550,6 @@ psl@^1.1.24, psl@^1.1.28:
resolved "https://registry.yarnpkg.com/psl/-/psl-1.1.29.tgz#60f580d360170bb722a797cc704411e6da850c67"
integrity sha512-AeUmQ0oLN02flVHXWh9sSJF7mcdFq0ppid/JkErufc3hGIV/AMa8Fo9VgDo/cT2jFdOWoFvHp90qqBH54W+gjQ==

pump@^1.0.0:
version "1.0.3"
resolved "https://registry.yarnpkg.com/pump/-/pump-1.0.3.tgz#5dfe8311c33bbf6fc18261f9f34702c47c08a954"
integrity sha512-8k0JupWme55+9tCVE+FS5ULT3K6AbgqrGa58lTT49RpyfwwcGedHqaC5LlQNdEAumn/wFsu6aPwkuPMioy8kqw==
dependencies:
end-of-stream "^1.1.0"
once "^1.3.1"

pump@^2.0.0:
version "2.0.1"
resolved "https://registry.yarnpkg.com/pump/-/pump-2.0.1.tgz#12399add6e4cf7526d973cbc8b5ce2e2908b3909"
@@ -7701,17 +7698,7 @@ table@^5.0.2:
slice-ansi "1.0.0"
string-width "^2.1.1"

tar-fs@^1.14.0:
version "1.16.3"
resolved "https://registry.yarnpkg.com/tar-fs/-/tar-fs-1.16.3.tgz#966a628841da2c4010406a82167cbd5e0c72d509"
integrity sha512-NvCeXpYx7OsmOh8zIOP/ebG55zZmxLE0etfWRbWok+q2Qo8x/vOR/IJT1taADXPe+jsiu9axDb3X4B+iIgNlKw==
dependencies:
chownr "^1.0.1"
mkdirp "^0.5.1"
pump "^1.0.0"
tar-stream "^1.1.2"

tar-stream@^1.1.1, tar-stream@^1.1.2:
tar-stream@^1.1.1:
version "1.6.2"
resolved "https://registry.yarnpkg.com/tar-stream/-/tar-stream-1.6.2.tgz#8ea55dab37972253d9a9af90fdcd559ae435c555"
integrity sha512-rzS0heiNf8Xn7/mpdSVVSMAWAoy9bfb1WOTYC78Z0UQKeKa/CWS8FOq0lKGNa8DWKAn9gxjCvMLYc5PGXYlK2A==
@@ -7746,7 +7733,7 @@ tar@^2.0.0:
fstream "^1.0.2"
inherits "2"

tar@^4, tar@^4.4.6:
tar@^4, tar@^4.4.6, tar@^4.4.8:
version "4.4.8"
resolved "https://registry.yarnpkg.com/tar/-/tar-4.4.8.tgz#b19eec3fde2a96e64666df9fdb40c5ca1bc3747d"
integrity sha512-LzHF64s5chPQQS0IYBn9IN5h3i98c12bo4NCO7e0sGM2llXQ3p2FGC5sdENN4cTW48O915Sh+x+EXx7XW96xYQ==