### Related Issues
Related - https://github.com/vercel/api/pull/15027
Removes setting the output directory placeholder, since it is not especially reliable: https://vercel.slack.com/archives/C02HEJASXGD/p1667234137439189?thread_ts=1667232443.320769&cid=C02HEJASXGD
### 📋 Checklist
<!--
Please keep your PR as a Draft until the checklist is complete
-->
#### Tests
- [ ] The code changed/added as part of this PR has been covered with tests
- [ ] All tests pass locally with `yarn test-unit`
#### Code Review
- [ ] This PR has a concise title and thorough description useful to a reviewer
- [ ] Issue from task tracker has a link to this PR
### 📋 Checklist
<!--
Please keep your PR as a Draft until the checklist is complete
-->
#### Tests
- [ ] The code changed/added as part of this PR has been covered with tests
- [ ] All tests pass locally with `yarn test-unit`
#### Code Review
- [ ] This PR has a concise title and thorough description useful to a reviewer
- [ ] Issue from task tracker has a link to this PR
### Related Issues
Add in placeholder settings for monorepos
### 📋 Checklist
<!--
Please keep your PR as a Draft until the checklist is complete
-->
#### Tests
- [ ] The code changed/added as part of this PR has been covered with tests
- [ ] All tests pass locally with `yarn test-unit`
#### Code Review
- [ ] This PR has a concise title and thorough description useful to a reviewer
- [ ] Issue from task tracker has a link to this PR
### Related Issues
Parse `rush.json` files with `json5`, because it is very common for these files to have comments in them.
The [Rush template that people clone](https://rushjs.io/pages/configs/rush_json/) contains comments by default, and most people will clone it as-is.
The Rush docs also recommend against using `JSON.parse`:
https://rushjs.io/pages/help/faq/#why-do-rushs-json-config-files-contain--comments-that-github-shows-in-red
Added tests with block comments and single-line comments.
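A minimal sketch of the `json5`-based parsing (the helper below is illustrative, not the actual implementation):
```ts
import { promises as fs } from 'fs';
import json5 from 'json5';

// `rush.json` routinely contains `//` and `/* ... */` comments, which make
// plain `JSON.parse` throw; `json5.parse` accepts both comment styles.
async function readRushConfig(path: string): Promise<unknown> {
  const raw = await fs.readFile(path, 'utf8');
  return json5.parse(raw);
}
```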
### 📋 Checklist
<!--
Please keep your PR as a Draft until the checklist is complete
-->
#### Tests
- [ ] The code changed/added as part of this PR has been covered with tests
- [ ] All tests pass locally with `yarn test-unit`
#### Code Review
- [ ] This PR has a concise title and thorough description useful to a reviewer
- [ ] Issue from task tracker has a link to this PR
Co-authored-by: Sean Massa <EndangeredMassa@gmail.com>
This fixes the issue with processing new entries of `potentialFiles` when there was a `readdir` cache hit.
### Related Issues
Related to https://linear.app/vercel/issue/HIT-57/monorepo-detection-api-prevent-rate-limits
### 📋 Checklist
<!--
Please keep your PR as a Draft until the checklist is complete
-->
#### Tests
- [ ] The code changed/added as part of this PR has been covered with tests
- [ ] All tests pass locally with `yarn test-unit`
#### Code Review
- [ ] This PR has a concise title and thorough description useful to a reviewer
- [ ] Issue from task tracker has a link to this PR
### Related Issues
Add in getting the package paths for Rush workspaces.
First, get the contents of the `rush.json` file. If the `projects` property is an array, map over that array and return each entry's `projectFolder` as the package path.
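A rough sketch of that mapping, assuming the file contents have already been read (names are illustrative):
```ts
import json5 from 'json5';

// Map each Rush project entry to its `projectFolder`, guarding against a
// missing or non-array `projects` property.
function getRushPackagePaths(rushJsonContents: string): string[] {
  const config = json5.parse(rushJsonContents);
  return Array.isArray(config.projects)
    ? config.projects.map(
        (project: { projectFolder: string }) => project.projectFolder
      )
    : [];
}
```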
### 📋 Checklist
<!--
Please keep your PR as a Draft until the checklist is complete
-->
#### Tests
- [ ] The code changed/added as part of this PR has been covered with tests
- [ ] All tests pass locally with `yarn test-unit`
#### Code Review
- [ ] This PR has a concise title and thorough description useful to a reviewer
- [ ] Issue from task tracker has a link to this PR
### Related Issues
Add in Rush workspace
### 📋 Checklist
<!--
Please keep your PR as a Draft until the checklist is complete
-->
#### Tests
- [ ] The code changed/added as part of this PR has been covered with tests
- [ ] All tests pass locally with `yarn test-unit`
#### Code Review
- [ ] This PR has a concise title and thorough description useful to a reviewer
- [ ] Issue from task tracker has a link to this PR
### Related Issues
https://nx.dev/more-concepts/folder-structure#integrated-repo-folder-structure
An Nx monorepo has an option to use Nx workspaces. The Nx workspace is defined within the root `workspace.json` file. Within this `workspace.json` file, the workspace packages are listed under `projects`:
```json
{
  "$schema": "./node_modules/nx/schemas/workspace-schema.json",
  "version": 2,
  "projects": {
    "myblog": "apps/myblog",
    "svelte-app": "apps/svelte-app"
  }
}
```
Within `getNxWorkspacePackagePaths`, the values of the `projects` object are used as the package paths.
Nx is listed as the last workspace manager within `workspace-managers.ts` because other workspace managers should be checked before Nx workspaces: a `workspace.json` file could exist even when Nx is not the correct workspace manager.
An Nx workspace file can also exist when yarn/npm workspaces exist. When this happens, the `workspace.json` file is empty with no projects, so it will not add any package paths to the list to look through for projects.
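A minimal sketch of that lookup (the standalone function below is illustrative, not the actual implementation):
```ts
// The package paths are simply the values of the `projects` object,
// e.g. ["apps/myblog", "apps/svelte-app"] for the file above.
// A missing or empty `projects` object yields no package paths.
function getProjectPaths(workspaceJson: {
  projects?: Record<string, string>;
}): string[] {
  return Object.values(workspaceJson.projects ?? {});
}
```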
### 📋 Checklist
<!--
Please keep your PR as a Draft until the checklist is complete
-->
#### Tests
- [ ] The code changed/added as part of this PR has been covered with tests
- [ ] All tests pass locally with `yarn test-unit`
#### Code Review
- [ ] This PR has a concise title and thorough description useful to a reviewer
- [ ] Issue from task tracker has a link to this PR
### Related Issues
Adding in Nx and Rush as monorepo managers.
This will help enable zero-config setup for both of the above managers.
I have added unit tests for both Nx and Rush.
### 📋 Checklist
#### Tests
- [ ] The code changed/added as part of this PR has been covered with tests
- [ ] All tests pass locally with `yarn test-unit`
#### Code Review
- [ ] This PR has a concise title and thorough description useful to a reviewer
- [ ] Issue from task tracker has a link to this PR
Adds a `writeFile` function to `DetectorFilesystem` that will be used to update the various file cache maps.
**Why is this needed?**
When detecting npm7+ monorepos, we identified a performance improvement where the service can inspect the `package-lock.json` file for workspaces, and reuse the package information for each workspace in framework-detection.
The pseudo code in `vercel/api` will look something like this.
For a given lockfile:
```json
{
  ...,
  "packages": {
    "": {
      "name": "npm-workspaces",
      "version": "1.0.0",
      "license": "ISC",
      "workspaces": {
        "packages": [
          "apps/*"
        ]
      }
    },
    "apps/admin": {
      "version": "0.1.0",
      "dependencies": {
        "next": "12.2.5",
        "react": "18.2.0",
        "react-dom": "18.2.0"
      },
      "devDependencies": {
        "eslint": "8.23.0",
        "eslint-config-next": "12.2.5"
      }
    },
    ...
  }
}
```
```ts
// for each projectPath we detect in package-lock.json
// switch the cwd of the fs to the project directory
const projectFs = fs.chdir(projectPath);
// gets the package info from the lockfile
const projectPackageInfo = lockFile.packages[projectPath];
// insert this content into fs cache
await projectFs.writeFile('package.json', JSON.stringify(projectPackageInfo));
// call detectFramework, which should now have a cached "package.json" file
const projectFramework = await detectFramework(projectFs);
```
### Related Issues
Related to https://linear.app/vercel/issue/HIT-57/monorepo-detection-api-prevent-rate-limits
### 📋 Checklist
<!--
Please keep your PR as a Draft until the checklist is complete
-->
#### Tests
- [x] The code changed/added as part of this PR has been covered with tests
- [x] All tests pass locally with `yarn test-unit`
#### Code Review
- [x] This PR has a concise title and thorough description useful to a reviewer
- [x] Issue from task tracker has a link to this PR
Adding the ability to explicitly set the `hasPath` and `isFile` caches via `readdir`
**Why is this needed?**
For framework detection, this allows us to fetch the contents of a directory (the root directory) of a git repo and make assumptions about the existence of files:
```ts
import frameworks from "@vercel/frameworks";
// we can make the set of all file paths needed for framework detection like this
const setOfPaths = new Set(
  frameworks.frameworks
    .flatMap((f) => [
      ...(f.detectors?.every ?? []),
      ...(f.detectors?.some ?? []),
    ])
    .map((d) => d.path)
);
// then we can read the contents of the root directory of a git repo via the `DetectorFilesystem`
await fs.readdir('./', { potentialFiles: [...setOfPaths] });
// The filesystem has been fully primed - any network calls from here will only happen if needed.
// This works because the logic of `detectFramework` calls `hasPath` and then `isFile` before
// fetching file contents. Since the filesystem knows this information, it doesn't have to make
// any additional network calls 🙌
const framework = await detectFramework(fs);
```
### Related Issues
Related to [HIT-57](https://linear.app/vercel/issue/HIT-57/monorepo-detection-api-prevent-rate-limits)
### 📋 Checklist
<!--
Please keep your PR as a Draft until the checklist is complete
-->
#### Tests
- [x] The code changed/added as part of this PR has been covered with tests
- [x] All tests pass locally with `yarn test-unit`
#### Code Review
- [x] This PR has a concise title and thorough description useful to a reviewer
- [x] Issue from task tracker has a link to this PR
Adds a `writeFile` function to `DetectorFilesystem` that will be used to update the various file cache maps.
When detecting npm7+ monorepos, we identified a performance improvement where the service can inspect the `package-lock.json` file for workspaces, and reuse the package information for each workspace in framework-detection.
The pseudo code in `vercel/api` will look something like this.
For a given lockfile:
```json
{
  ...,
  "packages": {
    "": {
      "name": "npm-workspaces",
      "version": "1.0.0",
      "license": "ISC",
      "workspaces": {
        "packages": [
          "apps/*"
        ]
      }
    },
    "apps/admin": {
      "version": "0.1.0",
      "dependencies": {
        "next": "12.2.5",
        "react": "18.2.0",
        "react-dom": "18.2.0"
      },
      "devDependencies": {
        "eslint": "8.23.0",
        "eslint-config-next": "12.2.5"
      }
    },
    ...
  }
}
```
```ts
// for each projectPath we detect in package-lock.json
// switch the cwd of the fs to the project directory
const projectFs = fs.chdir(projectPath);
// gets the package info from the lockfile
const projectPackageInfo = lockFile.packages[projectPath];
// insert this content into fs cache
await projectFs.writeFile('package.json', JSON.stringify(projectPackageInfo));
// call detectFramework, which should now have a cached "package.json" file
const projectFramework = await detectFramework(projectFs);
```
### Related Issues
Related to [HIT-57](https://linear.app/vercel/issue/HIT-57/monorepo-detection-api-prevent-rate-limits)
### 📋 Checklist
<!--
Please keep your PR as a Draft until the checklist is complete
-->
#### Tests
- [x] The code changed/added as part of this PR has been covered with tests
- [x] All tests pass locally with `yarn test-unit`
#### Code Review
- [ ] This PR has a concise title and thorough description useful to a reviewer
- [ ] Issue from task tracker has a link to this PR
This PR consolidates all the `test` scripts to be the same and removes the `prepublishOnly` script since we always run `build` before publishing to npm.
This PR updates the test suite to wait for the Vercel CLI tarball and then use that tarball to run E2E tests.
This is valuable because it packages all the packages in this monorepo, so the tests follow more closely what will happen in production once merged.
Since the current "Find Changes" step takes about 2 minutes, we run that first (it happens concurrently with the tarball preparation). Once "Find Changes" completes, we wait for the tarball, but it will likely be ready by that point since it also takes about 2 minutes. After both of those steps, the E2E tests continue as usual but with `VERCEL_CLI_VERSION` set to the tarball.
- Related to #7967
- Closes #8245
- Closes #8227
This is a semver major change to the public API for `@vercel/routing-utils`, which includes the following breaking changes (a migration sketch follows the list).
1. `getTransformedRoutes({ nowConfig })` props changed to `getTransformedRoutes(nowConfig)`
2. `type Source` renamed `type RouteWithSrc`
3. `type Handler` renamed `type RouteWithHandle`
4. `interface VercelConfig` removed
5. `type NowConfig` removed
6. `type NowRewrite` removed
7. `type NowRedirect` removed
8. `type NowHeader` removed
9. `type NowHeaderKeyValue` removed
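A hedged before/after migration sketch based on the renames above (the redirect values are placeholders):
```ts
import {
  getTransformedRoutes,
  RouteWithSrc, // formerly `Source`; `Handler` is now `RouteWithHandle`
} from '@vercel/routing-utils';

// Before: the config was passed wrapped in an object property:
//   const { routes, error } = getTransformedRoutes({ nowConfig });
// After: the config is passed directly:
const { routes, error } = getTransformedRoutes({
  redirects: [{ source: '/old-path', destination: '/new-path' }],
});

// Routes with a `src` property use the renamed type:
const srcRoutes: RouteWithSrc[] = (routes ?? []).filter(
  (route): route is RouteWithSrc => 'src' in route
);
```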
"solidstart" framework doesn't need to use the FS API v2 Builder anymore since:
1) Newer versions output Build Output API v3
2) Older versions that output FS API v2 will be handled by the compatibility shim that was added in https://github.com/vercel/vercel/pull/7690
The Vercel API's `detect-framework` API contacts GitHub's API using its own file adapter.
However, the `detectFramework` function in the Vercel CLI (on which the API depends) currently calls the file system adapter in series using a `for` loop. This results in a large amount of lag in networked situations such as calling GitHub's API, since each API call takes a couple hundred milliseconds.
I propose changing `detectFramework` to process the directories it scans in parallel.
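A rough sketch of the parallel approach (the types and function below are illustrative, not the actual `detectFramework` implementation):
```ts
// Illustrative stand-ins for the real detection types.
interface DetectorFilesystem {
  hasPath(path: string): Promise<boolean>;
}
interface FrameworkCheck {
  slug: string;
  detect: (fs: DetectorFilesystem) => Promise<boolean>;
}

// Run every framework check concurrently instead of awaiting each one inside a
// `for` loop, so the networked filesystem calls (e.g. to GitHub's API) overlap.
async function detectInParallel(
  fs: DetectorFilesystem,
  checks: FrameworkCheck[]
): Promise<string | null> {
  const results = await Promise.all(
    checks.map(async (check) => ({
      slug: check.slug,
      matched: await check.detect(fs),
    }))
  );
  return results.find((result) => result.matched)?.slug ?? null;
}
```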
### 📋 Checklist
<!--
Please keep your PR as a Draft until the checklist is complete
-->
#### Tests
- [x] The code changed/added as part of this PR has been covered with tests
- [x] All tests pass locally with `yarn test-unit`
#### Code Review
- [x] This PR has a concise title and thorough description useful to a reviewer
- [x] Issue from task tracker has a link to this PR
The `@vercel/build-utils` package was meant to provide the shared functions necessary for writing a Vercel Builder (aka Runtime).
This package has since bloated into the catch-all package for anything that wasn't a Builder.
This PR removes the bloat in favor of a new package, `@vercel/fs-detectors`. It also removes the need for `@vercel/build-utils` to have a dependency on `@vercel/frameworks`.
- Related to #7951