Testing & Continuous Integration for Volar Integrations
High-quality language tooling demands automated verification. This guide outlines testing strategies at every layer, from unit tests that assert source-map correctness to full end-to-end LSP harnesses, plus tips for integrating them into CI pipelines.
Test Pyramid Overview
| Layer | What to test | Tools |
|---|---|---|
| Unit | CodeGen output, source maps, individual plugin hooks | Jest/Vitest/Mocha, snapshot assertions |
| Integration | Full language service behaviors (diagnostics, hover, completion) | vscode-languageserver/node test harness, custom scripts |
| End-to-end | Actual editor clients (VS Code extension, Neovim plugin) | VS Code Extension Tester, Neovim RPC tests, Playwright |
Unit Testing Building Blocks
CodeGen & Source Map Tests
import { CodeGen } from '@volar/code-gen';
test('maps interpolation correctly', () => {
  const codeGen = new CodeGen();
  // Plain text contributes no mappings; addCode records the source range and
  // the capabilities (here: navigation) attached to the generated segment.
  codeGen.addText('const ctx = ');
  codeGen.addCode('foo', { start: 10, end: 13 }, { navigation: true });
  expect(codeGen.getText()).toMatchInlineSnapshot(`"const ctx = foo"`);
  expect(codeGen.getMappings()).toEqual([
    expect.objectContaining({
      data: { navigation: true },
    }),
  ]);
});
Plugin Hook Tests
Mock TextDocument instances and call your plugin functions directly:
// Create an in-memory document (inside an async test body); no file on disk is needed.
const doc = TextDocument.create('file:///test.foo', 'foo', 0, 'foo: value');
// provide* hooks may return promises, so await the result.
const diagnostics = await fooServicePlugin.create(context).provideDiagnostics(doc);
expect(diagnostics).toMatchInlineSnapshot(`
  [
    {
      "message": "Unknown key: foo",
      "range": {
        "end": { "character": 3, "line": 0 },
        "start": { "character": 0, "line": 0 },
      },
      "severity": 2,
      "source": "foo-plugin",
    },
  ]
`);
Integration Testing Language Services
Create a mini harness that spins up your Volar language service and issues LSP requests.
import { createConnection, createServer, createSimpleProject } from '@volar/language-server/node';
test('diagnostics for foo files', async () => {
  const connection = createConnection();
  const server = createServer(connection);
  // Wire up the plugins exactly as the production entry point does.
  connection.onInitialize(params =>
    server.initialize(params, createSimpleProject([fooLanguagePlugin]), [fooServicePlugin]),
  );
  connection.listen();
  // Register the listener before opening the document so the first
  // publishDiagnostics notification is not missed. In a full harness the test
  // drives the client end of an in-memory stream pair; shown inline for brevity.
  const diagnosticsPromise = onceNotification(connection, 'textDocument/publishDiagnostics');
  connection.sendNotification('textDocument/didOpen', {
    textDocument: { uri: 'file:///test.foo', languageId: 'foo', version: 1, text: 'foo: value' },
  });
  const { diagnostics } = await diagnosticsPromise;
  expect(diagnostics).toHaveLength(1);
});
Tips:
- Use a one-shot listener (Node’s EventEmitter.once pattern) to await specific notifications.
- Wrap the connection in a helper that lets you send requests and capture responses from your language service without spinning up a real client; see the onceNotification sketch below.
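A minimal sketch of such a helper, assuming a vscode-languageserver Connection object (onceNotification is our own name, not a library API):
import type { Connection } from 'vscode-languageserver/node';

// Resolve with the params of the next notification for the given method,
// then stop listening. Hypothetical helper, not part of any Volar package.
function onceNotification<T = unknown>(connection: Connection, method: string): Promise<T> {
  return new Promise<T>(resolve => {
    const disposable = connection.onNotification(method, params => {
      disposable.dispose();
      resolve(params as T);
    });
  });
}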
VS Code Extension Testing
If you ship a VS Code extension:
- Use @vscode/test-electron to launch VS Code with your extension + Volar server.
- Prepare a sample workspace under test-fixtures/.
- Write integration tests that open files, trigger commands, and inspect diagnostics via VS Code’s extension API.
import * as assert from 'node:assert';
import * as vscode from 'vscode';
suite('Diagnostics smoke test', () => {
  test('foo file shows diagnostics', async () => {
    // FIXTURE is the absolute path of a sample file inside test-fixtures/.
    const doc = await vscode.workspace.openTextDocument(vscode.Uri.file(FIXTURE));
    await vscode.window.showTextDocument(doc);
    await waitForDiagnostics(doc.uri);
    const diagnostics = vscode.languages.getDiagnostics(doc.uri);
    assert.strictEqual(diagnostics.length, 1);
  });
});
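waitForDiagnostics above is not a VS Code API; one possible implementation, built on the onDidChangeDiagnostics event, could look like this:
import * as vscode from 'vscode';

// Hypothetical helper: resolves once VS Code reports at least one diagnostic
// for the URI, or rejects after a timeout so a broken server fails fast.
function waitForDiagnostics(uri: vscode.Uri, timeoutMs = 10_000): Promise<void> {
  return new Promise((resolve, reject) => {
    const hasDiagnostics = () => vscode.languages.getDiagnostics(uri).length > 0;
    if (hasDiagnostics()) return resolve();
    const timer = setTimeout(() => {
      listener.dispose();
      reject(new Error(`no diagnostics for ${uri} after ${timeoutMs}ms`));
    }, timeoutMs);
    const listener = vscode.languages.onDidChangeDiagnostics(event => {
      if (event.uris.some(u => u.toString() === uri.toString()) && hasDiagnostics()) {
        clearTimeout(timer);
        listener.dispose();
        resolve();
      }
    });
  });
}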
Neovim / Other Editor Testing
- Wrap your server in a CLI (node dist/server.js) and use Neovim’s RPC client to send LSP requests.
- Use libraries like vusted or nvim-lspconfig’s internal test harness for Lua-based tests.
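Editor-agnostic smoke tests can also drive the CLI entry point directly from Node, speaking raw LSP over stdio just as an editor client would. A sketch, assuming the server accepts the conventional --stdio flag:
import { spawn } from 'node:child_process';
import { createMessageConnection, StreamMessageReader, StreamMessageWriter } from 'vscode-jsonrpc/node';

test('server answers initialize over stdio', async () => {
  // Launch the bundled server exactly as an editor would.
  const child = spawn('node', ['dist/server.js', '--stdio']);
  const connection = createMessageConnection(
    new StreamMessageReader(child.stdout),
    new StreamMessageWriter(child.stdin),
  );
  connection.listen();
  const result = await connection.sendRequest('initialize', {
    processId: process.pid,
    rootUri: null,
    capabilities: {},
  });
  expect(result).toHaveProperty('capabilities');
  connection.dispose();
  child.kill();
});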
Snapshot Testing
- Use Jest/Vitest snapshots for diagnostics/completions/hovers. Keep fixtures small and redacted to avoid brittle diffs.
- When snapshots change, review carefully; schema or plugin updates often alter expected output.
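One way to keep snapshot diffs stable across machines is a custom serializer that redacts absolute fixture paths; a Vitest sketch (the fixtures location is illustrative):
import path from 'node:path';
import { expect } from 'vitest';

// Any string containing the fixtures directory is snapshotted with a stable
// token instead, so CI runners and dev machines produce identical snapshots.
const FIXTURE_ROOT = path.resolve('test-fixtures');

expect.addSnapshotSerializer({
  test: val => typeof val === 'string' && val.includes(FIXTURE_ROOT),
  print: val => `"${String(val).split(FIXTURE_ROOT).join('<fixtures>')}"`,
});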
Mocking External Dependencies
- Schema fetchers: stub fetch / readFile to return deterministic content (see the sketch below).
- File system watchers: mock fs.watch to emit synthetic events.
- TypeScript APIs: replace ts.createLanguageService with a fake service for unit tests.
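For example, a Vitest global stub that serves a canned schema instead of hitting the network (the schema body is illustrative):
import { vi } from 'vitest';

// Every fetch performed by the unit under test now resolves to the same
// canned schema. Requires Node 18+ for the global Response constructor.
vi.stubGlobal('fetch', vi.fn(async () =>
  new Response(JSON.stringify({ type: 'object', properties: {} }), { status: 200 }),
));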
Continuous Integration Pipeline
- Install dependencies – npm ci.
- Lint & type-check – npm run lint && npm run typecheck.
- Unit tests – npm test.
- Integration tests – run harness scripts (Node-based) and, if feasible, headless VS Code tests.
- Bundle check – build the server (npm run build) to ensure TypeScript compilation succeeds.
- Optional – run tsc --noEmit on your fixtures to ensure vueCompilerOptions / tsconfig references stay valid.
Example GitHub Actions Workflow
name: CI
on: [push, pull_request]
jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: 20
      - run: npm ci
      - run: npm run lint
      - run: npm run typecheck
      - run: npm test
      - run: npm run build
Regression Reproduction
- Capture the failing file(s) as fixtures and commit them under test-fixtures/.
- Add a test that reproduces the issue via the harness (e.g., send textDocument/hover and assert the response), as sketched below.
- Fix the bug, ensure the new test passes, and keep the fixture for future regressions.
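Assuming a connection established as in the stdio sketch earlier, such a regression test might look like this (fixture name and position are placeholders):
// Reproduces a previously reported hover failure; keep the test after the fix
// lands so the bug cannot silently return.
test('hover on nested key returns a result (regression)', async () => {
  const hover = await connection.sendRequest('textDocument/hover', {
    textDocument: { uri: 'file:///test-fixtures/nested-key.foo' },
    position: { line: 2, character: 4 },
  });
  expect(hover).not.toBeNull();
});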
Manual QA Checklist
Even with robust automated tests, run manual smoke tests before releases:
- Open a representative workspace in VS Code with your server extension.
- Verify diagnostics, completions, hovers, rename, code actions, references, and workspace diagnostics.
- Repeat in Neovim or another target editor if you support it.
- Run vue-tsc --noEmit (or your CLI equivalent) to ensure TypeScript-based checks align with editor behavior.
By investing in tests at every layer, your Volar integrations will remain stable as the ecosystem evolves. Automate everything from code-gen mapping checks to full VS Code sessions, and integrate the suite into CI so regressions never reach users.