Testing & Continuous Integration for Volar Integrations

High-quality language tooling demands automated verification. This guide walks through testing strategies at every layer, from unit tests that assert source-map correctness to full end-to-end LSP harnesses, plus tips for integrating them into CI pipelines.

Test Pyramid Overview

| Layer | What to test | Tools |
| --- | --- | --- |
| Unit | CodeGen output, source maps, individual plugin hooks | Jest/Vitest/Mocha, snapshot assertions |
| Integration | Full language service behaviors (diagnostics, hover, completion) | vscode-languageserver/node test harness, custom scripts |
| End-to-end | Actual editor clients (VS Code extension, Neovim plugin) | VS Code Extension Tester, Neovim RPC tests, Playwright |

Unit Testing Building Blocks

CodeGen & Source Map Tests

import { CodeGen } from '@volar/code-gen';

test('maps interpolation correctly', () => {
  const codeGen = new CodeGen();
  codeGen.addText('const ctx = ');  // generated-only text, not mapped back to the source
  codeGen.addCode('foo', { start: 10, end: 13 }, { navigation: true }); // maps to source offsets 10-13

  expect(codeGen.getText()).toMatchInlineSnapshot(`
"const ctx = foo"
`);
  expect(codeGen.getMappings()).toEqual([
    expect.objectContaining({
      data: { navigation: true },
    }),
  ]);
});

Plugin Hook Tests

Mock TextDocument instances and call your plugin functions directly (here, context stands in for your language service context; stub whatever your plugin reads from it):

import { TextDocument } from 'vscode-languageserver-textdocument';

const doc = TextDocument.create('file:///test.foo', 'foo', 0, 'foo: value');
const diagnostics = fooServicePlugin.create(context).provideDiagnostics(doc);
expect(diagnostics).toMatchInlineSnapshot(`
[
  {
    "message": "Unknown key: foo",
    "range": {
      "end": { "character": 3, "line": 0 },
      "start": { "character": 0, "line": 0 },
    },
    "severity": 2,
    "source": "foo-plugin",
  },
]
`);

Integration Testing Language Services

Create a mini harness that spins up your Volar language service and issues LSP requests.

import { createConnection } from 'vscode-languageserver/node';
import { createServer, createSimpleProject } from '@volar/language-server/node';

test('diagnostics for foo files', async () => {
  const connection = createConnection();
  const server = createServer(connection);
  // Wire `project` into the server's initialize handler as in your production setup.
  const project = createSimpleProject([fooLanguagePlugin]);

  // once(): test helper that resolves with the next notification of the given
  // type (see Tips below for the EventEmitter.once pattern).
  const diagnosticsPromise = once(connection, 'publishDiagnostics');
  connection.sendNotification('textDocument/didOpen', {
    textDocument: { uri: 'file:///test.foo', languageId: 'foo', text: 'foo: value', version: 1 },
  });

  const diagnostics = await diagnosticsPromise;
  expect(diagnostics.diagnostics).toHaveLength(1);
});

Tips:

  • Use Node's events.once pattern to await specific notifications.
  • Wrap the connection in a helper that lets you send requests and capture responses from your language service without spinning up a real client; a sketch of such a helper follows.
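
One way to build that helper is to cross-wire two in-memory streams so the test process acts as the client. This is a minimal sketch, not an official Volar API: createTestClient and onceNotification are hypothetical names, and it assumes the stream reader/writer classes from vscode-jsonrpc.

import { PassThrough } from 'node:stream';
import {
  createMessageConnection,
  StreamMessageReader,
  StreamMessageWriter,
} from 'vscode-jsonrpc/node';
import { createConnection } from 'vscode-languageserver/node';

export function createTestClient() {
  const clientToServer = new PassThrough();
  const serverToClient = new PassThrough();

  // Server side: hand this connection to createServer(...) as usual.
  const serverConnection = createConnection(
    new StreamMessageReader(clientToServer),
    new StreamMessageWriter(serverToClient),
  );

  // Client side: lets the test send requests and await notifications.
  const clientConnection = createMessageConnection(
    new StreamMessageReader(serverToClient),
    new StreamMessageWriter(clientToServer),
  );
  clientConnection.listen();

  // Resolves with the params of the next notification of the given method.
  const onceNotification = (method: string) =>
    new Promise<any>(resolve => {
      clientConnection.onNotification(method, params => resolve(params));
    });

  return { serverConnection, clientConnection, onceNotification };
}

Hand serverConnection to createServer(...) from the earlier example, then drive the whole session through clientConnection.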

VS Code Extension Testing

If you ship a VS Code extension:

  1. Use @vscode/test-electron to launch VS Code with your extension + Volar server.
  2. Prepare a sample workspace under test-fixtures/.
  3. Write integration tests that open files, trigger commands, and inspect diagnostics via VS Code's extension API. For example:
import * as assert from 'assert';
import * as vscode from 'vscode';

suite('Diagnostics smoke test', () => {
  test('foo file shows diagnostics', async () => {
    const doc = await vscode.workspace.openTextDocument(vscode.Uri.file(FIXTURE));
    await vscode.window.showTextDocument(doc);

    // waitForDiagnostics is a test helper, not a VS Code API (sketch below).
    await waitForDiagnostics(doc.uri);
    const diagnostics = vscode.languages.getDiagnostics(doc.uri);
    assert.strictEqual(diagnostics.length, 1);
  });
});
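
The waitForDiagnostics helper above is not part of the VS Code API; a minimal sketch built on vscode.languages.onDidChangeDiagnostics:

export function waitForDiagnostics(uri: vscode.Uri, timeoutMs = 10_000): Promise<void> {
  return new Promise((resolve, reject) => {
    // Resolve immediately if diagnostics were already published.
    if (vscode.languages.getDiagnostics(uri).length > 0) {
      return resolve();
    }
    const timer = setTimeout(() => {
      subscription.dispose();
      reject(new Error(`No diagnostics for ${uri} within ${timeoutMs}ms`));
    }, timeoutMs);
    const subscription = vscode.languages.onDidChangeDiagnostics(event => {
      if (event.uris.some(u => u.toString() === uri.toString())) {
        clearTimeout(timer);
        subscription.dispose();
        resolve();
      }
    });
  });
}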

Neovim / Other Editor Testing

  • Wrap your server in a CLI (node dist/server.js) and use Neovim's RPC client to send LSP requests.
  • Use libraries like vusted or nvim-lspconfig's internal test harness for Lua-based tests.

Snapshot Testing

  • Use Jest/Vitest snapshots for diagnostics/completions/hovers. Keep fixtures small and redacted to avoid brittle diffs; a serializer sketch follows after this list.
  • When snapshots change, review carefully; schema or plugin updates often alter expected output.
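
A snapshot serializer can handle the redaction automatically. The sketch below assumes Vitest (Jest's expect.addSnapshotSerializer behaves the same way) and strips the workspace root from string values:

import { expect } from 'vitest';

// Redact the absolute workspace root from string values so snapshots
// do not depend on where the repository is checked out.
const WORKSPACE_ROOT = process.cwd(); // adjust to your fixture root

expect.addSnapshotSerializer({
  test: value => typeof value === 'string' && value.includes(WORKSPACE_ROOT),
  serialize: value => `"${value.split(WORKSPACE_ROOT).join('<root>')}"`,
});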

Mocking External Dependencies

  • Schema fetchers: stub fetch/readFile to return deterministic content (example after this list).
  • File system watchers: mock fs.watch to emit synthetic events.
  • TypeScript APIs: replace ts.createLanguageService with a fake service for unit tests.
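
For example, a schema fetcher can be stubbed with Vitest's module mocking; the module path and fetchSchema function here are illustrative:

import { expect, test, vi } from 'vitest';
// Hypothetical module: point this at wherever your plugin loads schemas from.
import { fetchSchema } from '../src/schema-loader';

// Return deterministic schema content instead of hitting the network.
vi.mock('../src/schema-loader', () => ({
  fetchSchema: vi.fn().mockResolvedValue({
    properties: { bar: { type: 'string' } },
  }),
}));

test('uses the stubbed schema', async () => {
  const schema = await fetchSchema('https://example.com/foo.schema.json');
  expect(schema.properties.bar.type).toBe('string');
});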

Continuous Integration Pipeline

  1. Install dependencies: npm ci.
  2. Lint & type-check: npm run lint && npm run typecheck.
  3. Unit tests: npm test.
  4. Integration tests: run harness scripts (Node-based) and, if feasible, headless VS Code tests.
  5. Bundle check: build the server (npm run build) to ensure TypeScript compilation succeeds.
  6. Optional: run tsc --noEmit on your fixtures to ensure vueCompilerOptions/tsconfig references stay valid.

Example GitHub Actions Workflow

name: CI
on: [push, pull_request]
jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: 20
      - run: npm ci
      - run: npm run lint
      - run: npm run typecheck
      - run: npm test
      - run: npm run build

Regression Reproduction

  1. Capture the failing file(s) as fixtures and commit them under test-fixtures/.
  2. Add a test that reproduces the issue via the harness (e.g., send textDocument/hover and assert the response); see the sketch after this list.
  3. Fix the bug, ensure the new test passes, and keep the fixture for future regressions.
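
A sketch of such a regression test, reusing the hypothetical createTestClient helper from the integration section (the fixture content and position are illustrative):

// Reuses the hypothetical createTestClient() helper from the integration section.
test('hover for foo files stays stable', async () => {
  const { clientConnection } = createTestClient();

  clientConnection.sendNotification('textDocument/didOpen', {
    textDocument: { uri: 'file:///fixture.foo', languageId: 'foo', version: 1, text: 'foo: value' },
  });

  const hover = await clientConnection.sendRequest('textDocument/hover', {
    textDocument: { uri: 'file:///fixture.foo' },
    position: { line: 0, character: 1 },
  });

  // Pin the exact contents so future changes to hover output fail loudly.
  expect(hover).toMatchSnapshot();
});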

Manual QA Checklist

Even with robust automated tests, run manual smoke tests before releases:

  1. Open a representative workspace in VS Code with your server extension.
  2. Verify diagnostics, completions, hovers, rename, code actions, references, and workspace diagnostics.
  3. Repeat in Neovim or another target editor if you support it.
  4. Run vue-tsc --noEmit (or your CLI equivalent) to ensure TypeScript-based checks align with editor behavior.

Invest in tests at every layer and your Volar integrations will remain stable as the ecosystem evolves. Automate everything from code-gen mapping checks to full VS Code sessions, and integrate the suite into CI so regressions never reach users.