Introduction
Tix is a type checker for the Nix language. It infers types for your Nix code and catches errors statically — without running anything.
Most code needs zero annotations. When inference isn’t enough (e.g. typing lib from nixpkgs), you fill in the gaps with doc comments or .tix stub files.
Why
Nix is dynamically typed. This is fine for small configs but gets painful in larger codebases — you have to run code (or read it very carefully) to find type errors. Tix catches them statically.
The philosophy: infer as much as possible, but defer to lightweight annotations when it would be too hard to infer. Nix’s import system, with blocks, and the sheer size of nixpkgs make full inference impractical. Instead, Tix infers what it can and lets you fill in the gaps.
What you get
- Type inference — most code needs zero annotations
- Union types — if-then-else with different branches, heterogeneous lists
- Type narrowing — null checks, ? field guards, and is* builtins refine types in branches
- Row polymorphism — functions that access x.foo work on any attrset with a foo field
- Operator overloading — + on ints, floats, strings, and paths
- Doc comment annotations — when inference needs help
- .tix stub files — declare types for external code (nixpkgs lib, etc.)
- Stub generation — auto-generate stubs from NixOS/Home Manager option trees
- LSP — hover, completions, go-to-definition, rename, diagnostics, inlay hints, formatting
Quick Start
Get tix running on an existing Nix project in under five minutes.
1. Install
The fastest way — no install required:
nix run github:JRMurr/tix -- inspect my-file.nix
For permanent use, add to your flake’s devShell or install with nix profile:
nix profile install github:JRMurr/tix
See Getting Started for all installation options.
2. Initialize your project
From your project root:
tix init
This scans your .nix files, classifies them (NixOS module, Home Manager module, callPackage, etc.), and writes a tix.toml with context sections, stub generation config, and [project] includes patterns for LSP background analysis. Preview first with tix init --dry-run.
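The result might look roughly like this — a sketch only, using section names and keys shown elsewhere in these docs (the hypothetical glob patterns depend entirely on how your files are classified):

```toml
# Hypothetical tix.toml as scaffolded by `tix init` (sketch)
[context.nixos]
includes = ["hosts/**/*.nix"]
stubs = ["@nixos"]

[context.callpackage]
includes = ["pkgs/**/*.nix"]
stubs = ["@callpackage"]

[project]
includes = ["**/*.nix"]
```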
3. Stub generation
For flake projects, tix init auto-detects your nixpkgs (and home-manager) inputs from flake.lock and adds a [stubs.generate] section. On first run, tix builds rich type stubs from your nixpkgs (~30-60s); subsequent runs are cached.
For non-flake projects, add the section manually:
[stubs.generate]
# any nix expression that resolves to a nixpkgs path
nixpkgs = { expr = "(<import pinned_nixpkgs>).path" }
See Configuration > Runtime stub generation for details.
4. Check your project
tix check
This type-checks every .nix file in your project, applying the contexts from tix.toml. Files are processed in dependency order so types flow across imports.
Or use the LSP for inline feedback in your editor:
tix lsp
See LSP for editor setup instructions.
5. Suppress false positives
Tix won’t understand everything — some files may produce errors you want to silence for now.
Suppress all diagnostics for a file:
# tix-nocheck
{ config, lib, pkgs, ... }:
{
# nothing in this file is checked
}
Suppress a single line:
let
# tix-ignore
x = somethingTixDoesntUnderstand;
in
x
See Configuration > Suppression directives for more.
Getting Started
Install
Try without installing
The fastest way to try tix — no installation required:
nix run github:JRMurr/tix -- inspect my-file.nix
Nix Flake (recommended)
Add tix to your flake inputs:
{
inputs.tix.url = "github:JRMurr/tix";
}
Then add the package to your system configuration:
# configuration.nix (NixOS)
{ inputs, pkgs, ... }:
{
environment.systemPackages = [
inputs.tix.packages.${pkgs.system}.default
];
}
Or add it to a dev shell:
{
inputs.tix.url = "github:JRMurr/tix";
outputs = { self, nixpkgs, tix, ... }:
let
pkgs = nixpkgs.legacyPackages.x86_64-linux;
in {
devShells.x86_64-linux.default = pkgs.mkShell {
buildInputs = [
tix.packages.x86_64-linux.default
];
};
};
}
Or install imperatively with nix profile:
nix profile install github:JRMurr/tix
Without flakes
Add to a traditional NixOS configuration via fetchTarball:
# configuration.nix
let
tix = import (builtins.fetchTarball "https://github.com/JRMurr/tix/archive/main.tar.gz") {};
in
{
environment.systemPackages = [
tix.packages.${builtins.currentSystem}.default
];
}
Pin to a specific revision for reproducibility:
let
tix = import (builtins.fetchTarball {
url = "https://github.com/JRMurr/tix/archive/<rev>.tar.gz";
sha256 = "<hash>"; # nix-prefetch-url --unpack <url>
}) {};
in
tix.packages.${builtins.currentSystem}.default
Or install imperatively with nix-env:
nix-env -f https://github.com/JRMurr/tix/archive/main.tar.gz -iA packages.x86_64-linux.default
Build from source
git clone https://github.com/JRMurr/tix
cd tix
cargo build --release
# Binary at target/release/tix
Or with nix:
nix build .#
Usage
Type-check a file
tix inspect my-file.nix
This prints the inferred type of each top-level binding and the root expression.
With stubs
tix inspect my-file.nix --stubs ./my-stubs/
--stubs accepts file paths or directories (recursively finds .tix files). Can be passed multiple times. The built-in nixpkgs stubs are loaded by default — use --no-default-stubs to disable.
Initialize a project
Scaffold a tix.toml by scanning your project for .nix files:
tix init
This classifies each file (NixOS module, Home Manager module, callPackage, overlay, etc.) and generates context sections automatically. Use --dry-run to preview without writing:
tix init --dry-run
Check a project
Type-check all files in a project using the tix.toml configuration:
tix check
This discovers all .nix files, applies context from tix.toml, and type-checks them in parallel using layered inference. Files are sorted by their import dependencies so that types flow between files — if a.nix imports b.nix, b.nix is inferred first and a.nix gets its real type. It also validates that file classifications match their configured contexts (e.g., warns if a NixOS module isn’t in any [context.nixos] section).
tix check --verbose # Show file classifications
tix check --config path/to/tix.toml # Explicit config path
tix check -j 4 # Limit to 4 parallel inference threads
Exit code is 1 if any type errors are found, 0 otherwise (config warnings don’t affect the exit code).
Machine-readable output
For CI pipelines and tool integration, use --format json to get structured JSON output on stdout:
tix inspect my-file.nix --format json
tix check --format json
The JSON schema includes diagnostics with file paths, 1-indexed line/column locations, severity, error codes, and documentation URLs. Single-file mode also includes inferred bindings and the root type.
{
"version": 1,
"files": [
{
"file": "my-file.nix",
"diagnostics": [
{
"severity": "error",
"code": "E001",
"message": "type mismatch: expected `string`, got `int`",
"line": 5,
"column": 3,
"end_line": 5,
"end_column": 8,
"url": "https://jrmurr.github.io/tix/diagnostics/e001.html"
}
]
}
],
"summary": {
"files_checked": 1,
"errors": 1,
"warnings": 0
},
"bindings": { "x": "int" },
"root_type": "int"
}
The bindings and root_type fields are only present in single-file mode. The version field allows for future schema evolution.
Full type output
By default, large types are truncated for readability (fields, union members, nesting depth, and total characters are bounded). To see complete types without truncation:
tix inspect my-file.nix --full-types
Timing and profiling
Show per-phase timing and RSS memory usage:
tix inspect my-file.nix --timing
tix check --timing
This prints a breakdown of wall-clock time and memory for each pipeline phase (registry loading, parsing, name resolution, inference, diagnostics).
For detailed heap profiling, build with the dhat-heap feature:
cargo build --release --features dhat-heap
tix inspect my-file.nix # produces dhat-heap.json
View the result at dhat-viewer.
Generate stubs
Generate typed stubs from your NixOS or Home Manager configuration:
# From a flake
tix stubs generate nixos --flake . --hostname myhost -o nixos.tix
tix stubs generate home-manager --flake . --username jr -o hm.tix
# From nixpkgs directly
tix stubs generate nixos --nixpkgs /path/to/nixpkgs -o nixos.tix
See the Stubs chapter for details.
LSP
tix lsp
Communicates over stdin/stdout. Works with any LSP-compatible editor. Stubs are loaded from tix.toml and editor settings.
Features: hover types, completions (dot access, function args, identifiers), go-to-definition, find references, rename, inlay hints, document symbols, semantic tokens, formatting (via nixfmt).
Type System
Tix infers types from your code — most Nix code needs zero annotations. This page covers what types Tix understands and how they work in practice.
Primitives
| Type | Nix values |
|---|---|
| int | 1, 42, -3 |
| float | 3.14, 1.0 |
| string | "hello", ''multi-line'' |
| bool | true, false |
| path | ./foo, /nix/store/... |
| null | null |
Functions
Functions are inferred from usage. The parameter type comes from how it’s used in the body, and the return type is whatever the body produces.
# id :: a -> a
id = x: x;
# apply :: (a -> b) -> a -> b
apply = f: x: f x;
# negate :: bool -> bool
negate = x: !x;
Nix functions are curried — f: x: f x is a function that takes f and returns a function that takes x.
Callable attrsets (__functor)
In Nix, an attrset with a __functor field can be called as a function. The __functor field must be a function that takes the attrset itself (self) as its first argument, followed by the actual parameter:
let
counter = {
__functor = self: x: self.base + x;
base = 10;
};
in counter 5 # 15
Tix understands this calling convention. Callable attrsets can be passed to higher-order functions that expect functions:
let
apply = f: f 1;
obj = { __functor = self: x: x + 1; };
in apply obj # 2
Union types
When an expression can produce different types, tix infers a union.
# if-then-else with different branches
x = if cond then 1 else "fallback";
# x :: int | string
# heterogeneous lists
xs = [ 1 "two" null ];
# xs :: [int | string | null]
Unlike Rust enums or Haskell sum types, unions don’t need to be declared upfront — they’re inferred automatically from the code.
Type narrowing
When a condition checks whether a variable is null, has a specific field, or is a particular type, tix narrows the variable’s type in each branch. This prevents false errors from idiomatic guard patterns.
Null guards
getName = drv:
if drv == null then "<none>"
else drv.name;
# getName :: { name: a, ... } -> a | string
# drv is null in then-branch, non-null in else-branch
HasAttr (?) guards
getField = arg:
if arg ? escaped then arg.escaped
else if arg ? unescaped then arg.unescaped
else null;
# each branch narrows arg to have the checked field
Only single-key attrpaths are supported (x ? field, not x ? a.b.c).
Type predicate guards
All is* builtins are recognized as narrowing guards, whether called directly (isString x), qualified (builtins.isString x), or through a select chain (lib.isString x). In the then-branch, the variable is narrowed to the corresponding type. In the else-branch, the checked type is excluded:
dispatch = x:
if isString x then builtins.stringLength x
else if isInt x then x + 1
else if isBool x then !x
else null;
# each branch sees x as the appropriate type
Structural predicates (isAttrs, isList, isFunction) narrow in the then-branch only — isAttrs x narrows x to an attrset, etc. Else-branch narrowing for these is not yet supported.
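A sketch of the then-branch-only behavior (hypothetical function name):

```nix
# isAttrs narrows x only in the then-branch:
unwrap = x:
  if builtins.isAttrs x
  then x.value  # x is an attrset here, so x.value is allowed
  else x;       # x keeps its original (unnarrowed) type here
```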
Supported narrowing conditions
- x == null / null == x / x != null / null != x
- isNull x / builtins.isNull x
- isString x / builtins.isString x / lib.isString x (and similarly for all is* builtins)
- x ? field / builtins.hasAttr "field" x — narrows x to have the field in the then-branch, not have it in the else-branch
- !cond — flips the narrowing
- assert cond; body — narrows in the body
- cond1 && cond2 — both narrowings apply in the then-branch
- cond1 || cond2 — both narrowings apply in the else-branch
Boolean combinators
&& and || combine multiple narrowing conditions:
# &&: both guards hold in the then-branch
safeGet = x:
if x != null && x ? name then x.name
else "default";
# then-branch: x is non-null AND has field `name`
# ||: both guards fail in the else-branch
dispatch = x:
if isString x || isInt x then doSomething x
else x;
# else-branch: x is neither string nor int
&& and || also apply short-circuit narrowing to sub-expressions. Since a && b only evaluates b when a is true, b is inferred under a’s then-branch narrowing:
# ||: x is non-null in the RHS (runs when x == null is false)
safe = x: x == null || x + 1 > 0;
# &&: x is non-null in the RHS (runs when x != null is true)
safe = x: x != null && isString x.name;
Conditional library functions
Several nixpkgs lib functions take a boolean guard as their first argument and only evaluate the second argument when the guard is true. Tix recognizes these and applies narrowing to the guarded argument:
{ x }:
let
# x.name is safe — tix narrows x to non-null in the second argument
name = lib.optionalString (x != null) x.name;
in name
Recognized functions:
- lib.optionalString / lib.strings.optionalString
- lib.optionalAttrs / lib.attrsets.optionalAttrs
- lib.optional / lib.lists.optional
- lib.mkIf
The detection is name-based, so lib.strings.optionalString, lib.optionalString, and a bare optionalString from with lib; are all recognized.
Row polymorphism (open attrsets)
Functions that access attrset fields get inferred types that are open — they accept any attrset that has the required fields.
# getName :: { name: a, ... } -> a
getName = x: x.name;
# works on any attrset with a `name` field
getName { name = "alice"; age = 30; } # "alice"
getName { name = 42; extra = true; } # 42
The ... in the inferred type means “and maybe other fields.” This is how Nix’s pattern destructuring works too:
# greet :: { name: string, ... } -> string
greet = { name, ... }: "hello ${name}";
Optional fields (pattern defaults)
When a lambda pattern has fields with defaults (? value), those fields are marked as optional in the inferred type. Callers can omit optional fields without triggering a missing-field error.
# mkGreeting :: { name: string, greeting?: string } -> string
mkGreeting = { name, greeting ? "hello" }: "${greeting} ${name}";
mkGreeting { name = "alice"; } # "hello alice"
mkGreeting { name = "bob"; greeting = "hey"; } # "hey bob"
Optional fields are shown with a ? suffix in the inferred type. Required fields (no default) still produce an error if omitted:
# This is fine — `y` is optional:
({ x, y ? 0 }: x + y) { x = 1; } # 1
# This errors — `y` is required:
({ x, y }: x + y) { x = 1; } # error: missing field `y`
Attrset merge (//)
The merge operator produces a type that combines both sides. The right side wins for overlapping fields.
base = { a = 1; b = "two"; };
override = { b = 3; c = true; };
merged = base // override;
# merged :: { a: int, b: int, c: bool }
Operator overloading
+ is overloaded across several types:
| Left | Right | Result |
|---|---|---|
| int | int | int |
| float | float | float |
| string | string | string |
| path | path | path |
| path | string | path |
| string | path | string |
Other arithmetic operators (-, *, /) work on int and float.
When tix can see the concrete types of the operands, it resolves the overload immediately. When the types are still polymorphic (e.g. in a generic function), resolution is deferred until more information is available.
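A sketch of how deferral plays out, assuming the deferred overload combines with let generalization as described in the next section (hypothetical binding name):

```nix
let
  # x is still polymorphic here, so the `+` overload is deferred
  double = x: x + x;
in {
  a = double 2;     # resolved at this call: int + int → int
  b = double "ab";  # resolved at this call: string + string → string
}
```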
Let polymorphism
Bindings introduced with let are generalized — they can be used at different types in the body.
let
id = x: x;
in {
a = id 1; # int
b = id "hello"; # string
}
Each use of id gets a fresh copy of the type, so id can be applied to both int and string without conflict.
Recursive bindings
Tix handles recursive and mutually recursive definitions by analyzing dependency structure and inferring each group together.
let
fib = n: if n < 2 then n else fib (n - 1) + fib (n - 2);
in fib 10
# fib :: int -> int
Builtins
Tix knows the types of ~75 Nix builtins. Some examples:
builtins.map :: (a -> b) -> [a] -> [b]
builtins.filter :: (a -> bool) -> [a] -> [a]
builtins.head :: [a] -> a
builtins.attrNames :: { ... } -> [string]
builtins.length :: [a] -> int
builtins.typeOf :: a -> string
Unknown builtins get a fresh type variable — they won’t cause errors, but they won’t provide type information either.
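For instance (hypothetical builtin name — not a real Nix builtin):

```nix
# tix doesn't know this builtin, so there's no error,
# but the result is just a fresh type variable:
x = builtins.someFutureBuiltin 1;
# x :: ?
```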
Unknown types (?)
When a binding’s entire type is unconstrained, tix displays it as ? instead of a letter:
craneLib :: ? # unconstrained — entire type is unknown
id :: a -> a # compound type — letters preserved
const :: a -> b -> a # compound type — all params get letters
Lambda parameters always keep letter names since they represent genuine polymorphism. Other bindings (let, attrset fields) show ? when their entire inferred type is a single unconstrained variable.
Type Annotations
TLDR: Annotate bindings with doc comments when inference isn’t enough. Three flavors: nixdoc-style # Type sections, inline /** type: name :: Type */ block comments, and # type: name :: Type line comments.
When you need annotations
Most code doesn’t need annotations — tix infers types from usage. Annotations help when:
- You’re importing code via a path tix can’t resolve (e.g. import <nixpkgs>, dynamic paths)
- You want to constrain a binding to a specific type
- You want to document your API
Doc comment format
Nixdoc-style (multiline)
Follows the nixdoc convention. The type goes in a fenced code block under a # Type heading:
/**
Concatenate a list of strings with a separator.
# Type
```
concatStringsSep :: string -> [string] -> string
```
*/
concatStringsSep = sep: list: builtins.concatStringsSep sep list;
Inline type annotation
For quick one-liners, use either block comments or line comments:
/** type: lib :: Lib */
lib = import <nixpkgs/lib>;
/** type: add :: int -> int -> int */
add = a: b: a + b;
Line comments work too — handy for attrset pattern parameters where /** feels heavy:
{
# type: pkgs :: Pkgs
pkgs,
# type: lib :: Lib
lib,
}: ...
The type: prefix distinguishes annotations from regular comments.
Assigning type aliases
When you import something typed by stubs, you assign it a type alias:
/** type: lib :: Lib */
lib = import <nixpkgs/lib>;
/** type: pkgs :: Pkgs */
pkgs = import <nixpkgs> {};
Lib and Pkgs are type aliases defined in the built-in stubs (or your custom .tix files). This tells tix “trust me, this import produces a value of this type.”
Type expression syntax
The same syntax works in doc comments and .tix stub files.
Casing matters: lowercase names like a and b are generic type variables (implicitly universally quantified), while uppercase names like Foo or Lib are references to type aliases. This is how the parser tells them apart — val id :: a -> a means “for any type a”, whereas val f :: Lib -> Lib means “takes and returns the specific Lib type”. This is also why module lib { ... } generates a capitalized alias Lib: the module’s name is lowercase (matching Nix convention), but its type alias must be uppercase to be usable in type expressions.
| Syntax | Meaning |
|---|---|
| int, string, bool, float, path, null | Primitives |
| Int, String, Bool, Float, Path, Null | Uppercase aliases (same as lowercase) |
| a, b (lowercase) | Generic type variables |
| Foo (uppercase) | Type alias reference |
| [a] | List of a |
| a -> b | Function (right-associative) |
| a \| b | Union |
| a & b | Intersection |
| { name: string, age: int } | Closed attrset |
| { name: string, ... } | Open attrset |
| { _: int } | Dynamic field type (all values are int) |
| typeof varname | Inferred type of a binding |
| typeof import("./path.nix") | Inferred root type of another file |
| import("./path.nix").Name | Type declaration from another file |
| Param(T) | Parameter type of a function type |
| Return(T) | Return type of a function type |
| T.key | Field type from an attrset type |
Precedence (low to high): -> then | then & then atoms. Use parens to override.
(int | string) -> bool # function from union to bool
(int -> int) & (string -> string) # intersection of two function types
Param(typeof f) # parameter type of f
See Cross-File Types for details on typeof, type operators, and cross-file type imports.
Uppercase primitives
Nixpkgs doc comments conventionally use uppercase names like String, Bool, and Int. Tix recognizes these as aliases for the lowercase primitives, so both string and String work in annotations.
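For example (hypothetical binding name), these two annotations mean the same type:

```nix
/** type: shout :: String -> String */
shout = s: s + "!";

/** type: shout2 :: string -> string */
shout2 = s: s + "!";
```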
Inline Type Aliases
You can define type aliases directly in a .nix file using doc comments, without needing a separate .tix stub file:
/** type Derivation = { name: string, src: path, ... }; */
# type Nullable = a | null;
let
/** type: mkDrv :: { name: string, ... } -> Derivation */
mkDrv = { name, src, ... }: { inherit name; system = "x86_64-linux"; };
in ...
Both block (/** ... */) and line (# type Foo = ...;) comments work. The syntax is exactly the same as in .tix stub files — type Name = TypeExpr;.
Inline aliases are file-scoped (visible everywhere in the file regardless of placement) and shadow any aliases with the same name from loaded stubs.
Disambiguation: type: (with a colon) triggers a binding annotation (type: name :: Type). type (with a space followed by an uppercase letter) triggers an alias declaration (type Name = ...;).
Annotation safety
Tix checks that annotations are compatible with the inferred types. In a few cases, annotations are accepted without full verification — a warning is emitted so you know the annotation is trusted rather than checked:
Arity mismatch. If the annotation has fewer arrows than the function’s visible lambda parameters (e.g. foo :: string -> string on a two-argument function x: y: ...), the annotation is skipped. An annotation with more arrows than visible lambdas is fine — the body may return a function.
Union types. Annotations containing union types (e.g. f :: string -> (string | [string]) -> string) are currently trusted without verification. The function is still type-checked based on its body alone.
Intersection types (overloaded functions). Annotations with intersection function types (e.g. (int -> int) & (string -> string)) are accepted as declared types for callers but the individual overloads aren’t verified against the body. This is useful for declaring overloaded APIs where the implementation dispatches with type guards (builtins.isInt, etc.).
Named alias display in functions
When a function annotation references type aliases for its parameters, tix propagates the alias names through the function’s lambda structure. This means the alias name is displayed instead of the expanded structural type:
# In a .tix stub:
# type BwrapArg = { escaped: string, ... } | string;
/** type: renderArg :: BwrapArg -> string */
renderArg = arg: ...;
Hovering over renderArg displays BwrapArg -> string rather than the fully expanded union/intersection type. This works for curried functions too — each parameter position preserves its alias name independently.
Cross-File Types
Tix supports type-level operators and cross-file type sharing, reducing the need for external .tix stub files for your own code.
typeof — Reference Inferred Types
Use typeof varname in type annotations to reference the inferred type of a binding:
let
scope = { mkDerivation = ...; lib = ...; };
/** type: narrowed :: typeof scope */
narrowed = scope;
in narrowed
The referenced binding must be in an earlier SCC group (already inferred). Mutually recursive bindings cannot use typeof on each other.
Type Operators
Param(T) and Return(T)
Extract the parameter or return type from a function type:
# Given: type F = int -> string -> bool;
# Param(F) → int
# Return(F) → string -> bool
# Param(Return(F)) → string
These work on type aliases, typeof results, and any type expression that resolves to a function:
let
f = a: a + 1;
/** type: x :: Param(typeof f) */
x = 42; # constrained to int (f's parameter type)
in x
Field Access (T.key)
Extract the type of a field from an attrset type:
# Given: type Config = { name: string, age: int };
# Config.name → string
# Config.age → int
Chained access works: Config.meta.name extracts nested fields.
Cross-File Type Imports
import("./path.nix").TypeName
Import a type declaration from another file’s doc comments:
# lib.nix
/** type Input = { name: string, src: path, ... }; */
/** type Output = { name: string, system: string }; */
{ name, src, ... }: { inherit name; system = "x86_64-linux"; }
# consumer.nix
/** type: args :: import("./lib.nix").Input */
args: args.name
If the type declaration is a simple type (e.g. { name: string }), this only reads the target file’s doc comments — no inference required.
If the type declaration uses typeof (e.g. type Scope = typeof scope;), Tix runs partial inference on the target file — only the SCC groups needed to infer the referenced binding. This breaks potential cycles: even if file A imports file B at runtime, file B can still import A’s type exports as long as the typeof target doesn’t depend on B.
typeof import("./path.nix")
Reference the inferred root type of another file:
# a.nix
{ x = 1; y = "hello"; }
# b.nix
/** type: data :: typeof import("./a.nix") */
data: data.x + 1
This does require inference of the target file. The same cycle detection as regular import applies — if A typeof-imports B and B imports A, it’s an error.
Composition
All operators compose:
# Extract the parameter type of a function from another file
/** type: buildInput :: Param(typeof import("./build.nix")) */
# Access a field on an imported type
/** type: lib :: import("./scope.nix").Scope.lib */
# Chain operators
/** type: x :: Return(Return(typeof f)) */
When to Use Stubs vs Cross-File Types
| Scenario | Recommendation |
|---|---|
| External deps (nixpkgs, etc.) | .tix stub files |
| Your own project’s shared types | Doc comment type declarations + import("path").TypeName |
| Referencing inferred types within a file | typeof varname |
| Extracting parts of complex types | Param(T), Return(T), T.key |
Stubs
TLDR: .tix files declare types for external Nix code — like TypeScript’s .d.ts files. Tix ships with built-in stubs for common nixpkgs functions, and you can generate stubs from NixOS/Home Manager option trees.
What are stubs?
Nix’s import system makes full-program inference impractical. You’re not going to infer all of nixpkgs. Stubs let you declare types for code that lives outside your project.
tix inspect my-file.nix --stubs ./my-stubs/
--stubs takes a file or directory (recursively finds .tix files). Can be passed multiple times. Built-in stubs load by default (--no-default-stubs to disable).
Writing stubs
Basic syntax
# Line comments
# Type aliases — lowercase vars are implicitly generic
type Derivation = { name: string, system: string, ... };
type Nullable = a | null;
# Value declarations
val mkDerivation :: { name: string, src: path, ... } -> Derivation;
# Modules — nest values and create type aliases from the module name
module lib {
val id :: a -> a;
module strings {
val concatStringsSep :: string -> [string] -> string;
}
}
# ^ creates type alias "Lib" = { id: a -> a, strings: { concatStringsSep: ... }, ... }
Type expressions
Same syntax as doc comment annotations — see Type Annotations.
Modules create type aliases
When you write module foo { ... }, tix auto-generates a type alias Foo (capitalized) representing the attrset type of that module’s contents. This is how Lib and Pkgs work in the built-in stubs.
Top-level val declarations
Top-level val declarations (outside any module) provide types for unresolved names automatically — no annotation needed in your Nix code:
val mkDerivation :: { name: string, ... } -> Derivation;
# No annotation needed — mkDerivation is resolved from stubs
mkDerivation { name = "my-pkg"; src = ./.; }
Built-in stubs
Tix ships with stubs for common nixpkgs functions. These are compiled into the binary and loaded by default. They cover:
- Pkgs: mkDerivation, stdenv.mkDerivation, fetchurl, fetchFromGitHub, runCommand, writeText, etc.
- Lib: ~500 declarations covering strings, lists, attrsets, trivial, fixedPoints, options, modules, fileset, filesystem, path, sources, versions, debug, generators, customisation, meta, asserts, gvariant, network, and more. Generated from noogle.dev data.
- Derivation: type alias for { name: string, system: string, builder: path | string, ... }
Use --no-default-stubs if you want to replace them entirely with your own.
Built-in context stubs
When used in a tix.toml context, @-prefixed stub names refer to built-in context sources:
| Stub | Source | Provides |
|---|---|---|
| @nixos | Compiled-in NixOS context stubs | config, lib, pkgs, options, modulesPath |
| @home-manager | Compiled-in Home Manager context stubs | config, lib, pkgs, osConfig |
| @callpackage | Derived from Pkgs module alias | All fields from module pkgs in the built-in stubs (stdenv, fetchurl, lib, mkDerivation, etc.) |
@callpackage doesn’t require a separate stub file. It extracts the fields of the Pkgs type alias (created by module pkgs { ... } in the built-in stubs) and provides them as context args. This is the same mechanism that any module foo { ... } declaration uses: @foo resolves to Foo.
Generating stubs from NixOS/Home Manager
Tix can generate stubs from NixOS options, Home Manager options, and nixpkgs package sets. This gives you typed access to config, lib, pkgs, and other parameters in your Nix files.
From a flake
# NixOS options
tix stubs generate nixos --flake . --hostname myhost -o nixos.tix
# Home Manager options
tix stubs generate home-manager --flake . --username jr -o hm.tix
From nixpkgs directly
tix stubs generate nixos --nixpkgs /path/to/nixpkgs -o nixos.tix
Options
| Flag | Description |
|---|---|
| --flake PATH | Flake directory to evaluate |
| --hostname NAME | NixOS hostname (required if multiple configurations) |
| --username NAME | Home Manager username (required if multiple configurations) |
| --nixpkgs PATH | Path to nixpkgs (default: <nixpkgs> from NIX_PATH) |
| --from-json PATH | Read pre-computed option tree JSON instead of running nix eval |
| -o, --output PATH | Output file (default: stdout) |
| --max-depth N | Maximum recursion depth for option tree walking (default: 8) |
| --descriptions | Include option descriptions as doc comments |
Generating pkgs stubs
For callPackage-style files, you can auto-generate val declarations for all of nixpkgs:
tix stubs generate pkgs -o generated-pkgs.tix
This evaluates nixpkgs and classifies each attribute:
- Derivations become val hello :: Derivation;
- Non-derivation attrsets become val xorg :: { ... };
- Functions become val callPackage :: a -> b;
Sub-package-sets like llvmPackages, python3Packages, and xorg that have recurseForDerivations = true are recursed into and emitted as nested modules:
module pkgs {
val hello :: Derivation;
module python313Packages {
val numpy :: Derivation;
val pandas :: Derivation;
}
val python3Packages :: Python313Packages;
val writeText :: a -> b;
}
Alias detection: Nixpkgs uses dontRecurseIntoAttrs on alias package sets (e.g. python3Packages = dontRecurseIntoAttrs python313Packages). When a non-recursed attrset has recurseForDerivations explicitly set to false and its builtins.attrNames matches a recursed sibling, tix emits a type alias reference (val python3Packages :: Python313Packages;) instead of an opaque { ... }. This gives alias sets the same typed fields as their targets.
Use --max-depth to control recursion depth (default: 1). Higher values give more coverage but increase eval time — python3Packages alone has ~10k attributes. Use --max-depth 0 for flat output (no recursion).
The output is a module pkgs { ... } block that merges with the hand-curated module pkgs in the built-in stubs, extending the Pkgs type alias with thousands of additional fields. Since @callpackage derives its context from Pkgs, the generated packages are picked up automatically.
```shell
# Generate from specific nixpkgs
tix stubs generate pkgs --nixpkgs /path/to/nixpkgs -o generated-pkgs.tix

# Recurse deeper into sub-package-sets
tix stubs generate pkgs --max-depth 2 -o generated-pkgs.tix

# Flat output (no sub-package-set recursion, like pre-v0.x behavior)
tix stubs generate pkgs --max-depth 0 -o generated-pkgs.tix

# From pre-computed JSON (for reproducibility or CI)
tix stubs generate pkgs --from-json classified.json -o generated-pkgs.tix
```
Load the generated file via --stubs or the stubs config key:
```toml
stubs = ["./generated-pkgs.tix"]

[context.callpackage]
includes = ["pkgs/**/*.nix"]
stubs = ["@callpackage"]
```
Using generated stubs with tix.toml
Once generated, point your tix.toml at them. See Configuration.
Source annotations
Stub declarations can carry @source annotations that link back to the
original source file. When present, go-to-definition in the LSP jumps
directly to the nixpkgs (or home-manager) source instead of landing in
the generated .tix file.
Syntax
```
@source <source-id>:<relative-path>:<line>:<column>
```
For example:
```
@source nixpkgs:lib/trivial.nix:61:8
val id :: a -> a;
```
`@source` can appear before `val`, `type`, and `module` declarations, as
well as on individual attrset fields (used in NixOS/Home Manager option stubs).
How it works
When stubs are generated with --source-root nixpkgs=/nix/store/...-source,
absolute Nix store paths from builtins.unsafeGetAttrPos are stripped against
the root to produce relative paths. At LSP startup the source root is resolved
(typically from the flake lock) so that go-to-definition can open the real file.
When using [stubs.generate] in tix.toml, source roots are passed
automatically — no manual --source-root flags needed.
Using stubs in your code
Assign stub types to imports via doc comments:
```nix
let
  /** type: lib :: Lib */
  lib = import <nixpkgs/lib>;

  /** type: pkgs :: Pkgs */
  pkgs = import <nixpkgs> {};

  greeting = lib.strings.concatStringsSep ", " ["hello" "world"];
  drv = pkgs.stdenv.mkDerivation { name = "my-package"; src = ./.; };
in
{ inherit greeting drv; }
```
Now `lib.strings.concatStringsSep` is typed as `string -> [string] -> string`, and `drv` is typed as `Derivation`.
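With the stub loaded, misuse of the stubbed function is caught statically. A sketch (exact diagnostic wording may differ):

```nix
let
  /** type: lib :: Lib */
  lib = import <nixpkgs/lib>;
in
# E001: concatStringsSep expects [string] as its second argument, not [int]
lib.strings.concatStringsSep ", " [ 1 2 3 ]
```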
Configuration
TLDR: tix.toml maps file paths to contexts (like @nixos or @home-manager), controlling which stubs get loaded and how module parameters are typed.
tix.toml
Tix auto-discovers tix.toml by walking up from the file being checked. You can also pass --config path/to/tix.toml explicitly.
Contexts
A context tells tix “files matching these paths are NixOS modules (or Home Manager modules, etc.)” so it knows how to type the standard { config, lib, pkgs, ... }: parameter pattern.
```toml
[context.nixos]
includes = ["modules/*.nix", "hosts/**/*.nix"]
stubs = ["@nixos"]

[context.home-manager]
includes = ["home/*.nix"]
stubs = ["@home-manager"]
```
- `includes` — glob patterns matching files in this context
- `excludes` — glob patterns for files to exclude even when `includes` matches. Useful when a broad glob like `dir/**/*.nix` covers a directory with a few files that belong to a different context.
- `stubs` — which stub sets to load. `@nixos` and `@home-manager` are built-in references to the generated NixOS/Home Manager stubs (requires `[stubs.generate]` or the `TIX_BUILTIN_STUBS` env var)
For example, if most files under common/ are NixOS modules but common/homemanager/ contains Home Manager modules:
```toml
[context.nixos]
includes = ["common/**/*.nix", "hosts/**/*.nix"]
excludes = ["common/homemanager/**/*.nix"]
stubs = ["@nixos"]

[context.home-manager]
includes = ["common/homemanager/**/*.nix"]
stubs = ["@home-manager"]
```
tix init generates excludes patterns automatically when it detects mixed-kind directories.
What contexts do
When a file matches a context, tix automatically types the module’s function parameters. A NixOS module like:
```nix
{ config, lib, pkgs, ... }:
{
  services.foo.enable = true;
}
```

gets `config`, `lib`, and `pkgs` typed according to the context's stubs, without any doc comment annotations in the file.
callPackage / dependency-injected files
For files loaded via callPackage or import that take a package set as their parameter:
```toml
[context.callpackage]
includes = ["pkgs/**/*.nix"]
stubs = ["@callpackage"]
```
`@callpackage` derives its types from the built-in `Pkgs` module (the same one that types `pkgs.stdenv.mkDerivation`, `pkgs.fetchurl`, etc.). Parameters not covered by the built-in stubs remain untyped. For broader coverage, generate pkgs stubs and load them via `--stubs` or the `stubs` config key — they merge into the `Pkgs` type alias automatically.
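As a sketch (the file name and its contents are hypothetical), a file matched by such a context gets its injected parameters typed without annotations:

```nix
# pkgs/hello-wrapper.nix (matched by the callpackage context's includes)
{ stdenv, fetchurl, lib }:
# stdenv, fetchurl, and lib are typed from the corresponding Pkgs fields
stdenv.mkDerivation {
  name = "hello-wrapper";
  src = fetchurl {
    url = "https://example.org/hello.tar.gz";
    sha256 = "0000000000000000000000000000000000000000000000000000";
  };
}
```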
Inline context annotation
You can also set context per-file with a doc comment at the top:
```nix
/** context: nixos */
{ config, lib, pkgs, ... }:
{
  # ...
}
```
Project settings
The [project] section configures project-level behavior for both the LSP and tix check.
```toml
[project]
includes = ["lib/*.nix", "pkgs/**/*.nix"]
excludes = ["result", ".direnv", "vendor/**"]
```
- `includes` — glob patterns for files to include in analysis. When the LSP starts, these files are analyzed in the background and their inferred types become ephemeral stubs available to all open files.
- `excludes` — glob patterns for files/directories to skip during `tix check`. Excluded files are fully skipped from discovery. Hardcoded ignores (`.git`, `node_modules`, `result`, `.direnv`, `target`) are always applied.
tix init generates a [project] section with sensible defaults.
Files matching includes are processed in the background after LSP initialization. As each file’s type is inferred, any open files that import it are automatically re-analyzed with the updated type information.
Suppression directives
Tix supports TypeScript-style comment directives for suppressing diagnostics:
# tix-nocheck
Suppresses all diagnostics for the entire file. Place anywhere in the file:
```nix
# tix-nocheck
{ config, lib, pkgs, ... }:
{
  # This file will not report any type errors
  services.foo.enable = 42;
}
```
# tix-ignore
Suppresses diagnostics on the next line only:
```nix
let
  # tix-ignore
  x = (1 + 2).foo; # no error reported for this line
  y = (3 + 4).bar; # this line still reports errors
in
x
```
Diagnostics
Control the severity of optional diagnostics. Currently the only configurable diagnostic is `unknown_type` (E014), which fires when a binding has type `?`.
```toml
[diagnostics]
unknown_type = "hint" # "off", "hint", "warning", or "error" (default: "hint")
```
The LSP editor settings (tix.diagnostics.unknownType) take precedence over tix.toml when both are set.
Runtime stub generation
Tix can generate full NixOS, Home Manager, and pkgs stubs at runtime on first use. The result is cached in the Nix store and reused on subsequent runs.
```toml
[stubs.generate]
nixpkgs = "/nix/store/...-nixpkgs-src"
home-manager = "/nix/store/...-home-manager-src"
```
Each source can be a direct store path or a Nix expression:
```toml
[stubs.generate]
nixpkgs = { expr = "(builtins.getFlake (toString ./.)).inputs.nixpkgs" }
home-manager = { expr = "(builtins.getFlake (toString ./.)).inputs.home-manager" }
```
- `nixpkgs` (required) — path to nixpkgs source, or `{ expr = "..." }` to evaluate
- `home-manager` (optional) — path to home-manager source; omit to skip HM stubs
On first run, tix invokes nix build to generate .tix stubs from the NixOS option tree, Home Manager options, and nixpkgs package set. This takes 30-60 seconds. Subsequent runs are instant thanks to a lightweight file cache (~/.cache/tix/store-stubs/). Changing either nixpkgs or tix version triggers regeneration.
[stubs.generate] can coexist with manual stub paths:
```toml
[stubs]
paths = ["./my-extra-stubs/"]

[stubs.generate]
nixpkgs = { expr = "(builtins.getFlake (toString ./.)).inputs.nixpkgs" }
```
Resolution priority:
1. `TIX_BUILTIN_STUBS` env var (always wins)
2. `[stubs.generate]` runtime generation
3. Compiled-in minimal stubs
Requirements: The tix binary must be installed via Nix (running from `/nix/store/...`). In dev mode (`cargo build`), use `TIX_BUILTIN_STUBS` or `nix build .#stubs` instead.
Generating tix.toml
Run tix init to automatically generate a tix.toml for your project:
```shell
tix init                  # Generate tix.toml in current project
tix init --dry-run        # Preview without writing
tix init --yes            # Overwrite existing tix.toml
tix init /path/to/project # Specify project directory
```
The command scans all .nix files, classifies each by its structural signals (parameter names, body references, attrset keys), and generates context sections mapping file paths to the appropriate stubs. For flake projects, it also auto-detects nixpkgs and home-manager inputs from flake.lock and generates the [stubs.generate] section.
No-module escape hatch
If tix incorrectly treats a file as a module, add this comment to disable module-aware features:
```nix
/** no-module */
```
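For example (a sketch; the file and its bindings are hypothetical), a helper that takes a `config` parameter without being a module:

```nix
/** no-module */
# Without the directive, the parameter names below could make tix
# treat this file as a module and type `config` from context stubs.
{ config, extraArgs }:
{
  merged = config // extraArgs;
}
```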
LSP
TLDR: tix lsp provides IDE features over the Language Server Protocol. Run it, point your editor at it.
Running
```shell
tix lsp
```
Communicates over stdin/stdout. Stubs are loaded from tix.toml (auto-discovered from the workspace root) and editor settings.
Features
| Feature | What it does |
|---|---|
| Hover | Shows inferred type and doc comments |
| Completion | Attrset field access (.), function args, expected-type fields in nested values (list elements, nested attrsets), identifiers, inherit targets |
| Signature Help | Parameter names and types when calling functions; highlights the active parameter for curried calls |
| Go to Definition | Jump to let bindings, lambda params, imports, cross-file field definitions (including callPackage-style patterns and transitive barrel re-exports), any path literal (including directory→default.nix resolution), and NixOS/Home Manager config option definitions via @source annotations |
| Go to Type Definition | Jump to the .tix stub file where a type alias is declared, or to the original source of a config field via @source. Works on any name or expression whose inferred type is a named alias (e.g. Derivation, Lib). Only available for stubs loaded from disk. |
| Find References | All uses of a name in the file, plus cross-file usages (x.name in files that import this file) |
| Rename | Refactor bindings and their references; cross-file rename updates x.field select expressions in open files that import the renamed file |
| Inlay Hints | Inline type annotations after binding names |
| Document Symbols | Outline of let bindings and lambda params |
| Workspace Symbols | Search for symbols across all open files |
| Document Links | Clickable import and callPackage paths |
| Semantic Tokens | Syntax highlighting based on name kind |
| Selection Range | Smart expand/shrink selection |
| Document Highlight | Highlight all uses of the name under cursor |
| Code Actions | Quick fixes: add missing field, add type annotation, remove unused binding |
| Formatting | Runs nixfmt |
| Diagnostics | Type errors, missing fields, import resolution errors — each with a stable error code |
Diagnostics
When diagnostics are enabled ("diagnostics": { "enable": true }), tix reports:
- Type errors (ERROR): type mismatches (E001), invalid operators (E003), invalid attrset merges (E004)
- Missing fields (ERROR): accessing a field that doesn’t exist on a closed attrset (E002)
- Unresolved names (WARNING): references to names that can’t be resolved (E005)
- Import errors (WARNING): `import ./missing.nix` where the target file doesn't exist (E007), angle bracket imports like `<nixpkgs>` (E012), or files that haven't been analyzed (E013)
- Inference aborted (WARNING): when type inference is aborted due to memory pressure (E008)
- Unknown type (configurable): bindings whose type is `?` (E014) — default severity: hint
Every diagnostic has a stable error code (e.g. E001) that links to documentation. In VS Code, click the code in the Problems panel to open the docs page.
Import errors appear at the import expression so you can see which import failed and why. The CLI (tix) shows the same diagnostics with error codes in Rust-style format: error[E001]: message.
Code Actions
Code actions (quick fixes / refactorings) are offered based on diagnostics and cursor position:
- **Add missing field** (quick fix): when you access a field that doesn't exist on a closed attrset (e.g. `x.bar` where `x = { foo = 1; }`), offers to insert `bar = throw "TODO";` into the attrset definition. Only works when the attrset definition is visible in the same file.
- **Add type annotation** (refactor): when the cursor is on a let-binding or rec-attrset field that has an inferred type, offers to insert a `/** type: name :: <type> */` doc comment above the binding. Skipped if an annotation already exists.
- **Remove unused binding** (quick fix): when a let-binding has no references in the file, offers to remove the entire `name = value;` line. Names starting with `_` are excluded (conventional "unused" prefix in Nix).
CLI flags
--log-level
Controls the log level for tix crates (default: info). Useful for debugging background analysis, import resolution, or inference behavior. The RUST_LOG environment variable takes precedence if set.
```shell
tix lsp --log-level debug # see per-file background analysis, import details
tix lsp --log-level warn  # quieter, only warnings and errors
tix lsp --log-level trace # maximum verbosity
```
--mem-limit
The LSP sets an RSS (resident memory) limit at startup to prevent runaway inference from consuming all system memory. The default is 80% of system RAM (detected via sysconf; falls back to 3200 MiB if detection fails). A hard RLIMIT_AS backstop is set to 2.5× the RSS limit to accommodate virtual address space overhead.
Override with the --mem-limit flag (value in MiB, sets the RSS limit directly) or the TIX_MEM_LIMIT environment variable:
```shell
tix lsp --mem-limit 8192    # 8 GiB RSS limit
tix lsp --mem-limit 0       # no limit
TIX_MEM_LIMIT=8192 tix lsp  # 8 GiB (env var, lower priority than --mem-limit)
```
When process RSS exceeds the limit, inference bails out early — returning partial results instead of crashing. Background analysis of project files is also paused when RSS is high.
Editor setup
VS Code
Install the Nix IDE extension, then configure it to use tix lsp.
Minimal setup — add to your .vscode/settings.json (workspace) or user settings:
```json
{
  "nix.enableLanguageServer": true,
  "nix.serverPath": ["tix", "lsp"]
}
```
With extra stubs and initialization options:
```json
{
  "nix.enableLanguageServer": true,
  "nix.serverPath": ["tix", "lsp"],
  "nix.serverSettings": {
    "stubs": ["./my-stubs"],
    "inlayHints": { "enable": true },
    "diagnostics": { "enable": true, "unknownType": "hint" }
  }
}
```
Neovim (nvim-lspconfig)
```lua
vim.api.nvim_create_autocmd("FileType", {
  pattern = "nix",
  callback = function()
    vim.lsp.start({
      name = "tix",
      cmd = { "tix", "lsp" },
    })
  end,
})
```
Initialization options
The LSP accepts configuration via initializationOptions. How you pass these depends on your editor — in VS Code they go under nix.serverSettings, in Neovim they go in the init_options field of vim.lsp.start():
```json
{
  "stubs": ["/path/to/extra/stubs"],
  "inlayHints": { "enable": true },
  "diagnostics": { "enable": true }
}
```
Limitations
Things tix doesn’t support yet, or handles imperfectly.
Language features
with blocks
Nested with blocks resolve names inner-to-outer, matching Nix runtime semantics:
```nix
with a; with b;
# names resolve against b first, then a if b doesn't have the field
x
```
If no with scope has the field, a MissingField error is reported.
Literal / singleton types
Tix doesn't have literal types. `"circle"` is typed as `string`, not as the literal `"circle"`. This means you can't do TypeScript-style discriminated unions:

```nix
# tix sees this as { type: string, radius: int } | { type: string, width: int }
# not { type: "circle", ... } | { type: "rect", ... }
```
Enum option types in generated NixOS stubs also become string for this reason.
Dynamic field access
Dynamic attrset field access (`x.${name}`) uses a general dynamic field type but can't track which specific field is being accessed.
Type narrowing
Narrowing works well for most common patterns (see Type System), but has some gaps:
- Structural predicates (`isAttrs`, `isList`, `isFunction`) only narrow in the then-branch. The else-branch doesn't exclude these types.
- Multi-element attrpaths: `x ? a.b.c` doesn't narrow — only single-key `x ? field` works.
- Value equality: `if x == "foo"` doesn't narrow `x` to the literal `"foo"` (no literal types).
- Overloaded function annotations: intersection-type annotations (e.g. `(int -> int) & (string -> string)`) are trusted, not verified per-branch.
- Recursive narrowing: using `isFunction x` in one branch and recursing from another can cause false positives because both branches share the same type variable.
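To illustrate the attrpath gap (a sketch based on the rules above; `getName` and `getDeep` are hypothetical):

```nix
let
  # Works: single-key `?` guard, so x is narrowed to have `name`
  # in the then-branch
  getName = x: if x ? name then x.name else "anon";

  # Doesn't narrow: multi-element attrpath, so x is not refined
  # and x.a.b.c may still report a missing field
  getDeep = x: if x ? a.b.c then x.a.b.c else null;
in
[ (getName { name = "n"; }) (getDeep { a.b.c = 1; }) ]
```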
Cross-file inference
- Imports without stubs are inferred as `any` (the top type). For precise cross-file types, use `[project] analyze` in `tix.toml` or write `.tix` stubs.
- Overloaded operators (like `+` with polymorphic arguments) don't survive file boundaries. If a generic function using `+` is imported from another file, the overload may not resolve correctly.
Recursive attrsets
rec { ... } works but types that refer to themselves can produce verbose output in some cases.
Stubs
- The built-in lib stubs cover common functions but not all of nixpkgs lib. Unstubbed functions get a fresh type variable (no error, just no type info).
- Home Manager flake mode stub generation is less tested than NixOS.
- lib function stubs are curated from noogle.dev data — there’s no auto-generation from nixpkgs lib source yet.
Diagnostics Reference
Every tix diagnostic has a stable error code. Codes never change meaning once assigned.
| Code | Severity | Description |
|---|---|---|
| E001 | Error | Type mismatch |
| E002 | Error | Missing field |
| E003 | Error | Invalid binary operator |
| E004 | Error | Invalid attribute set merge |
| E005 | Warning | Unresolved name |
| E006 | Warning | Duplicate key |
| E007 | Warning | Import target not found |
| E008 | Warning | Inference aborted |
| E009 | Warning | Annotation arity mismatch |
| E010 | Warning | Annotation accepted but not verified |
| E011 | Warning | Annotation parse error |
| E012 | Warning | Angle bracket import |
| E013 | Hint | Imported file not analyzed |
| E014 | Configurable | Type could not be inferred |
| E015 | Error | Invalid string interpolation |
E001: Type Mismatch
Severity: Error
```
type mismatch: expected `string`, got `int`
```
A value was used where a different type was expected. This is the most common error – it means the inferred type of an expression is incompatible with how it’s being used.
Common causes
- Passing an argument of the wrong type to a function: `let f = x: x + "hello"; in f 42` (`f` expects a string because of the `+ "hello"`, but got an int)
- Returning inconsistent types from `if`/`else` branches when the consumer expects a specific type.
- Accessing a field and using it as the wrong type.
Hints
E001 may include a contextual hint when tix recognizes a common pattern:

- **String coercion** — When a non-string value (e.g., a derivation or path) is passed where `string` is expected, Nix would silently coerce it at runtime. Tix flags this so you can make the conversion explicit:

  ```nix
  # Before (E001):
  lib.optionalString true myDerivation
  # Fix with toString:
  lib.optionalString true (toString myDerivation)
  # Or with string interpolation (works for paths and derivations):
  lib.optionalString true "${myDerivation}"
  ```

  String interpolation (`"${...}"`) works for paths and attrsets (derivations) but not for `int`, `bool`, `float`, or `null` — use `toString` for those.

- **String interpolation as path** — When a string interpolation like `"${expr}/suffix"` is used where a `path` is expected, tix suggests using path concatenation (`expr + "/suffix"`) instead.
How to fix
- Check the expected and actual types in the error message and trace back which expression produces the wrong type.
- If both branches of an `if`/`else` should return the same type, make sure they do.
- If the mismatch comes from an import, add a type annotation to clarify the imported value's type.
E002: Missing Field
Severity: Error
```
missing field `naem`, did you mean `name`?
```
An attribute set access (`x.field` or `x.field or default`) refers to a field that does not exist on the inferred type of the set. When a similarly-named field exists, tix suggests it.
Common causes
- Typo in the field name.
- Accessing a field that exists in some code paths but not others (e.g. the set comes from a branch that doesn’t always include the field).
- The attribute set was declared as closed (no `...`) and the field was omitted.
How to fix
- Check the suggested correction if one is shown.
- If the attribute set should accept arbitrary fields, mark it as open with `...`: `{ name, version, ... }: name`
- If the field is conditionally present, use `x ? field` to guard the access or use `x.field or defaultValue`.
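Putting these fixes together (a sketch; `pkg` is a hypothetical binding):

```nix
let pkg = { name = "hello"; }; in
{
  bad = pkg.version;           # E002: missing field `version`
  ok1 = pkg.version or "0.0";  # default value guards the access
  ok2 = if pkg ? version then pkg.version else "0.0";
}
```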
E003: Invalid Binary Operator
Severity: Error
```
cannot apply `+` to `string` and `int`
```
A binary operator was applied to operands whose types don’t support that operation. Nix’s + is overloaded (works on ints, floats, strings, and paths), but the two operands must be compatible.
Common causes
- Mixing strings and numbers with `+` without converting: `"count: " + 42` (string + int is not allowed)
- Using arithmetic operators (`-`, `*`, `/`) on non-numeric types.
- Using `//` (attribute set merge) where `+` (addition or concatenation) was intended, or vice versa.
How to fix
- For string concatenation with non-strings, use string interpolation: `"count: ${toString 42}"`
- Make sure both operands of arithmetic operators are numeric (`int` or `float`).
- Use `++` for list concatenation, `//` for attribute set merging.
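A compact summary of which operator fits which types (a sketch):

```nix
{
  # "count: " + 42               # E003: `+` on string and int
  str   = "count: ${toString 42}";  # interpolate after toString
  nums  = 1 + 2.5;                  # int + float is fine
  lists = [ 1 ] ++ [ 2 ];           # ++ concatenates lists
  sets  = { a = 1; } // { b = 2; }; # // merges attrsets
}
```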
E004: Invalid Attribute Set Merge
Severity: Error
```
cannot merge `int` with `{ name: string }`: both sides must be attribute sets
```
The // (update) operator requires both operands to be attribute sets. This error appears when one or both sides have a non-attrset type.
Common causes
- Using `//` on a value that isn't an attribute set: `42 // { x = 1; }` (int is not an attrset)
- A variable intended to hold an attrset actually holds a different type due to a logic error upstream.
- Merging the return value of a function that doesn’t return an attrset.
How to fix
- Verify that both sides of `//` are attribute sets.
- If one side is conditional, make sure all branches return an attrset: `(if cond then { a = 1; } else {}) // { b = 2; }`
- Check whether you meant `+` (concatenation/addition) instead of `//` (merge).
E005: Unresolved Name
Severity: Warning
```
unresolved name `pkgs`
```
A variable name was referenced but could not be found in any enclosing scope. The resulting type is unconstrained (?).
Common causes
- The binding is defined in a scope that isn't visible (e.g. a different `let` block).
- The name comes from an import or function parameter that tix can't resolve, such as NixOS module arguments (`config`, `pkgs`, `lib`).
- Typo in the variable name.
How to fix
- If it's a module parameter, configure a context in `tix.toml` so tix knows the parameter types:

  ```toml
  [context.nixos]
  includes = ["modules/*.nix"]
  stubs = ["@nixos"]
  ```

- If it comes from an unresolvable import, add a type annotation:

  ```nix
  /** type: pkgs :: Pkgs */
  pkgs = import <nixpkgs> {};
  ```

- Check for typos in the variable name.
E006: Duplicate Key
Severity: Warning
```
duplicate key `name` in binding set
```
The same key appears more than once in a let block or attribute set. Nix silently allows this (the last definition wins), but it is almost always a mistake.
Common causes
- Copy-paste error where a binding was duplicated.
- Two `inherit` clauses pulling in the same name.
- A large attrset where the same key was defined in different sections.
How to fix
- Remove the duplicate definition. The diagnostic points to the second occurrence and links to the first.
- If both definitions are intentional (rare), restructure the code so the intent is clear – e.g. use `//` to explicitly override.
E007: Import Not Found
Severity: Warning
```
import target not found: ./missing.nix
```
An import expression references a file path that does not exist on disk.
Common causes
- The imported file was moved or renamed without updating the import.
- The path is relative and the working directory differs from what was expected.
- The file hasn’t been created yet.
How to fix
- Verify the file path is correct relative to the importing file.
- If the file is generated or fetched at build time, tix won't see it during static analysis. Add a type annotation instead:

  ```nix
  /** type: generated :: { version: string, ... } */
  generated = import ./generated.nix;
  ```
E008: Inference Aborted
Severity: Warning
```
type inference aborted (memory limit exceeded) -- missing types for: `bigFunction`, `helper`
```
Type inference for this file was aborted because the process exceeded the memory (RSS) limit. Bindings that were inferred before the limit was reached still have their types; only the remaining bindings are affected.
Common causes
- Very large files with many bindings or deeply nested expressions that consume excessive memory.
- Highly polymorphic recursive functions that cause the type graph to grow beyond the memory budget.
How to fix
- Split large files into smaller modules. This also improves incremental re-checking.
- Add type annotations to complex bindings to reduce the work the inference engine needs to do.
E009: Annotation Arity Mismatch
Severity: Warning
```
annotation for `add` has arity 1 but expression has 2 parameters; skipping
```
The number of arrows in the type annotation is less than the number of visible lambda parameters on the function. The annotation is skipped and inference proceeds without it.
Common causes
- The annotation has fewer arrows than the function has parameters:

  ```nix
  /** type: add :: int -> int */
  add = x: y: x + y;  # 2 params, but annotation has 1 arrow
  ```

- The function was refactored to take more parameters without updating the annotation.
How to fix
- Update the annotation to match the function's arity:

  ```nix
  /** type: add :: int -> int -> int */
  add = x: y: x + y;
  ```

- Note: an annotation with more arrows than visible parameters is fine – the body may return a function.
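For instance, this curried definition should be accepted (a sketch; `adder` is a hypothetical binding):

```nix
let
  # 1 visible parameter, 2 arrows: fine, since the body
  # returns a function (builtins.add is curried)
  /** type: adder :: int -> int -> int */
  adder = x: builtins.add x;
in
adder 1 2
```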
E010: Annotation Accepted but Not Verified
Severity: Warning
```
annotation for `dispatch` accepted but not verified: intersection-of-function annotations are accepted as declared types but not verified against the body
```
The type annotation was accepted and will be used as the binding’s type for callers, but tix did not verify that the function body actually conforms to the annotation.
Common causes
- Intersection type annotations (overloaded function signatures):

  ```nix
  /** type: dispatch :: (int -> int) & (string -> string) */
  dispatch = x: if builtins.isInt x then x + 1 else x + "!";
  ```

- Union types in annotations are also currently trusted without verification.
How to fix
- This warning is informational – it tells you the annotation is trusted, not checked. No fix is required.
- If you want verification, simplify the annotation to a non-intersection type where possible.
- Make sure your implementation actually matches the declared overloads, since tix won’t catch mismatches here.
E011: Annotation Parse Error
Severity: Warning
```
type annotation for `helper` failed to parse: expected `->` or end of input
```
A doc comment type annotation was recognized (it starts with `type:` or has a `# Type` section) but the type expression could not be parsed.
Common causes
- Syntax error in the type expression: `/** type: f :: int -> -> string */` (double arrow)
- Using syntax that isn't supported in tix type expressions (e.g. Haskell-style type classes).
- Missing closing bracket or paren: `/** type: f :: [int -> string */` (missing `]`)
How to fix
- Check the type expression for valid syntax.
- Fix the parse error indicated in the message. Common fixes:

  ```nix
  /** type: f :: int -> string */          # simple function
  /** type: g :: { name: string, ... } */  # open attrset
  /** type: h :: (int | string) -> bool */ # union needs parens before ->
  ```
E012: Angle Bracket Import
Severity: Warning
```
cannot resolve angle bracket import `<nixpkgs>` -- add a type annotation or stub
```
An import <path> expression uses an angle bracket path, which requires NIX_PATH resolution. Tix does not implement NIX_PATH lookup, so the imported value’s type is unknown.
Automatically resolved paths
When the default stubs are loaded, tix automatically resolves these common angle bracket imports:
| Path | Resolved type |
|---|---|
| `<nixpkgs>` | `{ ... } -> Pkgs` — the nixpkgs top-level function |
| `<nixpkgs/lib>` | `Lib` — the nixpkgs lib attrset |
These use the Pkgs and Lib type aliases from the built-in stubs (stubs/lib.tix). No annotation or configuration is needed for these imports — they just work.
Common causes
This warning appears for angle bracket imports that tix cannot resolve automatically:
- Custom Nix search path entries like `<unstable>`, `<nixos>`, or `<home-manager>`.
- Nixpkgs subpaths other than `lib`, such as `<nixpkgs/nixos/lib/eval-config.nix>`.
How to fix
- Add a type annotation to the binding:

  ```nix
  /** type: pkgs :: Pkgs */
  pkgs = import <nixpkgs> {};

  /** type: lib :: Lib */
  lib = import <nixpkgs/lib>;
  ```

- Or configure a context in `tix.toml` so module parameters are typed automatically, avoiding the need to import with angle brackets in the first place.
- Load stubs that define the `Pkgs` and `Lib` type aliases.
E013: Imported File Not Analyzed
Severity: Hint
```
imported file `./utils.nix` has not been analyzed -- add it to [project] analyze in tix.toml or open it in the editor
```
An import ./path.nix resolved to a real file, but that file hasn’t been type-checked yet. The imported value’s type is unconstrained (?). This is a hint, not a warning – it’s normal in unconfigured projects.
When does this appear?
In most cases, the LSP resolves imports automatically: when you open a file that imports another, the imported file is inferred from disk on demand. E013 only appears when demand-driven inference fails or is unavailable – for example, if the imported file has a parse error that prevents inference.
Common causes
- The imported file has syntax errors that prevent parsing.
- The import target is generated or doesn’t exist on disk yet.
How to fix
- Fix any syntax errors in the imported file.
- If the file is generated, add it to the `analyze` list so it's pre-analyzed:

  ```toml
  [project]
  analyze = ["lib/*.nix", "utils/*.nix"]
  ```

- Or open the imported file in your editor – the LSP will analyze it and re-check dependents automatically.
E014: Unknown Type
Severity: Configurable (default: hint)
```
type of `result` could not be inferred -- consider adding a type annotation or stub
```
A non-parameter binding has an unconstrained type variable (?), meaning tix couldn’t determine its type from usage. Parameter bindings are excluded – a bare type variable on a parameter is normal (it means the parameter is polymorphic).
Common causes
- The binding’s value comes from an unresolved import or an angle-bracket import.
- The binding is assigned from a function whose return type is unknown.
- The binding is unused, so there are no constraints to infer from.
How to fix
- Add a type annotation:

  ```nix
  /** type: result :: { name: string, ... } */
  result = someUnknownFunction arg;
  ```

- Load stubs that define the types of external dependencies.
- If the binding is unused, consider removing it.
Configuring severity
E014 severity is controlled by LSP settings. In VS Code:
```json
{
  "tix.diagnostics.unknownType": "hint" // "error", "warning", "hint", or "off"
}
```
Set to "off" to suppress these diagnostics entirely.
E015: Invalid String Interpolation
Severity: Error
```
`int` cannot be used in string interpolation; use `toString` to convert it explicitly
```
A value of a type that Nix cannot interpolate was used inside "${...}". Nix string interpolation only works for:
- strings (identity)
- paths (converted to absolute path string)
- derivations (attrsets with `outPath` — converted to store path)

It does not work for `int`, `bool`, `float`, `null`, lists, or functions — these cause a runtime error like `cannot coerce an integer to a string`.
How to fix
Wrap the expression in toString:
```nix
# Before (E015):
"count: ${1 + 2}"

# After:
"count: ${toString (1 + 2)}"
```
The LSP offers a quick fix to insert toString automatically.
Internals
This section covers implementation details for contributors and anyone curious about how tix works under the hood. None of this is needed to use tix effectively.
Workspace crates
Six crates under crates/, listed in pipeline order:
| Crate | Role |
|---|---|
| `lang_ast` | Parse Nix via rnix, lower to Tix AST, name resolution, SCC grouping |
| `lang_ty` | Type representation: `Ty<R, VarType>` during inference, `OutputTy` for display |
| `comment_parser` | Parse type annotations from doc comments and `.tix` stub files |
| `lang_check` | SimpleSub type inference engine — the core of the project |
| `lsp` | LSP server: hover, completions, go-to-def, diagnostics, rename, etc. |
| `cli` | CLI entry point, project-level batch checking |
Pipeline overview
Type-checking a Nix file flows through six phases:
```mermaid
flowchart TD
    subgraph lang_ast
    A[Nix source] --> B["① Parse & lower
rnix CST → Tix AST (Expr/Name arenas)
+ source maps (AstPtr ↔ ExprId)"]
    B --> C["② Name resolution
scope tree, reference → definition"]
    C --> D["③ SCC grouping
Tarjan's on binding dependency graph"]
    end
    subgraph lang_check
    D --> E["④ Type inference
a. Pre-allocate TyIds for all names/exprs
b. Apply stub/annotation types
c. Per SCC: infer → constrain → extrude
d. Infer root expression"]
    E --> F["⑤ Canonicalize
Ty‹TyId› → OutputTy (polarity-aware)"]
    end
    F --> G["⑥ Output
CLI prints types / LSP serves requests"]
```
Phase 1: Parse & lower
Entry: lang_ast::module_and_source_maps(db, file)
Nix source is parsed by rnix into a Rowan CST, then lowered to Tix’s own AST. The AST uses arena allocation — every expression and name gets an ExprId / NameId index into flat vectors. A bidirectional ModuleSourceMap links AST nodes back to source positions for LSP features and error reporting. Doc comments are gathered during lowering and inline type aliases (type Foo = ...;) are extracted.
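As a toy illustration of the arena layout (a Python list standing in for the Rust arenas, with `ExprId` becoming a plain integer index — not tix's actual code):

```python
# Arena-allocated AST sketch: every expression lives in one flat list and
# children are referenced by index, not by pointer. This mirrors how ExprId
# indexes into the Expr arena (illustrative shapes, not tix's real types).
exprs = []

def alloc(node):
    """Append a node to the arena and return its id (its index)."""
    exprs.append(node)
    return len(exprs) - 1

one = alloc(("lit", 1))
two = alloc(("lit", 2))
add = alloc(("binop", "+", one, two))   # children stored as ids 0 and 1

assert exprs[add] == ("binop", "+", 0, 1)
```

Because nodes are plain indices, a source map is just another table keyed by the same ids.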
Phase 2: Name resolution
Entry: lang_ast::name_resolution(db, file)
Two sub-phases:
- Scope building — walks the AST to create a scope tree. Each `let`, recursive attrset, and lambda introduces a scope with its defined names. `with` expressions create special scopes that defer lookup to the environment value.
- Reference resolution — for each `Expr::Reference`, looks up the name through ancestor scopes. Results are one of: a local definition (`NameId`), a builtin (e.g. `null`, `map`), a `with`-environment lookup, or unresolved. A reverse index (`NameId → Vec<ExprId>`) is also built for find-references / rename.
Phase 3: SCC grouping
Entry: lang_ast::group_def(db, file)
Builds a dependency graph between bindings (which name references which other name) and runs Tarjan’s algorithm to compute strongly connected components. Each SCC becomes a DependentGroup — a set of mutually-recursive definitions that must be inferred together. Non-recursive bindings get their own single-element group. Groups are topologically sorted so each group is inferred only after its dependencies.
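As an illustrative sketch (Python rather than tix's Rust, with toy binding names as assumptions), the grouping step amounts to running Tarjan's algorithm over the binding dependency graph. Tarjan conveniently pops sink components first, so the emitted order already puts dependencies before their dependents:

```python
import itertools

# Toy binding dependency graph: name -> names its definition references.
# `even` and `odd` are mutually recursive; `main` uses both.
deps = {
    "even": {"odd"},
    "odd": {"even"},
    "main": {"even", "odd"},
    "pi": set(),
}

def tarjan(graph):
    """Return SCCs; components come out with dependencies before dependents."""
    index, low, on_stack, stack, out = {}, {}, set(), [], []
    counter = itertools.count()

    def visit(v):
        index[v] = low[v] = next(counter)
        stack.append(v); on_stack.add(v)
        for w in graph[v]:
            if w not in index:
                visit(w)
                low[v] = min(low[v], low[w])
            elif w in on_stack:
                low[v] = min(low[v], index[w])
        if low[v] == index[v]:          # v is the root of an SCC: pop it
            comp = set()
            while True:
                w = stack.pop(); on_stack.discard(w); comp.add(w)
                if w == v:
                    break
            out.append(frozenset(comp))

    for v in graph:
        if v not in index:
            visit(v)
    return out

groups = tarjan(deps)
# {even, odd} forms one mutually-recursive group, inferred before main's group.
```

Each resulting component plays the role of a `DependentGroup`: inferred as a unit, in the emitted order.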
Phase 4: Type inference
Entry: lang_check::check_file(db, file)
This is where SimpleSub runs. The main orchestrator (CheckCtx::infer_prog_partial) proceeds in stages:
Pre-allocation. A fresh TyId is allocated for every name and expression in the module upfront. This lets recursive definitions reference types before they’re fully inferred.
Stub application. If the entry expression is a lambda with doc-comment annotations (e.g. /** type: lib :: Lib */), those types are applied to the parameter slots before inference begins, so they flow into all downstream bindings.
Per-SCC iteration. For each group:
- Enter a new binding level for let-polymorphism.
- Infer each definition via `infer_expr` — a single-pass walk over the AST that allocates type variables and calls `constrain(sub, sup)` inline as it discovers subtyping relationships.
- Resolve deferred constraints — overloaded operators (`+`, `*`, etc.), `with`-environment lookups, and attrset merges are resolved once enough type information has accumulated.
- Extrude and generalize — variables created at this level are copied to fresh variables at the parent level with bounds linked via constraints. This is SimpleSub’s replacement for the traditional HM generalize/instantiate pair.
Root inference. The module’s entry expression is inferred and any remaining pending constraints are resolved.
Phase 5: Canonicalization
Entry: Collector::finalize_inference()
Converts the internal bounds-based representation (Ty<TyId>) to a display-ready OutputTy tree. This is polarity-aware:
- Positive positions (outputs, covariant) — a variable expands to the union of its lower bounds. A variable bounded by `{int, string}` becomes `int | string`.
- Negative positions (inputs, contravariant) — a variable expands to the intersection of its upper bounds.
Negation types are normalized using Boolean algebra (De Morgan, double-negation elimination, contradiction/tautology detection). The result is an InferenceResult mapping every NameId and ExprId to its OutputTy.
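The polarity rule can be sketched in a few lines (illustrative Python with a dict-based variable representation — an assumption, not tix's actual `TyId` storage):

```python
# Polarity-aware display of a type variable: union of lower bounds in output
# (covariant) position, intersection of upper bounds in input (contravariant)
# position. The dict shape here is a toy stand-in for tix's bound storage.
def display_var(var, positive):
    bounds = var["lower"] if positive else var["upper"]
    if not bounds:
        return "a"                      # unconstrained: stays a type variable
    return (" | " if positive else " & ").join(sorted(bounds))

v = {"lower": {"int", "string"}, "upper": {"string"}}
print(display_var(v, positive=True))    # int | string  (what flows out)
print(display_var(v, positive=False))   # string        (what may flow in)
```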
Phase 6: Output
The CLI prints binding types and the root expression type. The LSP serves the InferenceResult to power hover, completions, diagnostics, inlay hints, and other features.
Cross-file inference
When a file contains import ./other.nix, tix resolves it demand-driven:
- Import scanning — `scan_literal_imports()` finds literal `import <path>` patterns. Dynamic imports (where the path is computed) remain unconstrained.
- Demand-driven analysis — an `InferenceCoordinator` manages concurrent file inference. When file A imports file B, B is inferred first (with cycle detection). The coordinator handles parallelism via rayon for batch project checking (`tix check`).
- Type integration — the imported file’s root `OutputTy` is wrapped in a `Ty::Frozen(Arc<OutputTy>)` — a single TyId that lazily materializes fields on demand. When the importer accesses `lib.strings`, only that field’s type is interned; the other 493 fields remain frozen. This prevents O(N) TyId allocations for large imports. For `callPackage ./file.nix {}` patterns, tix recognizes the convention and peels the outer lambda layer.
Files outside the project scope (e.g. transitive nixpkgs imports) get ⊤ — inference stays local to the project boundary.
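The lazy-materialization idea behind frozen import types can be sketched as follows (illustrative Python; the class name and field shapes are assumptions, not tix's API):

```python
# A "frozen" import type: the full imported attrset type is kept behind a
# wrapper, and only the fields the importer actually accesses are interned.
class Frozen:
    def __init__(self, fields):
        self._fields = fields       # full OutputTy of the imported file
        self.materialized = {}      # fields interned on demand

    def project(self, name):
        """Look up one field, interning its type only on first access."""
        if name not in self.materialized:
            self.materialized[name] = self._fields[name]
        return self.materialized[name]

lib = Frozen({"strings": "Strings", "lists": "Lists"})
lib.project("strings")
assert set(lib.materialized) == {"strings"}   # "lists" stays frozen
```

The payoff is that importing a huge attrset costs one wrapper allocation, not one allocation per field.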
Layered inference in tix check
In batch mode (tix check), files are inferred in topological layers based on their import dependencies:
- Import scanning — during Phase 1 (sequential prepare), each file’s import targets are scanned to build a file-level dependency graph.
- SCC computation + layering — Tarjan’s algorithm computes strongly-connected components (SCCs), then a condensation DAG is topologically sorted into layers. Layer 0 contains leaf files (no in-project dependencies); each subsequent layer depends only on prior layers.
- Layer-by-layer inference — files within each layer run in parallel via rayon. Dependencies from prior layers have their signatures cached in the `InferenceCoordinator`, so imports resolve to real types instead of `⊤`. Files within the same SCC (mutual imports) get `⊤` for intra-SCC imports.
- Reference-counted eviction — after each layer, signatures whose importers have all been processed are evicted from the cache, keeping memory bounded to the dependency “frontier” rather than the entire project.
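The layer assignment itself reduces to a simple recurrence over the (already acyclic) condensation: a leaf is layer 0, and every other file sits one layer above its deepest dependency. A minimal sketch, with made-up file names as assumptions:

```python
from functools import lru_cache

# Toy file-level import graph, assumed already condensed to a DAG.
imports = {
    "app.nix": {"lib.nix", "util.nix"},
    "util.nix": {"lib.nix"},
    "lib.nix": set(),
}

def layers(graph):
    """Group files into layers: layer(f) = 0 for leaves, else 1 + max over deps."""
    @lru_cache(maxsize=None)
    def layer(f):
        return 0 if not graph[f] else 1 + max(layer(d) for d in graph[f])

    out = {}
    for f in graph:
        out.setdefault(layer(f), []).append(f)
    return out

print(layers(imports))   # {2: ['app.nix'], 1: ['util.nix'], 0: ['lib.nix']}
```

Files sharing a layer have no edges between them, which is exactly what makes intra-layer parallelism safe.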
Stub integration
.tix stub files provide types for code that can’t be inferred (nixpkgs lib, etc.). The TypeAliasRegistry in aliases.rs loads stubs from three sources:
- Built-in stubs (`stubs/lib.tix`) — shipped with tix, covering core nixpkgs lib functions.
- Project stubs — loaded from `--stubs` CLI flags or `tix.toml` config.
- Inline aliases — `type Foo = ...;` declarations in doc comments, merged into the registry during inference.
Top-level val declarations (e.g. val mkDerivation :: ...) provide types for unresolved names automatically — no annotation needed. Module blocks (module lib { ... }) auto-generate a capitalized type alias (Lib) for use in doc-comment annotations.
Type theory background
What SimpleSub gives us
Most type inference algorithms make you choose: you can have subtyping (like TypeScript, where int is assignable to int | string) or you can have full inference (like ML/Haskell, where the compiler figures out all the types). SimpleSub gets both.
Concretely, tix’s type system provides:
- Type inference — types are inferred from usage, not declared. You write `x: !x` and tix infers `bool -> bool`.
- Subtyping — a `{ name: string, age: int }` can be passed where `{ name: string, ... }` is expected. A function returning `int` can be used where `int | string` is expected. Types have a natural “is-a” relationship.
- Parametric polymorphism — a function like `id = x: x` gets a generic type `a -> a` that works for any type, not a single concrete type.
- Let generalization — each `let` binding gets its own polymorphic type, so `id` can be applied to both `int` and `string` in the same scope without conflict.
- Union and intersection types — `if cond then 1 else "hi"` is `int | string` (a union). A function parameter constrained to be both a number and a string gets an intersection type, which simplifies to `never` (uninhabited) — indicating a type error.
- Row polymorphism — `getName = x: x.name` accepts any attrset with a `name` field. The type is `{ name: a, ... } -> a` — the `...` means “other fields are allowed.”
The key insight of SimpleSub is that subtyping constraints can be recorded as bounds on type variables (lower bounds for what flows in, upper bounds for what flows out) and resolved lazily during canonicalization. This avoids the complexity of traditional constraint solvers while keeping inference complete.
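The bounds-recording idea can be shown in miniature (illustrative Python; dict-based variables, string base types, and the omission of variable-to-variable linking are all simplifying assumptions, not tix's implementation):

```python
# SimpleSub-style constraint propagation in miniature: variables store sets
# of bounds, and constrain(sub, sup) records a bound then transitively
# re-checks the bounds already accumulated on the other side.
def fresh():
    return {"lower": set(), "upper": set()}

def constrain(sub, sup):
    if isinstance(sub, dict):               # variable <: type: new upper bound
        if sup not in sub["upper"]:
            sub["upper"].add(sup)
            for lo in list(sub["lower"]):   # everything that flowed in must fit
                constrain(lo, sup)
    elif isinstance(sup, dict):             # type <: variable: new lower bound
        if sub not in sup["lower"]:
            sup["lower"].add(sub)
            for hi in list(sup["upper"]):
                constrain(sub, hi)
    elif sub != sup:                        # two base types: compare directly
        raise TypeError(f"{sub} is not a subtype of {sup}")

a = fresh()
constrain("int", a)     # an int literal flows into the variable
constrain(a, "int")     # the variable is used where an int is expected: ok
```

An error surfaces only when two incompatible base types meet through the same variable — e.g. constraining a variable above `int` and then flowing a `string` into it.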
Tix extends SimpleSub with Boolean-Algebraic Subtyping (BAS) from Chau & Parreaux (POPL 2026), which adds negation types (~null, ~string) for type narrowing in conditional branches.
For the full theory, see Parreaux’s The Simple Essence of Algebraic Subtyping (ICFP 2020).
Key design decisions
- Bounds-based variables, not union-find: type variables store upper/lower bounds; `constrain(sub, sup)` propagates bounds inline (no separate solve phase).
- Extrude replaces instantiate/generalize: deep-level variables are copied to fresh variables at the current level with bounds linked via subtyping constraints. This is SimpleSub’s key insight — it replaces the traditional Hindley-Milner generalize/instantiate pair with a single operation.
- Two type representations: `Ty<R, VarType>` during inference (includes `Neg`, `Inter`, `Union` for narrowing); `OutputTy` after canonicalization (has Union/Intersection/Neg).
- Polarity-aware canonicalization: positive positions expand to union of lower bounds; negative positions expand to intersection of upper bounds.
- SCC grouping: mutually recursive bindings are grouped into strongly connected components and inferred together. Each SCC is fully inferred before moving to the next.
- Deferred overload resolution: operators like `+` are resolved after the SCC group is fully inferred, when more type information is available.
- Salsa for incremental computation (query caching in the LSP).
How narrowing works
Narrowing uses first-class intersection types during inference (following the MLstruct approach from OOPSLA 2022). When isString x is the condition:
- Then-branch: `x` gets type `α ∧ string` (an intersection of the original type variable with `string`)
- Else-branch: `x` gets type `α ∧ ~string` (intersection with negation)
These intersection types are structural — they flow through constraints, extrusion, and generalization like any other type. This means narrowing information survives let-polymorphism:
let f = x: if isNull x then 0 else x; in f
# f :: a -> int | ~null
# The ~null constraint on the else-branch's x is preserved
When a narrowed type like α ∧ ~null flows into a function that expects string, the solver applies variable isolation (the “annoying” constraint decomposition from MLstruct): α ∧ ~null <: string becomes α <: string | null, correctly constraining α without losing the negation information.
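At the display level, the rewrite is just moving the negated part across the subtyping relation. A deliberately tiny sketch (string manipulation only — an assumption for illustration, not the solver's actual machinery):

```python
# Variable isolation, display-level: the constraint (a ∧ ~N) <: S
# is rewritten to  a <: S | N, keeping the negation information.
def isolate(var, negated, expected):
    """Return the rewritten constraint as a (sub, sup) pair of strings."""
    return (var, f"{expected} | {negated}")

print(isolate("a", "null", "string"))   # ('a', 'string | null')
```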
Negation normalization
Negation types are normalized during canonicalization using standard Boolean algebra rules:
- Double negation: `~~T` simplifies to `T`
- De Morgan (union): `~(A | B)` becomes `~A & ~B`
- De Morgan (intersection): `~(A & B)` becomes `~A | ~B`
- Contradiction: `T & ~T` or `string & int` in an intersection is detected as uninhabited and displayed as `never`
- Tautology: `T | ~T` in a union is detected as universal and simplifies to `any` (the top type)
- Redundant negation: `{name: string} & ~null` simplifies to `{name: string}` (attrsets are inherently non-null)
- Union absorption: `{...} | {x: int, ...}` simplifies to `{...}` — an open attrset with fewer required fields subsumes more specific open attrsets
- Intersection factoring: `(A | C) & (B | C)` simplifies to `C | (A & B)` — shared members across all union terms are factored out using the distributive law
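A few of these rules — double negation, De Morgan, contradiction, and tautology — can be exercised on a toy tuple-encoded type AST (an assumed encoding for illustration; tix's normalizer works over its own type representation and handles the full rule set):

```python
# Toy negation normalizer. Types are strings ("int") or tuples:
# ("neg", t), ("union", a, b), ("inter", a, b).
def norm(t):
    if isinstance(t, str):
        return t
    tag = t[0]
    if tag == "neg":
        inner = t[1]
        if isinstance(inner, tuple) and inner[0] == "neg":
            return norm(inner[1])                                   # ~~T = T
        if isinstance(inner, tuple) and inner[0] == "union":        # De Morgan
            return norm(("inter", ("neg", inner[1]), ("neg", inner[2])))
        if isinstance(inner, tuple) and inner[0] == "inter":        # De Morgan
            return norm(("union", ("neg", inner[1]), ("neg", inner[2])))
        return ("neg", norm(inner))
    a, b = norm(t[1]), norm(t[2])
    if tag == "inter" and (a == ("neg", b) or b == ("neg", a)):
        return "never"                                              # T & ~T
    if tag == "union" and (a == ("neg", b) or b == ("neg", a)):
        return "any"                                                # T | ~T
    return (tag, a, b)

print(norm(("neg", ("neg", "int"))))                 # int
print(norm(("inter", "string", ("neg", "string"))))  # never
```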
LSP architecture
Event coalescing
Instead of per-file timer debouncing, the LSP uses an event-coalescing architecture inspired by rust-analyzer. didChange and didOpen notifications send events to a single analysis loop. The loop drains all pending events before starting analysis, naturally batching rapid edits without artificial delays. Diagnostic publication is deferred behind a 200ms quiescence timer to prevent flickering during rapid typing, but analysis results are available to interactive requests (hover, completion) immediately.
Completion works responsively during editing. When a completion request arrives before the latest analysis finishes, the server first tries full completion against the fresh parse tree. If that fails, it falls back to a syntax-only path that provides both dot completion (via name-text lookup against the stale analysis) and identifier completion (variable names from the scope chain).
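The drain-then-analyze pattern can be sketched with a plain queue (illustrative only — the real server's event types and loop structure are more involved):

```python
import queue

# Simulated notification stream: three rapid edits to a.nix, one to b.nix.
events = queue.Queue()
for ev in [("didChange", "a.nix", 1), ("didChange", "a.nix", 2),
           ("didChange", "b.nix", 1), ("didChange", "a.nix", 3)]:
    events.put(ev)

def drain(q):
    """Drain every pending event, keeping only the latest version per file."""
    latest = {}
    while True:
        try:
            kind, file, version = q.get_nowait()
        except queue.Empty:
            return latest
        latest[file] = version          # later events supersede earlier ones

batch = drain(events)
print(batch)   # {'a.nix': 3, 'b.nix': 1} — three a.nix edits coalesced into one
```

One analysis pass then runs over `batch`; because batching falls out of draining, no artificial debounce delay is needed before analysis starts.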
Cancellation
When a new edit arrives for a file that’s currently being analyzed, the in-flight analysis is cancelled via a cooperative cancellation flag. The inference engine checks this flag between SCC groups and periodically during constraint propagation, so cancellation typically takes effect within milliseconds.
Key source files
| File | Role |
|---|---|
| `lang_ast/src/lib.rs` | Module, Expr, AST arena types |
| `lang_ast/src/lower.rs` | rnix CST → Tix AST lowering |
| `lang_ast/src/nameres.rs` | Scope analysis, name resolution, SCC grouping |
| `lang_ast/src/narrow.rs` | Guard recognition, NarrowPredicate enum |
| `lang_ty/src/lib.rs` | `Ty<R, VarType>` and `OutputTy` type definitions |
| `comment_parser/src/tix_decl.pest` | `.tix` file grammar |
| `lang_check/src/lib.rs` | `check_file` entry point, InferenceResult |
| `lang_check/src/infer.rs` | Orchestration, SCC iteration, extrude, generalization |
| `lang_check/src/infer_expr.rs` | Single-pass AST inference walk |
| `lang_check/src/constrain.rs` | Core subtyping constraint function |
| `lang_check/src/collect.rs` | Canonicalization from bounds to OutputTy |
| `lang_check/src/storage.rs` | Bounds-based type variable storage |
| `lang_check/src/builtins.rs` | Nix builtin type synthesis |
| `lang_check/src/aliases.rs` | TypeAliasRegistry (loads stubs, resolves aliases) |
| `lang_check/src/imports.rs` | Import scanning, demand-driven cross-file resolution |
| `lang_check/src/coordinator.rs` | Concurrent multi-file inference coordinator |
References
- Parreaux, The Simple Essence of Algebraic Subtyping (ICFP 2020) — the core type system
- Parreaux & Chau, MLstruct (OOPSLA 2022) — negation types and pattern matching
- Chau & Parreaux, Simple Essence of Boolean-Algebraic Subtyping (POPL 2026) — BAS reference implementation
- See `docs/internal/narrowing-design.md` for the full narrowing design rationale