PixiJS renderers draw your scene to a canvas using WebGL/WebGL2, WebGPU, or the Canvas 2D API. They're GPU-accelerated engines composed of modular systems that manage texture uploads, rendering pipelines, and more.
All renderers inherit from a common base, providing consistent methods like .render(), .resize(), and .clear(), along with shared systems for canvas management, texture GC, and events.
| Renderer | Description | Status |
|---|---|---|
| WebGLRenderer | Default renderer using WebGL/WebGL2. Stable and widely supported. | Recommended |
| WebGPURenderer | Uses the WebGPU API. Faster in many cases, still maturing. | Experimental |
| CanvasRenderer | Fallback renderer using the HTML Canvas 2D context. | Experimental |
The WebGPU renderer is feature-complete, but inconsistencies in browser implementations may cause unexpected behavior. Use the WebGL renderer for production applications.
Use autoDetectRenderer() to pick the best renderer for the current environment:
```js
import { autoDetectRenderer } from 'pixi.js';

const renderer = await autoDetectRenderer({
  preference: 'webgpu', // or 'webgl' or 'canvas'
});
```
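autoDetectRenderer() is designed to fall back on its own when the preferred backend is unavailable, so most code can simply pass `preference: 'webgpu'`. If you want to inspect availability up front, a minimal sketch (`pickPreference` is our own helper, not a PixiJS API):

```js
// Hypothetical helper (not part of PixiJS): prefer WebGPU only when the
// browser actually exposes the WebGPU API, otherwise ask for WebGL.
function pickPreference() {
  const hasWebGPU = typeof navigator !== 'undefined' && 'gpu' in navigator;
  return hasWebGPU ? 'webgpu' : 'webgl';
}

// Usage sketch:
// const renderer = await autoDetectRenderer({ preference: pickPreference() });
```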
Or construct one directly when you need a specific renderer type (e.g., for testing or when you know the target environment):
```js
import { WebGLRenderer } from 'pixi.js';

const renderer = new WebGLRenderer();
await renderer.init(options);
```
Most applications should use autoDetectRenderer() and let PixiJS pick the best backend. Use direct construction only when you have a specific reason.
Call render() with a Container to draw it to the screen:
```js
import { Container } from 'pixi.js';

const container = new Container();
renderer.render(container);
```
You can also pass an options object for more control:
```js
import { Matrix } from 'pixi.js';

renderer.render({
  container: myContainer,
  clear: true,
  transform: new Matrix(),
});
```
The container property is the scene root to draw. target is a separate property that specifies a render destination (e.g., a RenderTexture).
When rendering to a texture-backed target, you can specify mipLevel to render into a specific mip level of the target's underlying texture storage. Most applications won't need this; it's useful for custom LOD (level of detail) systems or manual mipmap generation.
```js
import { RenderTexture } from 'pixi.js';

const rt = RenderTexture.create({
  width: 256,
  height: 256,
  mipLevelCount: 4,
  autoGenerateMipmaps: false,
});

// Render into mip 1 (128x128)
renderer.render({
  container,
  target: rt,
  mipLevel: 1,
});
```
If your target is a Texture with a frame (e.g. an atlas sub-texture), that frame is interpreted in mip 0 pixel space and is scaled/clamped when rendering to mipLevel > 0.
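For intuition, the mapping halves the frame once per mip level. This illustrative helper shows the idea (it is not a PixiJS API and not the exact internal code — just the scale-and-clamp behavior described above):

```js
// Illustrative only (not part of PixiJS): map a mip-0 frame to a higher
// mip level. Coordinates halve per level; the size is clamped so it
// never drops below one pixel.
function frameAtMip(frame, mipLevel) {
  const scale = 1 / (1 << mipLevel);
  return {
    x: Math.floor(frame.x * scale),
    y: Math.floor(frame.y * scale),
    width: Math.max(1, Math.round(frame.width * scale)),
    height: Math.max(1, Math.round(frame.height * scale)),
  };
}
```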
Use resize() to change the size of the renderer and its canvas:

```js
renderer.resize(window.innerWidth, window.innerHeight);
```
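To keep the canvas matched to the window, forward the window's resize event to resize(). A minimal sketch assuming a browser environment; `resizeToWindow` is our own helper, not a PixiJS API:

```js
// Sketch: forward the current window size to the renderer.
// resizeToWindow is a hypothetical helper, not part of PixiJS.
function resizeToWindow(renderer) {
  renderer.resize(window.innerWidth, window.innerHeight);
}

if (typeof window !== 'undefined') {
  window.addEventListener('resize', () => resizeToWindow(renderer));
  resizeToWindow(renderer); // size once at startup
}
```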
Create textures from any display object with generateTexture():
```js
import { Sprite } from 'pixi.js';

const sprite = new Sprite();
const texture = renderer.generateTexture(sprite);
```
When mixing PixiJS with other WebGL/WebGPU libraries (e.g., Three.js), each library may leave GPU state (bound textures, blend modes, active shaders) that conflicts with the other. Call resetState() before each library renders to avoid visual glitches, missing objects, or incorrect blending:
```js
function render() {
  threeRenderer.resetState();
  threeRenderer.render(scene, camera);

  pixiRenderer.resetState();
  pixiRenderer.render({ container: stage });

  requestAnimationFrame(render);
}

requestAnimationFrame(render);
```