HTML in WebGPU Shaders in Canvas

12 April 2026

Updated: 12 April 2026

I made some minor updates to my Shader Web Component to add support for the proposed HTML-in-Canvas API. So you can see the following in a Chromium-based browser with the flag chrome://flags/#canvas-draw-element enabled:

Hello from HTML!

How Does It Work?

The HTML in the above example is rendered within the canvas and, if/when supported, is exposed to the accessibility tree.

As per the documentation, this makes use of the following:

  1. The layoutsubtree attribute on the canvas element, which tells the browser that we want to render the canvas contents
  2. The copyElementImageToTexture function on the WebGPU GPUDevice.queue, which copies the rendered element into a texture
  3. The canvas.onpaint handler, used to run copyElementImageToTexture with the contents
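The three pieces above can be wired together roughly as follows. This is a sketch: copyElementImageToTexture follows the proposal's naming as used in this post, but the exact argument shape is my assumption, not a settled API, and the helper name is mine.

```javascript
// Sketch: wire the canvas's layoutsubtree child into a WebGPU texture.
// The copyElementImageToTexture argument shape here is an assumption.
function attachPaintHandler(canvas, device, texture) {
  const element = canvas.firstElementChild; // the layoutsubtree content
  canvas.onpaint = () => {
    // re-copy the rendered HTML into the GPU texture on every repaint
    device.queue.copyElementImageToTexture(
      { element },
      { texture },
      [canvas.width, canvas.height]
    );
  };
}
```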

Usability Issues

The proposal is still just that, a proposal, so it has some issues and is still very experimental.

The above example should show the inner content if supported. I’ve noticed some quirks around how content focus and similar interactions work: the click targets aren’t aligned as expected, which creates some weird usability issues.

Some challenges I see here are around keeping positions in sync, so that interactive elements remain correctly interactive and the basic parts of the web continue to work as users expect.

The example above originally suffered from the fact that the size of the canvas content didn’t match the canvas itself, so things were not quite in the right place. Matt Rothenberg also points this out in their HTML in Canvas post, and while it’s definitely doable, it does require a fair amount of wrangling to make work nicely.

I’ve solved the resizing issue in the above example using CSS, but it’s also possible to use the drawElementImage function from the API to do the necessary alignment.
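For the CSS route, the core of the fix is keeping the canvas backing store in sync with its layout box so the copied HTML lines up with the live click targets underneath. A minimal sketch (the helper name is mine):

```javascript
// Sketch: size the canvas backing store to match its CSS layout box.
// Call this from e.g. a ResizeObserver callback on the canvas.
function syncCanvasSize(canvas, cssWidth, cssHeight, dpr = 1) {
  // round to whole device pixels to avoid a blurry, misaligned copy
  const width = Math.round(cssWidth * dpr);
  const height = Math.round(cssHeight * dpr);
  if (canvas.width !== width) canvas.width = width;
  if (canvas.height !== height) canvas.height = height;
  return { width, height };
}
```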

Technical Challenges

I also ran into some VERY annoying bugs (or perhaps this is just how WebGPU is meant to work, I don’t know) along the way. The main one was to do with figuring out how to use the GPU bindings.

In particular, I saw the following error about a million times:

```
In entries[0], binding index 0 not present in the bind group layout.
Expected layout: []
- While validating [BindGroupDescriptor "textures"] against [BindGroupLayout (unlabeled)]
- While calling [Device].CreateBindGroup([BindGroupDescriptor "textures"]).
```

This was because the bindings in my shader related to the textures were not used:

```wgsl
// unused
@group(0) @binding(0) var htmlSampler: sampler;
@group(0) @binding(1) var htmlTexture: texture_2d<f32>;

// used
@group(1) @binding(0) var<uniform> uTime: f32;
```

The respective texture and sampler were not used, so the auto layout for the shader doesn’t seem to detect that they exist. I find this very confusing, and the error here could be better.

There’s a good example of explicitly defining the layouts in this WebGPU Bind Group Layouts article, along with a handy calculator for generating the bind group layout config. This ended up being how I realized the bind group was being left out: the entries in @group(0) were being dropped for some reason, even though their example, when updated to use “my” bindings for the texture and sampler, worked just fine.
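For reference, an explicit layout matching @group(0) in the shader above would look something like this. It uses the standard WebGPU createBindGroupLayout descriptor shape; the function name is mine, and GPUShaderStage is passed in only so the snippet doesn't depend on a browser global.

```javascript
// Sketch: an explicit bind group layout descriptor for @group(0), so the
// layout exists even when the auto layout would optimise the unused
// texture bindings away.
function htmlTextureLayout(GPUShaderStage) {
  return {
    label: "textures",
    entries: [
      { binding: 0, visibility: GPUShaderStage.FRAGMENT, sampler: {} },
      { binding: 1, visibility: GPUShaderStage.FRAGMENT, texture: {} },
    ],
  };
}

// in the browser:
// const layout = device.createBindGroupLayout(htmlTextureLayout(GPUShaderStage));
```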

Anyways, ensuring that the sampler and texture are used meant that everything worked fine again. Conditionally adding the bind group also works; it adds some ugliness to my shader rendering code, but I think that’s fine for what it makes possible.
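A crude version of the conditional check could look like this: detect whether a binding identifier is referenced anywhere beyond its declaration line, and only create and set the bind group when it is. The helper name and the line-based heuristic are mine, not part of any WebGPU API.

```javascript
// Sketch: heuristic check for whether a WGSL binding is actually used,
// so the texture bind group is only created/set when the shader samples
// from it. Declaration lines are identified by their @binding attribute.
function bindingIsUsed(wgslSource, name) {
  return wgslSource
    .split("\n")
    .filter((line) => !line.includes("@binding")) // skip declaration lines
    .some((line) => line.includes(name));
}
```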

The snippet for the above example can be seen below:

```html
<site-shader-canvas>
  <canvas width="1000" height="1000" style="width: 100%" layoutsubtree>
    <div class="html-in-canvas" style="width: 100%; height: 100%; color: green; display: flex; flex-direction: column; align-items: center; justify-content: center; gap: 8px;">
      <h2 style="margin: 0;">Hello from HTML!</h2>

      <label for="input-in-canvas">Input in canvas</label>
      <input style="display: block;" id="input-in-canvas" />
    </div>
  </canvas>

  <script type="text/wgsl">
    struct VertexOutput {
      @builtin(position) position: vec4f,
      @location(0) texcoord: vec2f,
    };

    @vertex fn vs(
      @builtin(vertex_index) vertexIndex : u32
    ) -> VertexOutput {
      const pos = array(
        vec2( 1.0,  1.0),
        vec2( 1.0, -1.0),
        vec2(-1.0, -1.0),
        vec2( 1.0,  1.0),
        vec2(-1.0, -1.0),
        vec2(-1.0,  1.0),
      );

      var vsOutput: VertexOutput;

      vsOutput.texcoord = pos[vertexIndex] * vec2f(0.5, 0.5) + vec2f(0.5);
      vsOutput.position = vec4f(pos[vertexIndex], 0, 1);

      return vsOutput;
    }

    @group(0) @binding(0) var htmlSampler: sampler;
    @group(0) @binding(1) var htmlTexture: texture_2d<f32>;

    @group(1) @binding(0) var<uniform> uTime: f32;

    fn flipSample(fsInput: VertexOutput) -> vec4f {
      // the texture must be flipped since the shader coord system is the
      // opposite of the HTML one
      var pos = vec2f(fsInput.texcoord.x, 1 - fsInput.texcoord.y);
      return textureSample(htmlTexture, htmlSampler, pos);
    }

    @fragment fn fs(fsInput: VertexOutput) -> @location(0) vec4f {
      var red = abs(sin(uTime / 10.0)) * fsInput.texcoord.x;
      var blue = abs(cos(uTime / 5.0)) * fsInput.texcoord.y;

      return vec4f(red, 0.0, blue, 1.0) + flipSample(fsInput);
    }
  </script>
</site-shader-canvas>
```

You can also take a look at the source of the web component and the respective shader renderer if you’re interested.