Async TUIs using Bubble Tea
26 March 2026
Updated: 26 March 2026
Assumed audience: UI developers, people who program in Go, or anyone generally interested in making computers do things using code
There was a little bug I’d run into a while back that hadn’t been important enough to fix - until yesterday, when it started to slow me down
On Tri - a TUI app I built that’s something like if tree were searchable and had previews like fzf - I (knowingly) didn’t implement async tasks upfront. At the time I was mostly focused on getting the implementation to a good level of UX and free of bugs. As a result, there was a clear slowness when navigating the UI while slow tasks were running, such as a git diff in a large repository
I sat down yesterday to make the app run previews in the background - this turned out to be really easy, and I thought I’d write about why that was the case
Most TUIs I write in Go use the excellent suite of libraries by Charm - and in this particular case, the Bubble Tea TUI framework
Elm Architecture
The Bubble Tea framework is based on the Elm Architecture, a functional-style pattern for building UIs. I think understanding the Elm Architecture is generally useful, and the documentation is worth a read for developers building any kind of user interface (even if it’s not in Elm)
The core idea is this:
- All UI flows from a Model
- Messages are used to perform Updates on the Model
- A View converts the Model into UI
This is also known as the MVU pattern (Model -> View -> Update)
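To make the pattern concrete before bringing in Bubble Tea, here’s a minimal framework-free MVU loop in Go - all names here are illustrative, not Bubble Tea APIs:

```go
package main

import "fmt"

// msg is anything that can trigger an update
type msg interface{}

// pressSpace is a message emitted when the user presses space
type pressSpace struct{}

// model holds all application state
type model struct {
	counter int
}

// update produces a new model from the current model and a message
func update(m model, message msg) model {
	switch message.(type) {
	case pressSpace:
		m.counter++
	}
	return m
}

// view renders the model as a string
func view(m model) string {
	return fmt.Sprintf("count = %d", m.counter)
}

func main() {
	m := model{}
	// simulate two space presses flowing through the loop
	for _, message := range []msg{pressSpace{}, pressSpace{}} {
		m = update(m, message)
	}
	fmt.Println(view(m)) // prints "count = 2"
}
```

The whole app is just these three pieces plus a loop that feeds messages through them - which is exactly what a framework like Bubble Tea runs on our behalf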
Using this pattern, we can build a simple implementation of an app that has two bits of independent UI - a counter that increments when the user presses space, and a task runner that runs some heavy tasks triggered by pressing enter
A Sad Implementation
A naive implementation of this using Bubble Tea has the following bits that matter for discussion:
In the Update function, when we press space we increment the counter, and when we press enter we run some tasks:
```go
func (m model) Update(msg tea.Msg) (tea.Model, tea.Cmd) {
	switch msg := msg.(type) {

	case tea.KeyPressMsg:
		switch msg.String() {
		case "ctrl+c", "q":
			return m, tea.Quit

		case "space":
			m.counter++
			return m, nil

		case "enter":
			m.running = true
			m.tasks = doTasks()
			return m, nil
		}
	}

	return m, nil
}
```

This method then updates the model and returns the updated model - for context, the doTasks function looks like so:
```go
func doHeavyWork() {
	t := rand.IntN(5)

	// irl we'd do something other than sleep
	time.Sleep(time.Duration(t) * time.Second)
}

func doTasks() []task {
	var tasks []task

	for i := range 10 {
		doHeavyWork()
		tasks = append(tasks, task{i, true})
	}

	return tasks
}
```

We can see this running below:

The problem with the above implementation is twofold:
- Bad performance - The UI is blocked while the tasks run, so the counter doesn’t update until every task has finished
- Sad UX - There’s no way to show the progress of an in-progress task; it would be nice not to have to wait for everything at once
You can see the full V1 implementation if you'd like
```go
package v1

import (
	"fmt"
	"os"
	"time"

	tea "charm.land/bubbletea/v2"
	rand "math/rand/v2"
)

type task struct {
	index int
	done  bool
}

type model struct {
	counter int
	running bool
	tasks   []task
}

func (t task) string() string {
	status := "busy"
	if t.done {
		status = "done"
	}

	return fmt.Sprintf("Task %d [%s]", t.index, status)
}

func sleepRandomly() {
	t := rand.IntN(5)
	time.Sleep(time.Duration(t) * time.Second)
}

func doTasks() []task {
	var tasks []task

	for i := range 10 {
		sleepRandomly()
		tasks = append(tasks, task{i, true})
	}

	return tasks
}

func initialModel() model {
	return model{
		counter: 0,
		running: false,
		tasks:   []task{},
	}
}

func (m model) Init() tea.Cmd {
	return nil
}

func (m model) Update(msg tea.Msg) (tea.Model, tea.Cmd) {
	switch msg := msg.(type) {

	case tea.KeyPressMsg:
		switch msg.String() {
		case "ctrl+c", "q":
			return m, tea.Quit

		case "space":
			m.counter++
			return m, nil

		case "enter":
			m.running = true
			m.tasks = doTasks()
			return m, nil
		}
	}

	return m, nil
}

func (m model) View() tea.View {
	count := fmt.Sprintf("count = %d", m.counter)
	if !m.running {
		return tea.NewView(count + "\nPress space to increment counter\nPress enter to start tasks")
	}

	tasks := ""
	done := true
	for _, t := range m.tasks {
		if !t.done {
			done = false
		}
		tasks += "\n" + t.string()
	}

	title := "Running Tasks"
	if done {
		title = "All done"
	}

	return tea.NewView(count + "\n" + title + "\n" + tasks)
}

func Run() {
	p := tea.NewProgram(initialModel())
	if _, err := p.Run(); err != nil {
		fmt.Printf("Alas, there's been an error: %v", err)
		os.Exit(1)
	}
}
```

A Happy Implementation
The solution that’s provided by Bubble Tea is to move the IO-based work into a Command. A command is used to make things async, and it’s run by the framework on our behalf
A command looks like so:
```go
// its type is tea.Cmd
var cmd tea.Cmd

// its value is a function that returns tea.Msg
cmd = func() tea.Msg {
	return SomeMessage{}
}
```

So in order to make our work async, we simply need to return a tea.Cmd from our Update function instead of actually doing all the work there
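Stripped of the framework, a command is just "a function that returns a message" - describing the work is separate from running it. A self-contained sketch, where Cmd and Msg are local stand-ins for tea.Cmd and tea.Msg:

```go
package main

import "fmt"

// Msg is any value delivered back to the update loop
type Msg interface{}

// Cmd is deferred work: calling it performs the work and yields a message
type Cmd func() Msg

type workDoneMsg struct{ result int }

// makeCmd describes some work without running it
func makeCmd() Cmd {
	return func() Msg {
		return workDoneMsg{result: 42}
	}
}

func main() {
	// building the command does not run the work...
	cmd := makeCmd()

	// ...the runtime decides when (and on which goroutine) to call it
	msg := cmd()
	fmt.Println(msg.(workDoneMsg).result) // prints 42
}
```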
Instead of defining a function that does the work, we can define one that returns a tea.Cmd that will do the work:
```go
type taskDoneMsg struct {
	index int
}

func makeTasks() ([]task, []tea.Cmd) {
	var cmds []tea.Cmd
	var tasks []task

	for i := range 10 {
		tasks = append(tasks, task{i, false})
		cmds = append(cmds, func() tea.Msg {
			doHeavyWork()
			return taskDoneMsg{i}
		})
	}

	return tasks, cmds
}
```

This will offload the work and we’ll receive a taskDoneMsg message when the work is done. This also has a nice side effect - by decoupling the creation of a task from its actual execution, we can now track the status of each task as it completes
We can do that in the Update function by handling the taskDoneMsg message, as well as returning the []tea.Cmd that comes from the makeTasks function instead of actually doing the work upfront:
```go
func (m model) Update(msg tea.Msg) (tea.Model, tea.Cmd) {
	switch msg := msg.(type) {

	// handle updating the model when a task is done
	case taskDoneMsg:
		m.tasks[msg.index].done = true
		return m, nil

	case tea.KeyPressMsg:
		switch msg.String() {
		case "ctrl+c", "q":
			return m, tea.Quit

		case "space":
			m.counter++
			return m, nil

		case "enter":
			m.running = true
			tasks, cmds := makeTasks()
			m.tasks = tasks

			// batch the new cmds for bubbletea to handle
			return m, tea.Batch(cmds...)
		}
	}

	return m, nil
}
```

And with that, we’ve now got a responsive UI that lets the counter work even while the tasks are running, and makes it possible for us to track task state:

You can see the full V2 implementation if you'd like
```go
package v2

import (
	"fmt"
	"os"
	"time"

	tea "charm.land/bubbletea/v2"
	rand "math/rand/v2"
)

type task struct {
	index int
	done  bool
}

type taskDoneMsg struct {
	index int
}

func (t task) string() string {
	status := "busy"
	if t.done {
		status = "done"
	}

	return fmt.Sprintf("Task %d [%s]", t.index, status)
}

func doHeavyWork() {
	t := rand.IntN(5)
	time.Sleep(time.Duration(t) * time.Second)
}

func makeTasks() ([]task, []tea.Cmd) {
	var cmds []tea.Cmd
	var tasks []task

	for i := range 10 {
		tasks = append(tasks, task{i, false})
		cmds = append(cmds, func() tea.Msg {
			doHeavyWork()
			return taskDoneMsg{i}
		})
	}

	return tasks, cmds
}

type model struct {
	counter int
	running bool
	tasks   []task
}

func initialModel() model {
	return model{
		counter: 0,
		running: false,
		tasks:   []task{},
	}
}

func (m model) Init() tea.Cmd {
	return nil
}

func (m model) Update(msg tea.Msg) (tea.Model, tea.Cmd) {
	switch msg := msg.(type) {

	case taskDoneMsg:
		m.tasks[msg.index].done = true
		return m, nil

	case tea.KeyPressMsg:
		switch msg.String() {
		case "ctrl+c", "q":
			return m, tea.Quit

		case "space":
			m.counter++
			return m, nil

		case "enter":
			m.running = true
			tasks, cmds := makeTasks()
			m.tasks = tasks

			return m, tea.Batch(cmds...)
		}
	}

	return m, nil
}

func (m model) View() tea.View {
	count := fmt.Sprintf("count = %d", m.counter)
	if !m.running {
		return tea.NewView(count + "\nPress space to increment counter\nPress enter to start tasks")
	}

	tasks := ""
	done := true
	for _, t := range m.tasks {
		if !t.done {
			done = false
		}
		tasks += "\n" + t.string()
	}

	title := "Running Tasks"
	if done {
		title = "All done"
	}

	return tea.NewView(count + "\n" + title + "\n" + tasks)
}

func Run() {
	p := tea.NewProgram(initialModel())
	if _, err := p.Run(); err != nil {
		fmt.Printf("Alas, there's been an error: %v", err)
		os.Exit(1)
	}
}
```

Summary
That’s it - no goroutines or channels needed on our end, a pretty good abstraction on the framework’s side, and minimal effort required from us
A small aside: I’ve started working on what will probably be a fairly sizeable side project. Naturally, I’ve gotten absolutely nothing done on it while somehow managing to put together two blog posts, update my photo galleries on my site, and fix a bunch of random things in other random side projects just this week - oh, the power of procrastination