Concurrency in Go: Goroutines & Channels
Concurrency is one of the defining features of Go. Unlike many programming languages where concurrency is complex and error-prone, Go provides a simple and powerful model built around goroutines and channels.
In this chapter, we will explore Go’s concurrency model and develop an intuition for designing concurrent programs.
This chapter represents one of the most important mindset shifts when transitioning from languages such as Python or Java.
1. Concurrency vs Parallelism
Before diving into Go-specific concepts, it is important to understand the distinction between concurrency and parallelism.
Concurrency
Concurrency refers to structuring a program so that multiple tasks can make progress independently.
Tasks may not necessarily execute at the same time. Instead, the system can switch between them efficiently.
Example:
- Handling multiple web requests
- Processing multiple files
- Running background tasks
Parallelism
Parallelism refers to multiple tasks executing simultaneously on multiple CPU cores.
Example:
- Performing mathematical calculations on large datasets
- Image processing
- Machine learning workloads
Key Insight
Concurrency is about structuring a program so that it can handle multiple tasks.
Parallelism is about actually executing tasks at the same time.
A concurrent program can run on a single core by interleaving tasks; parallelism additionally requires multiple cores.
Go’s concurrency primitives allow programs to scale naturally when more CPU cores are available.
2. The Go Concurrency Philosophy
Go’s concurrency model is inspired by Communicating Sequential Processes (CSP).
The core idea is summarized by a famous Go proverb:
“Do not communicate by sharing memory; instead, share memory by communicating.”
Traditional multithreaded programming often relies on:
- Shared memory
- Locks
- Mutexes
- Synchronization primitives
These approaches can lead to:
- Deadlocks
- Race conditions
- Complex debugging
Go encourages a different approach:
- Independent execution units (goroutines)
- Communication through channels
This model significantly reduces complexity.
3. Goroutines
A goroutine is a lightweight thread managed by the Go runtime.
Creating a goroutine is extremely simple.
Example:
go functionName()
This tells the Go runtime to run the function concurrently.
Example
func sayHello() {
    fmt.Println("Hello from goroutine")
}

func main() {
    go sayHello()
    fmt.Println("Main function")
    time.Sleep(100 * time.Millisecond) // give sayHello time to run
}
In this example, sayHello runs concurrently with main. One important caveat: when main returns, the program exits immediately, even if other goroutines are still running. The time.Sleep call crudely keeps main alive long enough for sayHello to print; in real code you would wait using a channel or sync.WaitGroup instead of sleeping.
Goroutines vs Operating System Threads
A common question is: how do goroutines differ from threads?
| Feature | Goroutines | OS Threads |
|---|---|---|
| Managed by | Go runtime | Operating system |
| Memory usage | Very small | Large |
| Startup cost | Extremely low | Expensive |
| Scalability | Hundreds of thousands | Thousands |
Because goroutines are lightweight, Go programs can easily run hundreds of thousands of concurrent tasks.
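To make that scalability claim concrete, here is a minimal sketch (the function name runMany and the goroutine count are illustrative, not from the chapter) that launches 100,000 goroutines and waits for all of them with sync.WaitGroup:

```go
package main

import (
	"fmt"
	"sync"
	"sync/atomic"
)

// runMany launches n goroutines that each atomically increment a shared
// counter, then blocks until every goroutine has called Done.
func runMany(n int) int64 {
	var counter int64
	var wg sync.WaitGroup
	for i := 0; i < n; i++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			atomic.AddInt64(&counter, 1) // atomic increment avoids a data race
		}()
	}
	wg.Wait()
	return counter
}

func main() {
	fmt.Println(runMany(100000)) // → 100000
}
```

On a typical machine this completes in well under a second; the equivalent with 100,000 OS threads would exhaust memory long before finishing.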
4. Goroutines vs Python Threads
For developers coming from Python, goroutines may feel conceptually similar to threads.
However, there are critical differences.
Python Threads
Python uses OS threads.
However, due to the Global Interpreter Lock (GIL):
- Only one thread executes Python bytecode at a time.
- CPU-bound parallelism is limited.
Threads are mostly useful for I/O-bound tasks.
Go Goroutines
Go does not have a GIL.
The Go scheduler efficiently distributes goroutines across available CPU cores.
This means Go programs can achieve true parallelism for CPU-bound workloads.
5. Synchronization Problems
Running multiple tasks simultaneously introduces several potential issues.
Race Conditions
A race condition occurs when multiple goroutines access shared data concurrently and the outcome depends on execution timing.
Example scenario:
Two goroutines increment the same variable.
Without synchronization, the result may be incorrect.
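The scenario above can be sketched as follows; the function name increment is illustrative. The mutex makes the read-modify-write sequence atomic. Remove the Lock/Unlock pair and running with `-race` will report a data race, and the final count will usually be wrong:

```go
package main

import (
	"fmt"
	"sync"
)

// increment bumps a shared counter n times from each of two goroutines.
func increment(n int) int {
	var (
		mu      sync.Mutex
		counter int
		wg      sync.WaitGroup
	)
	for g := 0; g < 2; g++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			for i := 0; i < n; i++ {
				mu.Lock() // guard the shared counter
				counter++
				mu.Unlock()
			}
		}()
	}
	wg.Wait()
	return counter
}

func main() {
	fmt.Println(increment(1000)) // → 2000
}
```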
Deadlocks
A deadlock occurs when goroutines wait indefinitely for each other.
Example:
- Goroutine A waits for Goroutine B
- Goroutine B waits for Goroutine A
Neither proceeds.
These issues are common in traditional concurrent systems.
Go provides tools to mitigate these risks.
6. Channels
Channels are the primary way goroutines communicate.
Channels allow safe communication between goroutines.
A channel can be imagined as a typed pipe through which values can flow.
Creating a Channel
ch := make(chan int)
This creates a channel that can transmit integers.
Sending and Receiving Data
Send
ch <- 10
Receive
value := <-ch
Channels synchronize goroutines by default: make(chan int) creates an unbuffered channel, where every send must meet a matching receive.
If no receiver is ready, the sender waits.
If no sender is ready, the receiver waits.
This built-in synchronization helps prevent race conditions.
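Putting the pieces together, here is a minimal sketch (the function name greet is illustrative) in which one goroutine hands a value to another over an unbuffered channel, with no explicit locks:

```go
package main

import "fmt"

// greet sends one value over an unbuffered channel. The send blocks until
// the receive in the caller runs, so the two goroutines meet at the handoff.
func greet() string {
	ch := make(chan string)
	go func() {
		ch <- "hello from goroutine" // blocks until someone receives
	}()
	return <-ch // blocks until the goroutine sends
}

func main() {
	fmt.Println(greet()) // → hello from goroutine
}
```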
7. Buffered vs Unbuffered Channels
Channels can be buffered or unbuffered.
Unbuffered Channels
ch := make(chan int)
Properties:
- A send blocks until a receiver is ready
- A receive blocks until a sender sends
This creates strong synchronization.
Buffered Channels
ch := make(chan int, 5)
Properties:
- Can hold multiple values
- Sends do not block until the buffer is full
Buffered channels are useful when:
- Producers generate data faster than consumers
- Tasks should not block immediately
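A small sketch of the difference: with a buffer of 3, three sends succeed immediately even though no receiver exists yet. Closing the channel lets range drain the remaining values and then stop:

```go
package main

import "fmt"

func main() {
	ch := make(chan int, 3) // buffer of 3: these sends do not block
	ch <- 1
	ch <- 2
	ch <- 3
	close(ch) // closing lets range drain the buffer and terminate

	for v := range ch {
		fmt.Println(v)
	}
	// a fourth send before any receive would have blocked: the buffer was full
}
```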
8. The Select Statement
When working with multiple channels, Go provides the select statement.
select waits for multiple communication operations.
Example structure:
select {
case value := <-channel1:
    // handle value
case channel2 <- data:
    // send data
default:
    // optional fallback
}
Whichever operation becomes ready first executes; if several are ready at once, one is chosen at random. When a default case is present, it runs immediately if no operation is ready.
This is useful when:
- Coordinating multiple goroutines
- Implementing timeouts
- Handling multiple inputs
9. Concurrency Patterns
Experienced Go developers frequently use reusable concurrency patterns.
These patterns simplify the design of concurrent systems.
Fan-Out Pattern
The fan-out pattern distributes work across multiple goroutines.
Example scenario:
- Processing many files
- Fetching multiple API requests
- Parallel data processing
Workflow:
- Input tasks are generated.
- Multiple workers process tasks concurrently.
- Results are returned.
This significantly improves throughput.
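A minimal fan-out sketch, assuming one goroutine per job (the function name fanOut and the squaring "work" are illustrative):

```go
package main

import "fmt"

// fanOut launches one goroutine per job and gathers the squared results
// over a shared channel. Result order is not guaranteed.
func fanOut(jobs []int) []int {
	out := make(chan int)
	for _, j := range jobs {
		go func(n int) { out <- n * n }(j) // each worker handles one job
	}
	results := make([]int, 0, len(jobs))
	for range jobs {
		results = append(results, <-out) // collect exactly len(jobs) results
	}
	return results
}

func main() {
	fmt.Println(fanOut([]int{1, 2, 3, 4})) // squares of 1..4; order varies
}
```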
Fan-In Pattern
Fan-in collects results from multiple goroutines into a single channel.
Example scenario:
- Multiple workers produce results
- Results are aggregated in one place
Fan-in allows systems to combine parallel results efficiently.
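A fan-in sketch, under the assumption that the producers close their channels when done (merge and gen are illustrative names):

```go
package main

import (
	"fmt"
	"sync"
)

// merge fans in: it forwards values from every input channel onto a single
// output channel, closing it once all inputs are drained.
func merge(inputs ...<-chan int) <-chan int {
	out := make(chan int)
	var wg sync.WaitGroup
	for _, in := range inputs {
		wg.Add(1)
		go func(c <-chan int) {
			defer wg.Done()
			for v := range c {
				out <- v
			}
		}(in)
	}
	go func() {
		wg.Wait() // all forwarders finished
		close(out)
	}()
	return out
}

// gen returns a closed, pre-filled channel to stand in for a producer.
func gen(nums ...int) <-chan int {
	ch := make(chan int, len(nums))
	for _, n := range nums {
		ch <- n
	}
	close(ch)
	return ch
}

func main() {
	total := 0
	for v := range merge(gen(1, 2), gen(3, 4)) {
		total += v
	}
	fmt.Println(total) // → 10
}
```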
Worker Pool Pattern
A worker pool controls concurrency by limiting the number of workers.
This prevents:
- Resource exhaustion
- Too many goroutines
- Uncontrolled parallelism
Typical workflow:
- Job queue channel
- Fixed number of worker goroutines
- Workers process tasks from queue
Worker pools are widely used in:
- Job processing systems
- Message consumers
- API request processing
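The workflow above can be sketched as follows; workerPool is an illustrative name, and doubling each job stands in for real work. Only `workers` goroutines ever exist, no matter how many jobs arrive:

```go
package main

import (
	"fmt"
	"sync"
)

// workerPool runs a fixed number of workers over a job queue channel,
// so at most `workers` jobs are processed concurrently.
func workerPool(jobs []int, workers int) int {
	queue := make(chan int)
	results := make(chan int)
	var wg sync.WaitGroup

	for w := 0; w < workers; w++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			for j := range queue {
				results <- j * 2 // stand-in for real work
			}
		}()
	}
	go func() {
		for _, j := range jobs {
			queue <- j
		}
		close(queue) // no more jobs: workers exit their range loops
	}()
	go func() {
		wg.Wait()
		close(results) // all workers done: stop the collector below
	}()

	sum := 0
	for r := range results {
		sum += r
	}
	return sum
}

func main() {
	fmt.Println(workerPool([]int{1, 2, 3, 4, 5}, 3)) // → 30
}
```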
10. Testing Concurrent Programs
Testing concurrent code requires additional care.
The Go testing framework provides useful tools.
Writing Concurrency-Safe Tests
Tests should verify:
- Correct results
- No race conditions
- No deadlocks
Good practices include:
- Using channels for synchronization
- Avoiding sleep-based timing tests
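As a sketch of channel-based synchronization in place of sleeping (doWork is an illustrative name), the worker signals completion over a channel, so the waiting side blocks exactly as long as needed and no longer:

```go
package main

import "fmt"

// doWork simulates a background task that reports completion over a channel,
// letting a test wait deterministically instead of sleeping a guessed duration.
func doWork(done chan<- string) {
	// ... real work would happen here ...
	done <- "finished"
}

func main() {
	done := make(chan string, 1)
	go doWork(done)
	fmt.Println(<-done) // blocks until the worker signals; no sleep needed
}
```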
Race Detector
Go includes a powerful built-in race detection tool.
Run tests with:
go test -race
The race detector identifies situations where multiple goroutines access shared data incorrectly.
This tool is extremely valuable in production-grade systems.
11. Common Concurrency Anti-Patterns
Learning concurrency also requires understanding common mistakes.
Creating Too Many Goroutines
Launching unlimited goroutines can overwhelm system resources.
Always control concurrency using:
- Worker pools
- Rate limiting
- Buffered channels
Sharing Memory Without Synchronization
Avoid direct modification of shared variables.
Prefer:
- Channels
- Message passing
- Immutable data
Goroutine Leaks
A goroutine leak occurs when a goroutine never terminates.
This can happen when:
- A channel is never read
- A goroutine waits indefinitely
Proper shutdown mechanisms are essential.
12. Concurrency in Real Systems
Concurrency is heavily used in modern systems such as:
- Web servers
- Data pipelines
- Distributed systems
- Microservices
- Stream processing
Go’s concurrency model is particularly powerful for:
- Cloud-native systems
- Kubernetes controllers
- Observability pipelines
- High-performance APIs
Many popular infrastructure tools written in Go rely heavily on goroutines and channels.
Examples include:
- Kubernetes
- Docker
- Prometheus
- Terraform
Understanding Go concurrency opens the door to contributing to these ecosystems.
13. What You Will Practice in This Chapter
In this chapter’s exercises, you will implement:
- Parallel data processing
- Fan-out / fan-in pipelines
- Worker pool systems
- Concurrency-safe testing
- Race detection
These exercises will reinforce the mental model needed to build robust concurrent Go applications.
Summary
This chapter introduced Go’s concurrency model, which is centered around goroutines and channels.
You learned:
- The difference between concurrency and parallelism
- How goroutines provide lightweight concurrent execution
- How channels enable safe communication between goroutines
- How to use select to coordinate multiple operations
- Common concurrency patterns used in production systems
- Techniques for testing concurrent code safely
Concurrency is one of Go’s most powerful features, and mastering it will significantly expand the kinds of systems you can build.
In the next sections of this chapter, you will apply these concepts through practical exercises and assessments.