Key Points: CSP, Asynchrony, and Parallelism
Communicating Sequential Processes (CSP)
- Core Principles:
- Processes are independent and do not share state.
- Communication between processes occurs through channels.
- Synchronization is achieved through message passing, not locks.
- Advantages:
- Simplifies reasoning about concurrent systems.
- Reduces risks associated with shared-state concurrency.
- Implementation in Go:
- Goroutines as lightweight threads.
- Channels for communication between goroutines.
- `select` statement for handling multiple channel operations.
- Semaphores via CSP:
- Can be implemented using channels and goroutines in Go.
- Demonstrates the power and flexibility of the CSP model.
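As the notes say, a counting semaphore falls out of channels naturally: a buffered channel of capacity n grants at most n concurrent slots. A minimal sketch (the `semaphore` type and its method names are illustrative, not a standard library API):

```go
package main

import (
	"fmt"
	"sync"
)

// semaphore limits concurrency with a buffered channel:
// sending acquires a slot, receiving releases it.
type semaphore chan struct{}

func newSemaphore(n int) semaphore { return make(semaphore, n) }

func (s semaphore) acquire() { s <- struct{}{} } // blocks once n slots are taken
func (s semaphore) release() { <-s }

func main() {
	sem := newSemaphore(2) // at most 2 workers in the critical section
	var wg sync.WaitGroup
	for i := 1; i <= 5; i++ {
		wg.Add(1)
		go func(id int) {
			defer wg.Done()
			sem.acquire()
			defer sem.release()
			fmt.Printf("worker %d in critical section\n", id)
		}(i)
	}
	wg.Wait()
}
```

No mutexes appear anywhere: the channel's blocking send is the synchronization, which is the CSP point being made above.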
Asynchronous Programming
- Definition: Setting an action in motion and moving on before it completes.
- Key Concepts:
- Non-blocking operations.
- Callbacks, Promises, or similar mechanisms to handle completion.
- Event loops (e.g., in Node.js) for managing asynchronous operations.
- Usage:
- I/O-bound operations (e.g., network requests, file operations).
- Maintaining responsive user interfaces.
- Example in JavaScript:
- Promises and `async`/`await` syntax.
- Chaining asynchronous operations with `.then()`.
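JavaScript syntax aside, the core pattern — start work, keep going, collect the result later — can be sketched in Go (staying with the notes' main language); here the channel plays the role a Promise plays in JavaScript, and `fetchAsync` is a hypothetical helper:

```go
package main

import (
	"fmt"
	"time"
)

// fetchAsync kicks off a (simulated) network request and returns
// immediately; the result arrives later on the returned channel.
func fetchAsync(url string) <-chan string {
	result := make(chan string, 1)
	go func() {
		time.Sleep(10 * time.Millisecond) // stand-in for real I/O
		result <- "response from " + url
	}()
	return result
}

func main() {
	pending := fetchAsync("example.com/api") // non-blocking: returns at once
	fmt.Println("doing other work while the request is in flight")
	fmt.Println(<-pending) // block only when we actually need the result
}
```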
Concurrency vs. Parallelism
- Concurrency:
- Overlapping durations of actions.
- About structure and dealing with multiple things at once.
- Doesn't necessarily mean simultaneous execution.
- Parallelism:
- About execution and doing multiple things at once.
- Requires multiple execution engines (e.g., CPU cores).
- Focused on achieving speedup.
- Simultaneity:
- Literally doing multiple actions at the same time.
- A subset of concurrent execution that requires parallelism.
Techniques for Parallelism
- Isolated Tasks:
- Independent tasks that can run in parallel (e.g., compiling different source files).
- Don't need to share data or have specific execution order.
- Work Queues:
- Manage parallelism by holding tasks to be done.
- Multiple worker threads/processes take tasks from the queue.
- Examples:
- `make` for compilation; Go's goroutines with channels.
- Parallel Patterns:
- Task parallelism: Different operations on possibly different data.
- Data parallelism: Same operation on different parts of data.
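The work-queue idea maps directly onto Go's channels: a channel holds the pending tasks, and a fixed set of worker goroutines drain it. A minimal sketch (the `runPool` function is illustrative, not a library API):

```go
package main

import (
	"fmt"
	"sync"
)

// runPool distributes jobs to nWorkers goroutines via a channel-backed
// work queue and collects one result per job (in scheduling order).
func runPool(nWorkers int, jobs []int, work func(int) int) []int {
	jobCh := make(chan int)
	resCh := make(chan int, len(jobs))

	var wg sync.WaitGroup
	for w := 0; w < nWorkers; w++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			for j := range jobCh { // each worker pulls tasks off the queue
				resCh <- work(j)
			}
		}()
	}

	for _, j := range jobs {
		jobCh <- j
	}
	close(jobCh) // no more tasks: lets the workers exit
	wg.Wait()
	close(resCh)

	var results []int
	for r := range resCh {
		results = append(results, r)
	}
	return results
}

func main() {
	squares := runPool(3, []int{1, 2, 3, 4}, func(n int) int { return n * n })
	fmt.Println(squares) // order depends on scheduling
}
```

This is data parallelism in the terms above: the same operation applied to different pieces of data, with the queue balancing the load across workers.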
Amdahl's Law
- Definition: Describes the theoretical speedup in latency of the execution of a task at fixed workload that can be expected of a system whose resources are improved.
- Key Insight: The speedup of a program using multiple processors in parallel computing is limited by the time needed for the sequential fraction of the program.
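Formally, if a fraction p of the work can be parallelized across n processors, the speedup is 1 / ((1 − p) + p/n). A small calculation (illustrative) makes the limit concrete:

```go
package main

import "fmt"

// amdahlSpeedup returns the theoretical speedup for a program whose
// parallelizable fraction p runs on n processors: 1 / ((1-p) + p/n).
func amdahlSpeedup(p, n float64) float64 {
	return 1 / ((1 - p) + p/n)
}

func main() {
	// Even with 95% of the work parallelized, infinitely many
	// processors cap the speedup at 1/(1-0.95) = 20x.
	for _, n := range []float64{2, 8, 64, 1024} {
		fmt.Printf("p=0.95, n=%4.0f -> speedup %.2fx\n", n, amdahlSpeedup(0.95, n))
	}
}
```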
When to Use Which Approach
- CSP: For systems with clear message-passing architectures, or when you want to avoid shared state.
- Asynchrony: For I/O-bound operations, keeping user interfaces responsive.
- Concurrency: For structuring a program as multiple threads/processes, or when shared resources need synchronization.
- Parallelism: For speeding up programs by running parts simultaneously on multiple cores.
Remember
- CSP provides a powerful model for concurrent programming, emphasizing message passing over shared state.
- Asynchronous programming is crucial for responsive applications, especially in I/O-bound scenarios.
- Understand the distinctions between asynchrony, concurrency, simultaneity, and parallelism.
- Parallelism is about achieving speedup, not just doing things at the same time.
- Work queues are a powerful tool for managing parallel tasks.
- Amdahl's Law shows that parallelism has limits—not all problems can be infinitely sped up through parallelization.
- The choice of concurrency model depends on the specific requirements of your system and the nature of the problem you're solving.