CS 134

Key Points: CSP, Asynchrony, and Parallelism

Communicating Sequential Processes (CSP)

  1. Core Principles:
    • Processes are independent and do not share state.
    • Communication between processes occurs through channels.
    • Synchronization is achieved through message passing, not locks.
  2. Advantages:
    • Simplifies reasoning about concurrent systems.
    • Reduces risks associated with shared-state concurrency.
  3. Implementation in Go:
    • Goroutines as lightweight threads.
    • Channels for communication between goroutines.
    • select statement for handling multiple channel operations.
  4. Semaphores via CSP:
    • Can be implemented using channels and goroutines in Go.
    • Demonstrates the power and flexibility of the CSP model.

Asynchronous Programming

  1. Definition: Setting an action in motion and moving on before it completes.
  2. Key Concepts:
    • Non-blocking operations.
    • Callbacks, Promises, or similar mechanisms to handle completion.
    • Event loops (e.g., in Node.js) for managing asynchronous operations.
  3. Usage:
    • I/O-bound operations (e.g., network requests, file operations).
    • Maintaining responsive user interfaces.
  4. Example in JavaScript:
    • Promises and async/await syntax.
    • Chaining asynchronous operations with .then().

Concurrency vs. Parallelism

  1. Concurrency:
    • Overlapping durations of actions.
    • About structure and dealing with multiple things at once.
    • Doesn't necessarily mean simultaneous execution.
  2. Parallelism:
    • About execution and doing multiple things at once.
    • Requires multiple execution engines (e.g., CPU cores).
    • Focused on achieving speedup.
  3. Simultaneity:
    • Literally doing multiple actions at the same time.
    • A subset of concurrent execution that requires parallelism.

Techniques for Parallelism

  1. Isolated Tasks:
    • Independent tasks that can run in parallel (e.g., compiling different source files).
    • Don't need to share data or have specific execution order.
  2. Work Queues:
    • Manage parallelism by holding tasks to be done.
    • Multiple worker threads/processes take tasks from the queue.
    • Examples: bmake for compilation; Go's goroutines with channels.
  3. Parallel Patterns:
    • Task parallelism: Different operations on possibly different data.
    • Data parallelism: Same operation on different parts of data.

Amdahl's Law

  1. Definition: Describes the theoretical speedup in latency of a task at fixed workload that can be expected when a system's resources are improved.
  2. Key Insight: The speedup of a program using multiple processors in parallel computing is limited by the time needed for the sequential fraction of the program.

When to Use Which Approach

  • CSP: For systems with clear message-passing architectures, or when you want to avoid shared state.
  • Asynchrony: For I/O-bound operations, keeping user interfaces responsive.
  • Concurrency: When you need precise control over multiple threads/processes, or shared resources require explicit synchronization.
  • Parallelism: For speeding up programs by running parts simultaneously on multiple cores.

Remember

  • CSP provides a powerful model for concurrent programming, emphasizing message passing over shared state.
  • Asynchronous programming is crucial for responsive applications, especially in I/O-bound scenarios.
  • Understand the distinctions between asynchrony, concurrency, simultaneity, and parallelism.
  • Parallelism is about achieving speedup, not just doing things at the same time.
  • Work queues are a powerful tool for managing parallel tasks.
  • Amdahl's Law shows that parallelism has limits—not all problems can be infinitely sped up through parallelization.
  • The choice of concurrency model depends on the specific requirements of your system and the nature of the problem you're solving.