Aleph-w 3.0
A C++ Library for Data Structures and Algorithms
thread_pool_example.cc File Reference

Comprehensive ThreadPool Usage Examples: Parallel Task Execution. More...

#include <thread_pool.H>
#include <concurrency_utils.H>
#include <ah-errors.H>
#include <iostream>
#include <iomanip>
#include <vector>
#include <algorithm>
#include <functional>
#include <numeric>
#include <stdexcept>
#include <cmath>
#include <chrono>
#include <fstream>
#include <sstream>

Go to the source code of this file.

Classes

struct  FileResult
 

Functions

void print_header (const std::string &title)
 
bool is_prime (int n)
 
void example_basic_parallel ()
 
FileResult process_file (const std::string &filename)
 
void example_batch_processing ()
 
void example_fire_and_forget ()
 
void example_backpressure ()
 
void example_load_shedding ()
 
void example_structured_concurrency ()
 
void example_parallel_building_blocks ()
 
void example_channels_and_shared_state ()
 
void example_performance ()
 
int main ()
 

Detailed Description

Comprehensive ThreadPool Usage Examples: Parallel Task Execution.

This file demonstrates how to use the ThreadPool class effectively for parallel task execution. ThreadPool provides a high-level interface for managing worker threads and executing tasks concurrently, making it easier to parallelize computations.

What is a Thread Pool?

A thread pool is a collection of worker threads that execute tasks from a shared queue. Instead of creating a new thread for each task, existing threads are reused, which reduces overhead and improves throughput.

Benefits:

  • Reduced overhead: Reuse threads instead of creating and destroying them
  • Resource control: Limit number of concurrent threads
  • Task queuing: Handle more tasks than threads
  • Load balancing: Distribute work across threads

Features Demonstrated

Example 1: Basic Parallel Computation

  • **enqueue()**: Submit task and get future
  • Futures: Wait for results asynchronously
  • Basic pattern: Most common usage

Example 2: Batch Processing

  • **enqueue_bulk()**: Process collections in parallel
  • Batch operations: Efficient parallel processing
  • Collection handling: Work with containers

Example 3: Fire-and-Forget Tasks

  • **enqueue_detached()**: Submit without waiting
  • Async operations: Don't need results
  • Background tasks: Long-running operations

Example 4: Backpressure Control

  • **enqueue_bounded()**: Limit queue size
  • Flow control: Prevent queue overflow
  • Backpressure: Handle overload gracefully

Example 5: Non-Blocking Submission

  • **try_enqueue()**: Submit without blocking
  • Load shedding: Reject when overloaded
  • Non-blocking: Don't wait if queue full

Example 6: Structured Tasks and Cooperative Cancellation

  • **TaskGroup**: launch related tasks and wait as a unit
  • **CancellationSource / CancellationToken**: cooperative stop requests
  • Structured concurrency: exceptions propagate from wait()

Example 7: Foundational Parallel Building Blocks

Example 8: Channels and Synchronized Shared State

  • **bounded_channel<T>**: bounded producer-consumer handoff with close
  • **synchronized<T>**: mutex-protected shared objects
  • **rw_synchronized<T>**: read/write-lock protected shared objects
  • **spsc_queue<T>**: bounded single-producer/single-consumer handoff

Example 9: Performance Comparison

  • Benchmarking: Compare parallel vs sequential
  • Speedup: Measure performance gains
  • Scalability: Test with different thread counts

Quick Start

// Create thread pool with 4 worker threads
ThreadPool pool(4);
// Submit a task and get a future
auto future = pool.enqueue([](int x) { return x * x; }, 5);
// Wait for result
int result = future.get(); // result = 25

When to Use ThreadPool

Good for:

  • CPU-intensive tasks
  • Independent computations
  • Batch processing
  • Parallel algorithms

Not good for:

  • I/O-bound tasks (use async I/O)
  • Very short tasks (overhead too high)
  • Tasks with dependencies (use task graphs)

Complexity

Operation        Complexity       Notes
enqueue()        O(1) amortized   Queue insertion
Future.get()     O(1)             Wait for completion
Thread creation  O(1)             Done at pool creation

Performance Considerations

  • Thread count: Usually CPU cores - 1 or CPU cores
  • Task granularity: Tasks should be substantial (avoid tiny tasks)
  • Overhead: ThreadPool has overhead, measure before optimizing
  • Cache effects: Consider data locality

Usage

./thread_pool_example

This example has no command-line options; it runs all examples.

Compilation

g++ -std=c++20 -O2 -pthread -I.. thread_pool_example.cc -o thread_pool_example

Or using CMake:

cmake --build . --target thread_pool_example
See also
thread_pool.H ThreadPool class implementation
ah_parallel_example.cc Parallel functional programming (uses ThreadPool)
Author
Leandro Rabindranath León

Definition in file thread_pool_example.cc.

Function Documentation

◆ example_backpressure()

◆ example_basic_parallel()

void example_basic_parallel ( )

◆ example_batch_processing()

◆ example_channels_and_shared_state()

void example_channels_and_shared_state ( )

Definition at line 623 of file thread_pool_example.cc.

References Aleph::count(), Aleph::divide_and_conquer_partition_dp(), and print_header().

Referenced by main().

◆ example_fire_and_forget()

void example_fire_and_forget ( )

◆ example_load_shedding()

◆ example_parallel_building_blocks()

◆ example_performance()

void example_performance ( )

◆ example_structured_concurrency()

void example_structured_concurrency ( )

◆ is_prime()

bool is_prime ( int  n)

Definition at line 180 of file thread_pool_example.cc.

Referenced by example_basic_parallel(), and example_parallel_filter().

◆ main()

◆ print_header()

void print_header ( const std::string &  title)

Definition at line 155 of file thread_pool_example.cc.

◆ process_file()

FileResult process_file ( const std::string &  filename)