use_cuda_context
inference_models.models.common.cuda.use_cuda_context
Context manager for using an existing CUDA context.
Pushes a CUDA context onto the context stack, making it active for the duration of the context manager, then pops it when exiting. This ensures proper context management for CUDA operations.
Parameters:
- context (Context) – PyCUDA Context object to activate.
Yields:
- Context – The active CUDA context.
Examples:
Use an existing CUDA context:
>>> from inference_models.developer_tools import use_cuda_context
>>> import pycuda.driver as cuda
>>>
>>> cuda.init()
>>> device = cuda.Device(0)
>>> context = device.retain_primary_context()
>>>
>>> with use_cuda_context(context) as ctx:
... # Perform CUDA operations
... pass
Note
- Requires PyCUDA to be installed
- Automatically pushes context on entry and pops on exit
- Context is popped even if an exception occurs
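The push-on-entry, pop-on-exit behavior described above can be sketched with `contextlib.contextmanager` and a `try`/`finally` block. This is an illustrative sketch, not the library's actual implementation; a `_FakeContext` stand-in replaces `pycuda.driver.Context` so the example runs without a GPU or PyCUDA installed.

```python
from contextlib import contextmanager


@contextmanager
def use_cuda_context(context):
    """Push `context`, yield it, and pop it on exit (sketch, not the real code)."""
    context.push()
    try:
        yield context
    finally:
        # The finally clause guarantees the pop even if the body raises,
        # matching the documented exception-safety behavior.
        context.pop()


class _FakeContext:
    """Hypothetical stand-in for pycuda.driver.Context, for demonstration only."""

    def __init__(self):
        self.stack_depth = 0

    def push(self):
        self.stack_depth += 1

    def pop(self):
        self.stack_depth -= 1


ctx = _FakeContext()
try:
    with use_cuda_context(ctx):
        assert ctx.stack_depth == 1  # active while inside the block
        raise RuntimeError("simulated CUDA failure")
except RuntimeError:
    pass
assert ctx.stack_depth == 0  # popped despite the exception
```

With a real PyCUDA `Context`, `push()` and `pop()` operate on the driver's per-thread context stack, so the same `try`/`finally` pattern keeps the stack balanced across errors.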
See Also
use_primary_cuda_context(): Convenience wrapper that activates the device's primary context