out of memory - does not support bf16 - Tesla P40 24gb Vram #128

@gandolfi974

Description

Hi, I tried to generate 5 s of video at 768p and 12 frames/s with my P40 (24 GB VRAM) and the Gradio interface on Windows, but I get the error message below. I enabled cpu_offloading and changed BF16 to FP32 to avoid the unsupported-bf16 error.

[ERROR] Error during text-to-video generation: CUDA out of memory. Tried to allocate 10.90 GiB. GPU 0 has a total capacty of 23.90 GiB of which 183.25 MiB is free. Of the allocated memory 9.91 GiB is allocated by PyTorch, and 13.60 GiB is reserved by PyTorch but unallocated. If reserved but unallocated memory is large try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF

Is this normal?
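The error message itself points at fragmentation (13.60 GiB "reserved by PyTorch but unallocated") and suggests setting `max_split_size_mb` via `PYTORCH_CUDA_ALLOC_CONF`. A minimal sketch of doing that from Python, before PyTorch is imported; the 128 MB split size is an assumed starting value, not taken from the log:

```python
import os

# Sketch: cap the CUDA caching allocator's split size to reduce
# fragmentation, as the OOM message suggests. This environment variable
# is read when the allocator initializes, so it must be set BEFORE
# `import torch`. 128 is an assumed starting value to tune from.
os.environ.setdefault("PYTORCH_CUDA_ALLOC_CONF", "max_split_size_mb:128")

print(os.environ["PYTORCH_CUDA_ALLOC_CONF"])  # confirm it is set for this process

# import torch  # only import torch after the variable is in place
```

Equivalently, the variable can be exported in the shell before launching the Gradio app (`set PYTORCH_CUDA_ALLOC_CONF=max_split_size_mb:128` on Windows cmd).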
