Multiple graphic cards

edited May 2013 in MAIN
Hello,

After reading the topic you showed me, I have a question:

You said :
"(...) Lastly, if you use multiple graphics cards, keep in mind that rendering can only be done by one graphics card, and the result must be pushed to the other graphics cards, which is time-consuming."

Can you be more specific?
What are the restrictions when using two graphics cards (number of screens, size of the videos...)?
What are the workarounds?

Thanks again, and sorry to be so curious...

Alain

Comments

  • Hello @Alain,

    As with other software, the rendering of each image must be done by a single graphics card: all the data must first be copied into that card's memory so the calculation can be performed.
    This calculation cannot be done by two (or more) graphics cards simultaneously, because graphics cards cannot share their memory.

    Here is the process (GPU means graphics card):
    1. The GPU used for rendering is chosen based on which monitor Millumin runs on: if your monitor is connected to GPU-A, then Millumin will use GPU-A to render every image.

    2. Once the rendering of an image is done, it is stored in GPU-A's memory.
      The result is then pushed to and displayed on the monitors connected to GPU-A.

    3. If some monitors are connected to GPU-B, GPU-C, etc., the result must be copied into the memory of each of those GPUs.
      This is a costly operation that can reduce performance.

    4. Once the result has been copied to the other GPUs, it is pushed to and displayed on the other monitors.

    As a general rule: best performance is achieved when all the monitors used by Millumin are connected to the same graphics card.
    If you use multiple graphics cards, performance may be reduced a little, but everything will still work. There are no restrictions at all.
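    To make the cost of step 3 concrete, here is a minimal back-of-the-envelope sketch of the per-frame pipeline described above. All the numbers (render time, copy time per extra GPU) are hypothetical placeholders for illustration, not measured values from Millumin or any real hardware:

    ```python
    # Rough model of the per-frame pipeline described above.
    # Rendering always happens on one GPU (GPU-A); each additional
    # GPU driving a monitor adds a memory-copy cost to the frame.
    # All timings below are hypothetical, for illustration only.

    RENDER_MS = 8.0        # assumed time for GPU-A to render one frame
    COPY_MS_PER_GPU = 6.0  # assumed cost of copying the frame to one other GPU

    def frame_time_ms(extra_gpus: int) -> float:
        """Total time per frame when monitors span 1 + extra_gpus cards."""
        return RENDER_MS + COPY_MS_PER_GPU * extra_gpus

    def max_fps(extra_gpus: int) -> float:
        """Upper bound on frame rate implied by the frame time."""
        return 1000.0 / frame_time_ms(extra_gpus)

    if __name__ == "__main__":
        for n in range(3):
            print(f"{n} extra GPU(s): {frame_time_ms(n):.0f} ms/frame, "
                  f"~{max_fps(n):.0f} fps max")
    ```

    With these illustrative numbers, a single-card setup stays at 8 ms per frame, while each extra card's copy pushes the frame time up and the achievable frame rate down, which is why keeping all monitors on one card performs best.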



    Hope it clarifies everything. Philippe