I ran out of GPU memory...

Well, this is a turn that I didn't expect :)
My GPU memory usage went through the roof and now it refuses to render my scene. Could it be the geometry nodes or the trees? I suppose the trees could be a problem since they have a lot of polygons. I guess the volume scattering made it worse, now that I think about it.
I tried turning the trees on and off, and it did lower the GPU memory usage. As did the grass. Should I perhaps model the trees with a lower polygon count?
But could you please advise what to do?
For the record, I have a Dell Precision laptop with an 11th Gen Intel(R) Core(TM) i7-11850H @ 2.50GHz, 32 GB of memory and an NVIDIA RTX A3000 Laptop GPU with 6 GB of memory. I guess the card memory is a bit low?
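
In case it helps, this kind of on/off test can also be scripted instead of clicked; a minimal sketch, where the collection names "Trees" and "Grass" are placeholders for however the scene is organized:

```python
import bpy

# Hide whole collections from the render to isolate what eats GPU memory;
# re-render after each toggle and compare the peak memory stat.
for name in ("Trees", "Grass"):  # assumed collection names
    coll = bpy.data.collections.get(name)
    if coll:
        coll.hide_render = True
```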

Thanks for a superb tutorial!
  • Martin Bergwerf replied

    Hi Ulf (supertuffe),

    You could render with your CPU (like Kenan does in the Tutorial); with 296 Samples, that shouldn't take forever.
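
    If you want to switch devices without hunting through the UI, a minimal sketch (assuming Cycles, as in the Tutorial):

    ```python
    import bpy

    # Render on the CPU with the tutorial's sample count.
    scene = bpy.context.scene
    scene.render.engine = 'CYCLES'
    scene.cycles.device = 'CPU'
    scene.cycles.samples = 296
    ```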

    When you render 1 Frame, you can have a look at the peak Memory usage and see if it is a lot more than the 6 GB... maybe it is worth trying to optimize the scene, if you really want to use the GPU.

    You can save Memory almost anywhere. Geometry Nodes wouldn't be the first place I would look though. I still have to watch the Course, to see what the 'best' ways are to save Memory, but I'd first try the CPU.

    1 love
  • Ulf Nilsson (supertuffe) replied

    Hello Martin!

    Thank you for your comment! Yes, I have switched from GPU and am currently trying to render with the CPU, but oh my God it's slow :)

    But, it works! So with a little patience it will get through. I will also try this at home on my home computer; it has 8 GB of VRAM, so that should at least be enough to render. And I will try to thin out the trees' dense geometry. I think the subdivision is quite high, so I will try lowering the levels (something like the sketch below).

    Because when I add the trees, the RAM usage goes up. But as you said, there are certainly many more tweaks I can make.
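
    A minimal sketch of capping those subdivision levels, assuming the trees sit in a collection named "Trees" (an assumption, not something from the scene file):

    ```python
    import bpy

    # Cap Subdivision Surface levels on everything in an assumed "Trees" collection.
    for obj in bpy.data.collections["Trees"].objects:
        for mod in obj.modifiers:
            if mod.type == 'SUBSURF':
                mod.render_levels = min(mod.render_levels, 1)  # levels used at render time
                mod.levels = min(mod.levels, 1)  # levels shown in the viewport
    ```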

  • Omar Domenech replied

    Rob got the out-of-memory error the other day and I wrote up some of the options he could take. I'm going to copy-paste it here to see if one of those approaches can work for you:

    "You can start by turning off subdiv to objects that are for away on your scene and don't really need that much geometry. See if there's objects that can get away with 1 level of subdiv and still look good o have no subdiv at all. That will save you on vertex count. Next up try linking object data on objects that are alike. For example bolts on a machinery, you shouldn't really be Shift + D and duplicating them when you can save up on resources and ALT + D to have linked duplicates. If they are already their own duplicate, you can Control + L and link their object data. Next up try getting your 4K textures to 2K, if you got any. Next up see if you can have your assets on a separate files and link them into one master file where you do the layout of everything. Linking your geometry saves up enormously on memory. And last but not least, you can render your scene by chunks and put it together in compositing. That is what used to be the old render layers. And then there's always the dream of getting an awesome super expensive video card that will make your life easier until you start doing come complex scenes again and also make your expensive card run out of memory."

    1 love
  • Kay Meadows (KRMeadows) replied

    Since no one has mentioned it yet: the Decimate modifier can also help reduce the vertex count of high-poly meshes.
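
    For instance, a minimal sketch on the active object (the 0.3 ratio is just a starting point to tune per mesh):

    ```python
    import bpy

    # Add a Decimate modifier and collapse the mesh to ~30% of its faces.
    obj = bpy.context.active_object
    mod = obj.modifiers.new(name="Decimate", type='DECIMATE')
    mod.ratio = 0.3  # fraction of faces to keep; raise it if the silhouette breaks
    ```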
