mirror of https://github.com/graphdeco-inria/gaussian-splatting
Advice for extensive data sets
commit 06afb7e616 (parent 5998454e5c)
@@ -388,6 +388,8 @@ Alternatively, you can use the optional parameters ```--colmap_executable``` and
- *24 GB of VRAM for reference quality training is still a lot! Can't we do it with less?* Yes, most likely. By our calculations, it should be possible with **way** less memory (~8 GB). If we can find the time, we will try to achieve this. If some PyTorch veteran out there wants to tackle this, we look forward to your pull request!
- *How can I use this for a much larger dataset, like a city district?* Given enough memory, this should work out fine, but it will require adapting ```--scaling_lr``` and ```--position_lr_init/final```. To avoid manual tuning, check the loss: if it diverges early on, reset and retry with lower scaling / position learning rates (see the first sketch after this list).
- *How can I use the differentiable Gaussian rasterizer for my own project?* Easy, it is included in this repo as the submodule ```diff-gaussian-rasterization```. Feel free to check out and install the package. It is not extensively documented, but using it from the Python side is very straightforward (cf. ```gaussian_renderer/__init__.py``` and the second sketch after this list).
- *Something is broken, how did this happen?* We tried hard to provide a solid and comprehensible basis for making use of the paper's method. We have refactored the code quite a bit, but we have limited capacity to test all possible usage scenarios. Thus, if part of the website, the code, or the performance is lacking, please create an issue. If we find the time, we will do our best to address it.
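For the larger-dataset question above, here is a minimal sketch of what a more conservative run might look like. The lowered values and the dataset path are illustrative placeholders, not tested recommendations; the flags themselves are the ones exposed by ```train.py```.

```shell
# Hypothetical starting point for a large scene: lower the scaling and
# position learning rates, then watch the loss for early divergence.
python train.py -s <path to large dataset> \
    --scaling_lr 0.001 \
    --position_lr_init 0.00008 \
    --position_lr_final 0.0000008
```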
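And for the rasterizer question, a minimal, self-contained sketch of driving ```diff-gaussian-rasterization``` directly, mirroring the calls in ```gaussian_renderer/__init__.py```. The random tensors and identity matrices are stand-in data (a real renderer passes proper camera matrices); the settings fields and call signature reflect the submodule version pinned at the time of writing and may differ in other versions. A CUDA-capable GPU is required.

```python
import math
import torch
from diff_gaussian_rasterization import (
    GaussianRasterizationSettings,
    GaussianRasterizer,
)

# Stand-in scene: N random Gaussians. In practice these come from your
# own model / data loading code. All tensors must live on the GPU.
N = 1000
device = "cuda"
means3D = torch.rand(N, 3, device=device)
scales = torch.rand(N, 3, device=device) * 0.01
rotations = torch.nn.functional.normalize(torch.rand(N, 4, device=device))
opacities = torch.rand(N, 1, device=device)
colors = torch.rand(N, 3, device=device)  # precomputed RGB instead of SHs
# Screen-space means; gradients land here (used for densification during
# training, where you would also call means2D.retain_grad()).
means2D = torch.zeros(N, 3, device=device, requires_grad=True)

fov = math.radians(60)
settings = GaussianRasterizationSettings(
    image_height=512,
    image_width=512,
    tanfovx=math.tan(fov * 0.5),
    tanfovy=math.tan(fov * 0.5),
    bg=torch.zeros(3, device=device),        # background color
    scale_modifier=1.0,
    viewmatrix=torch.eye(4, device=device),  # world-to-camera (stub)
    projmatrix=torch.eye(4, device=device),  # full projection (stub)
    sh_degree=0,                             # unused with precomputed colors
    campos=torch.zeros(3, device=device),
    prefiltered=False,
    debug=False,
)

rasterizer = GaussianRasterizer(raster_settings=settings)
# Exactly one of shs / colors_precomp (and scales+rotations / cov3D_precomp)
# must be provided. Returns the rendered image and per-Gaussian radii.
image, radii = rasterizer(
    means3D=means3D,
    means2D=means2D,
    shs=None,
    colors_precomp=colors,
    opacities=opacities,
    scales=scales,
    rotations=rotations,
    cov3D_precomp=None,
)
```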