Ignore already generated chunks, to reduce resources used? (1.10.2/1.5) #40
That's odd, because I just ran a 400-radius gen last night on a server with 3GB allocated and it went just fine; memory use stayed constant the whole time. The way the logic works, I'm pretty sure, is that it calls for chunks to be loaded, and if they don't exist the server creates them. If they do exist, loading is pretty much instant, and then they get unloaded. I definitely noticed that when I first started the chunkgen task it blew through the first bunch of chunks really quickly because they already existed, then slowed down a bit when it had to start generating new ones. Either way, finished chunks should definitely be getting unloaded as it goes, so memory usage shouldn't keep climbing. What modpack are you using?

These days, if you're running Java 8 and Minecraft 1.10, I'd suggest switching to the G1 GC. I had VisualVM open and was watching memory activity while chunkgen was running, so here are some guidelines from what I've found. The static (Old Gen) usage may vary by modpack, but most mid-size packs these days will be roughly the same. I'm using FTB Infinity Lite 1.6.0 on MC 1.10.2, Java 8u102, FreeBSD, on a 4GB RAM server with 3GB given to Java.
Again, Aikar's post walks through what most of those flags actually do; my point is that I've tested that they behave correctly for this specific situation. Old Gen stays nearly constant at 1.15GB used / 1.74GB allocated; Eden (New) space allocates roughly 1.2GB and fills at around 60MB/s, with cleanups of ~50ms every 15s or so.

The point is that you want all those chunks being generated, loaded, and unloaded to stay in the Eden space, where they can be cleaned up quickly, rather than filling up the Old Gen, where collection takes much longer and leads to out-of-memory situations. If you've got more than 4GB of memory to allocate to Minecraft (and if it's a desktop computer, not a dedicated server, don't throw more than half of your system's total RAM at it), try setting the heap and new-generation sizes proportionally larger.

And that was far more than I planned on writing, but I've dealt a lot with optimizing Java's GC for running large modpacks on small servers, and with handling worldgen without lagging as much. (Mods that add things to worldgen are a completely different issue; you can still get lag with players online exploring new chunks, because then you're CPU-bound, not out of memory. But that's why we're pre-generating the world.)

TL;DR: switch to G1, size Eden generously, and keep the chunk churn out of Old Gen. |
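For reference, Aikar's widely shared G1 launch flags look roughly like this for a 3GB heap. The specific values and the jar filename below are illustrative, not the exact set the commenter used; check the current version of Aikar's post before copying them:

```shell
java -Xms3G -Xmx3G \
  -XX:+UseG1GC \
  -XX:+ParallelRefProcEnabled \
  -XX:MaxGCPauseMillis=100 \
  -XX:+DisableExplicitGC \
  -XX:+UnlockExperimentalVMOptions \
  -XX:G1NewSizePercent=30 \
  -XX:G1MaxNewSizePercent=40 \
  -XX:G1HeapRegionSize=8M \
  -XX:G1ReservePercent=20 \
  -jar forge-server.jar nogui
```

Setting `-Xms` equal to `-Xmx` avoids heap resizing pauses, and the large `G1NewSizePercent` is what keeps short-lived chunk data in Eden as described above.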
I was probably using really old launch commands. Plus, I'm going through SkyFactory 3 right now, which doesn't require stupid amounts of RAM. When/if I restore my old world I'll definitely use the new commands (I'm also going to restart my server with them). However, my original question was whether logic exists to skip already-generated chunks, as opposed to walking through all of them (not regenerating, of course) from zero. |
I don't know if there's a way to automatically detect which chunks exist, but there is a config option to skip X chunks at the start; you just have to figure out how many need skipping yourself. There's also a config option for the maximum number of chunks to keep loaded that you might want to play with. Go check out chunkgen.cfg.
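The actual option names live in chunkgen.cfg; the key names in this sketch are hypothetical placeholders for the two options mentioned above, so check your own file for the real ones:

```
# chunkgen.cfg -- hypothetical key names, check your actual file
skipChunks=0        # chunks to skip at the start of a run
maxLoadedChunks=32  # cap on chunks kept loaded during generation
```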
|
I'm also unable to generate the map I want for my server. I usually run it every night to avoid causing lag for my users, but it's too slow: it always re-loads the already-generated chunks, so it takes longer than expected, and even over a whole night it barely generates 0.5%.
I don't have the greatest machine to use as a standalone server, let alone one able to handle even two players (just myself and my brother). Worldgen always crashed my server, so chunkgen is a godsend.
However, I've realized that regardless of where I start the generation, it always walks over already-generated chunks. I can't put in 100 100 because it gets about 50% done and throws an out-of-memory error (6GB allocated on a C2Q Q8400). I was able to successfully generate a 75x75 area, which looks like plenty, yet my endgame is 500x500.
I'm constantly hitting my cap; it really doesn't matter which JVM arguments I use, it always maxes out the RAM.
Would it be possible to add logic that quickly parses the region data to see what has already been generated, then goes from there?
Thank you.
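The parsing the question asks about is cheap with the Anvil region format: the first 4KB of each .mca file is a chunk-location table, and an all-zero entry means that chunk has never been generated. A minimal sketch (function names are my own, and this assumes the vanilla Anvil layout):

```python
import struct
from pathlib import Path

def count_generated_chunks(header: bytes) -> int:
    # The first 4096 bytes of an Anvil (.mca) region file hold 1024
    # big-endian 4-byte location entries, one per chunk in the 32x32
    # region. An all-zero entry means that chunk was never generated.
    count = 0
    for i in range(1024):
        (entry,) = struct.unpack_from(">I", header, i * 4)
        if entry != 0:
            count += 1
    return count

def scan_world(region_dir: str) -> int:
    # Sum generated chunks across every region file in a world's
    # region/ directory, reading only the 4KB header of each file.
    total = 0
    for mca in Path(region_dir).glob("*.mca"):
        with open(mca, "rb") as f:
            header = f.read(4096)
        if len(header) == 4096:
            total += count_generated_chunks(header)
    return total
```

A generator with this information could skip straight to the first missing chunk instead of re-loading everything from the center outward.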