I set the SHM (shared memory) size to 48G on my witness server (which has 50GB RAM), and the server stopped producing blocks with the following message:
```
2736622ms th_a database.cpp:2596 show_free_memory ] Free memory is now 50M. Increase shared file size immediately
```
I had to stop the node, increase the shared file size, and mount more space to /dev/shm. Meanwhile, I thought it would be a good idea to upgrade the node from 0.19.5 to 0.19.6 using the command `./run.sh install`.
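For reference, the shared-memory file size is controlled in the node's `config.ini`. This is a sketch; the option name below (`shared-file-size`) is what recent 0.19.x steemd releases use, but check your version's example config:

```ini
# config.ini (sketch): let the shared memory file grow up to 96G
shared-file-size = 96G
```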
I also enabled zram, which compresses RAM on the fly using the LZO compression algorithm.
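Enabling zram as a compressed swap device looks roughly like this. This is a sketch, not my exact setup: it must be run as root, assumes the `zram` kernel module is available, and the 16G capacity is an illustrative value.

```shell
# Load the zram module (creates /dev/zram0 on most kernels)
modprobe zram

# Select the LZO compression algorithm
# (must be set before the disksize is written)
echo lzo > /sys/block/zram0/comp_algorithm

# Give the device a 16G uncompressed capacity (illustrative value)
echo 16G > /sys/block/zram0/disksize

# Format it as swap and enable it with high priority,
# so the kernel prefers it over disk-backed swap
mkswap /dev/zram0
swapon -p 100 /dev/zram0
```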
```
$ du -h /dev/shm/shared_memory.bin
49G     /dev/shm/shared_memory.bin
$ du -h --apparent-size /dev/shm/shared_memory.bin
96G     /dev/shm/shared_memory.bin
```
The apparent (uncompressed) size is the SHM size, which is set to 96GB. This value is usually suggested by the following formula:
SIZE = 2 * RAM - SWAP
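For example, with 50GB of RAM and a hypothetical 4GB of swap (illustrative figures, in GB), the formula works out to 96G:

```shell
# Hypothetical figures: 50 GB RAM and 4 GB swap
RAM=50
SWAP=4
SIZE=$((2 * RAM - SWAP))
echo "Suggested SHM size: ${SIZE}G"   # prints "Suggested SHM size: 96G"
```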
Zram lets your node last a bit longer with a limited amount of RAM, but at the cost of speed: the processor has to compress and decompress memory pages on the fly, so everything takes a bit longer (around 40 hours of reindexing time in my case).
Steem reindexing can be shortened with a higher CPU clock frequency. Anyway, the node has produced two blocks since it was last re-enabled, and everything seems good so far!
Being a Steem witness is not easy; it takes effort to maintain your server regularly, e.g. applying updates and hardforks. It is your duty to monitor the health of your witness node from time to time (I check it every day, every few hours when I am awake, on my phone).
The latency is good and the witness is running smoothly!
Support me and my work as a witness – Witness Thread by
- voting me here, or
- voting me as a witness proxy – let @justyy represent you.
Thank you! Some of My Contributions: SteemIt Tutorials, Robots, Tools and APIs
–EOF (The Ultimate Computing & Technology Blog) —