As part of the logging system of a big project I have been working on for a long time, we use Graylog to store the logs coming from the application as well as from devices and services.
As you may know, Graylog uses Elasticsearch to store the data (logs) and MongoDB to store some metadata and configuration.
These days we are facing the need to grow our stack and make our logging system more scalable; also, with the release of Graylog 1.0.1, we decided to build a new and more powerful logging system. That said, and keeping in mind that we have to keep the already existing logs, we faced the problem of migrating those existing logs (data) into the new Elasticsearch instance.
After some research on existing tools to do that, we found Elasticsearch-Exporter.
To install Elasticsearch-Exporter:
user@unix:~$ npm install nomnom
user@unix:~$ npm install colors
user@unix:~$ npm install elasticsearch-exporter --production
Then to copy all indices from server A to server B:
user@unix:~$ node exporter.js -a serverA -b serverB
After letting it run for a while, we started seeing messages like:
Processed 4144396 of 10626656 entries (39%)
And that’s it. After the process finished we had the existing data in the new server B. Enjoy! 😀
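If you do not want to move everything, the exporter can also be limited to a single index. If I remember correctly it takes `-i` for the source index (the index name `graylog2_0` below is just an illustration; check `node exporter.js --help` for the exact flags in your version):

```shell
# Copy only the index "graylog2_0" from server A to server B.
# -a: source host, -b: target host, -i: source index (illustrative).
node exporter.js -a serverA -i graylog2_0 -b serverB
```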
These days I have had the chance to work at my job with a great tool that allows us to monitor and control a number of processes on Unix operating systems.
As their website says: “Supervisor is a client/server system that allows its users to monitor and control a number of processes on UNIX-like operating systems.”
Supervisor is written in Python. Based on a configuration file (supervisord.conf), the daemon (supervisord) keeps monitoring that the defined processes are running, and if a process is killed for whatever reason, supervisord starts it again.
A configuration file entry looks like:

[program:email_sender]
command=/usr/bin/php email_sender.php --arg1=a --arg2=b
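A slightly fuller stanza can also control restarting and logging (the program name and log paths here are made up for illustration; `autostart`, `autorestart`, `stdout_logfile` and `stderr_logfile` are standard Supervisor options):

```ini
[program:email_sender]
command=/usr/bin/php email_sender.php --arg1=a --arg2=b
autostart=true          ; start the process when supervisord starts
autorestart=true        ; restart it if it dies, whatever the cause
stdout_logfile=/var/log/email_sender.out.log
stderr_logfile=/var/log/email_sender.err.log
```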
And a web app shows us the status of the processes:
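Besides the web app, Supervisor also ships a command-line client, supervisorctl, which talks to the running supervisord (the commands below assume a running daemon; “email_sender” matches the program name from the configuration file):

```shell
# List every supervised process and its current state
supervisorctl status

# Restart a single program by the name in its [program:x] section
supervisorctl restart email_sender
```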
For more information you can read the documentation.