Migrate ElasticSearch data from one server to another

As part of the logging system in a big project I have been working on for a long time, we use Graylog to store the logs coming from the application as well as from devices and services.

As you may know, Graylog uses ElasticSearch to store the data (logs) and MongoDB to store some metadata and configuration.

These days we are facing the need to grow our stack and make our logging system more scalable, and with the release of Graylog 1.0.1 we decided to build a new and more powerful logging system. That said, and keeping in mind that we have to maintain the already existing logs, we faced the problem of migrating those existing logs (data) into the new ElasticSearch instance.

After some research on existing tools to do that, we found Elasticsearch-Exporter.
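Before migrating anything, it is worth confirming what is actually stored on the source cluster. A quick check, assuming ElasticSearch is listening on its default port 9200 on serverA (adjust host and port to your setup):

user@unix:~$ curl 'http://serverA:9200/_cat/indices?v'

This lists every index with its document count and size on disk, which also gives a rough idea of how much data the migration will have to move.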

To install Elasticsearch-Exporter:

user@unix:~$ npm install nomnom
user@unix:~$ npm install colors
user@unix:~$ npm install elasticsearch-exporter --production

Then to copy all indices from server A to server B:

user@unix:~$ node exporter.js -a serverA -b serverB
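If you only need to move part of the data, the exporter also accepts options to restrict the copy to a single index and rename it on the target. The exact option names may differ between versions, so check the project's README for your release; on the 1.x versions it should look roughly like this, where graylog2_0 is just an example index name:

user@unix:~$ node exporter.js -a serverA -i graylog2_0 -b serverB -j graylog2_0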

After letting it run for a while, we started seeing messages like:

Processed 4144396 of 10626656 entries (39%)

And that’s it. After the process finished we had all the existing data on the new server B. Enjoy! 😀
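To double-check that everything arrived, you can compare the total document counts on both clusters; a minimal sanity check, again assuming the default port 9200:

user@unix:~$ curl 'http://serverA:9200/_count?pretty'
user@unix:~$ curl 'http://serverB:9200/_count?pretty'

The totals should match, or be very close if the source is still receiving new logs while you copy.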


2 Comments

  1. Keep getting this back when running the command: node exporter.js -a serverA -b serverB

    Definitely have indices and documents to copy over though…

    Elasticsearch Exporter – Version 1.4.0
    Reading source statistics from ElasticSearch
    The source driver has not reported any documents that can be exported. Not exporting.
    Number of calls: 0
    Fetched Entries: 0 documents
    Processed Entries: 0 documents
    Source DB Size: 0 documents

