I stumbled upon the INSERT INTO LOGSTASH SELECT DATA FROM DATABASE article and decided to play around.
Installing PostgreSQL, loading the database, and getting the data into Elasticsearch went just as described; the MBSlave manual is very good.
I decided to use elkserver3 with a fresh Logstash instance on it, to avoid messing up elkserver1.
It is possible to have a different Logstash config on each server in the cluster. Also, I decided to name the index "musicbrainz-%{+YYYY.MM.dd}" to keep some control over the load and to make a possible cleanup afterwards easier.
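Because the index name carries the ingest date, old indices can later be dropped by name with a plain DELETE against Elasticsearch. A minimal sketch, assuming a hypothetical 30-day retention and the default port 9200, that builds the matching index name for the cutoff day and prints the delete call:

```shell
# Compute the date 30 days ago in the same YYYY.MM.dd form that
# Logstash's %{+YYYY.MM.dd} pattern produces (GNU date syntax),
# then print the curl command that would delete that day's index.
cutoff=$(date -d "30 days ago" +%Y.%m.%d)
echo curl -XDELETE "localhost:9200/musicbrainz-${cutoff}"
```

The retention period and the decision to delete (rather than close or snapshot) old indices are my own assumptions, not something from the article.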
My Input file:
input {
  jdbc {
    jdbc_driver_library => "/etc/logstash/postgresql-9.4.1212.jre6.jar"
    jdbc_driver_class => "org.postgresql.Driver"
    # Password is passed inside the connection string here
    jdbc_connection_string => "jdbc:postgresql://localhost:5432/musicbrainz?user=musicbrainz&password=*******"
    jdbc_user => "musicbrainz"
    statement_filepath => "/etc/logstash/query.sql"
    # Cron-style schedule: run the query once a day at 15:00
    schedule => "0 15 * * *"
  }
}
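The actual contents of /etc/logstash/query.sql aren't shown in the post; as a plausible sketch, a query against the MusicBrainz artist table might look like this (the column choice is my assumption, not the query I actually ran):

```sql
-- Hypothetical /etc/logstash/query.sql: pull basic artist fields
-- from the MusicBrainz schema loaded by MBSlave.
SELECT artist.id, artist.gid, artist.name, artist.sort_name
FROM artist;
```

Whatever the query returns, each row becomes one event in Logstash, so the column aliases end up as field names in the Elasticsearch documents.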
My Output file:
output {
  elasticsearch {
    hosts => "localhost:9200"
    sniffing => false
    manage_template => false
    index => "musicbrainz-%{+YYYY.MM.dd}"
  }
}
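After the scheduled run fires, a quick way to confirm that the dated index actually appeared (and to see its document count and size) is the _cat/indices API; this assumes the cluster is reachable on localhost:9200 as in the output config:

```shell
# List all MusicBrainz indices with a header row; doc count and
# store size show whether the JDBC import actually loaded anything.
curl 'localhost:9200/_cat/indices/musicbrainz-*?v'
```

No test is included here since the call needs a running Elasticsearch cluster with the import already done.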