1. Installation of the Elasticsearch database
PS: You need to install the Java SE environment first!
Download elasticsearch-6.5.2, go to the elasticsearch-6.5.2/bin directory, and double-click the startup script to run it. Open your browser and visit http://localhost:9200; the installation is successful if the following is displayed:
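The original shows the response as a screenshot. For reference, a successful request to http://localhost:9200 returns a small JSON document roughly like the one below (the node name and cluster UUID will differ on your machine, and some build fields are omitted here):

{
  "name" : "...",
  "cluster_name" : "elasticsearch",
  "cluster_uuid" : "...",
  "version" : {
    "number" : "6.5.2"
  },
  "tagline" : "You Know, for Search"
}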
Install the head plugin for easier viewing and management (Kibana can also be used).
First install Node.js (download at /en/).
Then download elasticsearch-head-master.
Extract the package under /elasticsearch-6.5.2/ (link: /s/1q3kokFhpuJ2Q3otPgu7ldg, extraction code: 1rpp).
Modify the configuration file under elasticsearch-6.5.2\config\ as follows:
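The original shows the settings as a screenshot. Assuming the file being edited is config\elasticsearch.yml, the two settings usually required so that the head plugin (served from a different port) is allowed to talk to Elasticsearch are:

http.cors.enabled: true
http.cors.allow-origin: "*"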
Go to the elasticsearch-head-master directory and run npm install -g grunt-cli, then npm install to install the dependencies.
In the elasticsearch-head-master directory, find the file that configures the server listening address and modify it as follows:
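The original shows this change as a screenshot. In the stock elasticsearch-head-master project the listening address lives in the connect.server.options block of Gruntfile.js; a common modification is to add a hostname entry so the page can be reached from any address (the surrounding values below are taken from a typical copy of the file and should be treated as an assumption):

connect: {
    server: {
        options: {
            hostname: '*',
            port: 9100,
            base: '.',
            keepalive: true
        }
    }
}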
Execute the grunt server command to start the head service.
Visit http://localhost:9100/ to open the head administration page.
2. Writing the JSON file into the ES database (Python script as follows)
# -*- coding: UTF-8 -*-
# Python 2 script: bulk-load a file of JSON documents (one per line) into Elasticsearch.
from itertools import islice
import sys
from elasticsearch import Elasticsearch, helpers

_index = 'indextest'                      # modify to your index name
_type = 'string'                          # modify to your type name
es_url = 'http://192.168.116.1:9200/'     # modify to your Elasticsearch server

reload(sys)
sys.setdefaultencoding('utf-8')

es = Elasticsearch(es_url)
es.indices.create(index=_index, ignore=400)   # ignore 400 if the index already exists

chunk_len = 10   # number of lines sent per bulk request
num = 0          # running count of lines processed

def bulk_es(chunk_data):
    # Build one bulk action per line and send the batch in a single request.
    bulks = []
    for line in chunk_data:
        bulks.append({
            "_index": _index,
            "_type": _type,
            "_source": line
        })
    try:
        helpers.bulk(es, bulks)
    except Exception:
        pass   # silently skip chunks that fail to index

with open(sys.argv[1]) as f:
    while True:
        lines = list(islice(f, chunk_len))   # read the next chunk of lines
        if not lines:
            print "\ntask has finished"
            break
        num += len(lines)
        sys.stdout.write('\rnum:%d' % num)   # one-line progress counter
        sys.stdout.flush()
        bulk_es(lines)
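The script expects the path of the JSON file (one document per line) as its first command-line argument; assuming it is saved under the hypothetical name write_json_to_es.py, it would be invoked like this:

python write_json_to_es.py data.json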
Summary
The above is a brief introduction to the method of writing JSON files into the ES database with Python. I hope it helps you. If you have any questions, please leave me a message and I will reply in a timely manner. Thank you very much for your support of my website!
If you find this article helpful, please feel free to reprint it, but please note the source. Thank you!