This project was generated with Angular CLI version 8.0.3.

Run `ng serve` for a dev server. Navigate to `http://localhost:4200/`. The app will automatically reload if you change any of the source files.

Run `ng generate component component-name` to generate a new component. You can also use `ng generate directive|pipe|service|class|guard|interface|enum|module`.

Run `ng build` to build the project. The build artifacts will be stored in the `dist/` directory. Use the `--prod` flag for a production build.

Run `ng test` to execute the unit tests via Karma.

Run `ng e2e` to execute the end-to-end tests via Protractor.

To get more help on the Angular CLI use `ng help` or go check out the Angular CLI README.
Download Docker for Windows: https://download.docker.com/win/stable/Docker%20for%20Windows%20Installer.exe
Create a `docker-compose.yml` file and edit it. Change your directory to where your `docker-compose.yml` file is saved, then run the command `docker-compose up`. You will see the service come up when you open Elasticsearch at `http://localhost:9200/`.
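The compose file itself is not included in this document. A minimal sketch of what it might contain, based on the two-node Elasticsearch 6.8.2 setup referenced by the Kibana `docker run` command later in this README (image version, service names, memory settings, and network name are all assumptions):

```yaml
# docker-compose.yml — illustrative sketch, not the project's actual file.
# Two Elasticsearch 6.8.2 nodes on a shared network. Compose prefixes the
# network name with the project directory (e.g. elastic682docker_esnet).
version: '2.2'
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:6.8.2
    environment:
      - cluster.name=docker-cluster
      - "ES_JAVA_OPTS=-Xms512m -Xmx512m"
    ports:
      - 9200:9200
    networks:
      - esnet
  elasticsearch2:
    image: docker.elastic.co/elasticsearch/elasticsearch:6.8.2
    environment:
      - cluster.name=docker-cluster
      # Second node discovers the first by its Compose service name.
      - "discovery.zen.ping.unicast.hosts=elasticsearch"
      - "ES_JAVA_OPTS=-Xms512m -Xmx512m"
    networks:
      - esnet
networks:
  esnet:
```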
Kibana is used to search, view, and interact with data stored in Elasticsearch indices.
Run `docker pull docker.elastic.co/kibana/kibana:6.8.2`
Run `docker container ls` to list the running Elasticsearch containers.
Run `docker network ls` to find the network they are attached to.
Run `docker run --link 411396fbb6a6:elasticsearch --link 821d77d9bfc8:elasticsearch2 --net elastic682docker_esnet -p 5601:5601 docker.elastic.co/kibana/kibana:6.8.2`
(Here `--net elastic682docker_esnet` is the name of the network on which Elasticsearch is running.)
(Here `--link 411396fbb6a6:elasticsearch` is container-id:alias.)
You will see the service come up when you open Kibana at `http://localhost:5601/`.
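Instead of linking containers by hand, Kibana could also be declared as a service in the same compose file. A hedged sketch (the service and network names are assumed to match the Elasticsearch compose setup; `ELASTICSEARCH_URL` is the 6.x-era Kibana setting):

```yaml
# Illustrative addition under `services:` in docker-compose.yml.
  kibana:
    image: docker.elastic.co/kibana/kibana:6.8.2
    environment:
      # Point Kibana at the elasticsearch service by its Compose DNS name.
      - ELASTICSEARCH_URL=http://elasticsearch:9200
    ports:
      - 5601:5601
    networks:
      - esnet
```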
FSCrawler helps to index binary documents such as PDF, OpenOffice, and MS Office files. (See the linked YouTube video for a better understanding.)
## Other Useful Links:
## 2. Set up environment variables: `JAVA_HOME=C:\Progra~1\Java\jdk1.8.0_221` and `JRE_HOME=C:\Progra~1\Java\jre1.8.0_221`
## 3. Run: `bin\fscrawler --config_dir .\BWTLFK plantdrawings`
(i) `BWTLFK` is a folder name and `plantdrawings` is a sub-folder which contains the `setting.json` file.
(ii) We can create more subfolders in a folder to set up different drives like Z:, Y:, etc.
(iii) For example: `bin\fscrawler --config_dir .\BWTLFK scanneddocs`
In the command terminal it will ask whether to create a new job; type `Y` for yes.
## 4. Edit the `setup.json` file (check the sample `setup.json` file)
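The sample file is not reproduced in this document. A minimal sketch of an FSCrawler job settings file (field names follow FSCrawler 2.x, where the file is usually named `_settings.json`; the source path, update rate, and Elasticsearch address here are assumptions, not the project's actual values):

```json
{
  "name": "plantdrawings",
  "fs": {
    "url": "Z:\\plantdrawings",
    "update_rate": "15m",
    "excludes": [ "*/~*" ]
  },
  "elasticsearch": {
    "nodes": [
      { "host": "127.0.0.1", "port": 9200, "scheme": "HTTP" }
    ],
    "index": "plantdrawings"
  }
}
```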
## 5. Re-run: `bin\fscrawler --config_dir .\BWTLFK plantdrawings`
## 6. Restart: if your job was interrupted or stopped, you don't have to worry about reindexing everything.
Run: `bin\fscrawler --config_dir .\BWTLFK job_name`
(Re-running the job without extra flags continues from where it last stopped, indexing only the remaining documents; FSCrawler's `--restart` flag instead clears the job status and reindexes from scratch.)
# Tesseract (OCR)
## To read OCR text from PDFs, download Tesseract.
## Add its installation directory to the PATH environment variable so it is detected globally.