The goal of this project is to provide the full stack of a website where a (hopefully novel) method of technical analysis can be performed on a security's order book, as well as to accumulate historical financial data and provide traditional technical analysis tools. The project currently uses cryptocurrencies as the securities because it requires a level 2/3 financial data stream, and those tend to be prohibitively expensive for "traditional" financial markets (NYSE, NASDAQ, etc.).
The main idea of the project is to provide visualization and technical analysis tools for the order book of a security, beyond just the depth of market. Traditional technical analysis (think TradingView) is done on previous trades (candlestick charts); the motivation behind this project is that additional information, such as "sentiment", can be seen in a security's order book and how it changes over time. The price levels at which orders are placed, and whether those orders fill or get cancelled, can provide insight into the "sentiment" of traders. One of the many methodologies used in high-frequency algorithmic trading uses a security's depth of market, and how it is changing, to make decisions.
The idea arose after watching a very interesting video about the 2010 flash crash and how it was perpetrated. The video skims over and gives you an idea of some of the methods employed in high-frequency trading algorithms and how the trader exploited those methods (using spoofing). Although there is no attempt at spoofing here, as it is illegal, it does raise the question of whether conclusions can be drawn from the order book and its state relative to its state in the past at a different price point.
This repository integrates the submodule repositories which together create the full stack. It is mainly responsible for creating/restoring the databases used in the deployment, as well as creating the correct environment for the different elements/submodules to work together correctly.
NOTE: This integration in its current state has not yet been fully tested; as this project as a whole is still under heavy active development and different components change, it may break. I do my best to keep it up to date though.
- Set the necessary docker secrets in `/setup/secrets`; most are self-explanatory:
  - mongo_root_user.txt
  - mongo_root_pass.txt
  - mongo_worker.txt
  - mongo_worker_password.txt
  - mysql_root_pass.txt
  - mysql-user.txt
  - mysql-user-password.txt

If you don't know how secrets work: look under the `secrets` section of docker-compose; each file's content becomes a variable you use elsewhere. E.g., place the mongoDB admin password in the file `./setup/secrets/mongo_root_pass.txt` and that is what will be used after `docker-compose up -d`.
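To illustrate how file-based secrets are wired up, a compose file typically looks roughly like the fragment below. This is a hypothetical sketch, not the project's actual `docker-compose.yml`; the service name and image are assumptions.

```yaml
# Hypothetical fragment -- the project's real docker-compose.yml may differ.
services:
  mongodb:
    image: mongo
    environment:
      # The official mongo image reads credentials from *_FILE variables,
      # which point at the secret files mounted under /run/secrets/.
      MONGO_INITDB_ROOT_USERNAME_FILE: /run/secrets/mongo_root_user
      MONGO_INITDB_ROOT_PASSWORD_FILE: /run/secrets/mongo_root_pass
    secrets:
      - mongo_root_user
      - mongo_root_pass

secrets:
  # Each secret's value is the content of the referenced file.
  mongo_root_user:
    file: ./setup/secrets/mongo_root_user.txt
  mongo_root_pass:
    file: ./setup/secrets/mongo_root_pass.txt
```

Each entry under the top-level `secrets` key is mounted into the container at `/run/secrets/<name>`, which is why putting the password in the `.txt` file is all that is needed before `docker-compose up -d`.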
This repo/integration is set up in such a way that running it for the first time will automatically create the necessary environment and databases (with some test data) to run all submodules which constitute the actual application. If instead you want to restore databases that already contain development/production information, the setup scripts will take care of that as well; you just need to make them available.
- Delete everything in the directory being mounted to hold the database (or the volume, if you change `docker-compose.yml` to use a volume instead).
- Mount the database dump(s) into the `/restore` directory of either the mongoDB or mysql image, and the setup scripts will restore that database automatically.
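As a concrete example, making a local dump directory available inside the container could look like the following bind mount. The host-side path `./dumps` is an assumption for illustration; only the container-side `/restore` path comes from this README.

```yaml
# Hypothetical fragment: bind-mount a host directory of dumps into /restore.
services:
  mongodb:
    volumes:
      - ./dumps:/restore
```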
- restore: mount the database dump (created using `mongodump`) you want restored into `/restore` (already included in `docker-compose.yml`).
- If you have a list of database users and permissions that you want restored/applied to databases, mount the file to `/restore/restore-users`. This file requires the following format to be applied correctly:
  `username:password:database:db permissions (readWrite, read, write, etc.)`
  ex: `admin:password:orderbook:readWrite`
  NOTE: do not include empty line(s) at the end; they will cause an issue during setup.
- This is NOT currently required. It is included as a feature but intended for future changes to replace `mongo-user`. The intention is to switch from one database user that does everything to multiple users with different roles (e.g. one user for the logger submodule given only write permission to log data, and another user for the REST API with only read permissions). These permissions/roles are stored in the admin database, and user authentication is therefore done against the admin database.
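Since a malformed `restore-users` file (or a trailing blank line) breaks setup, it can be worth sanity-checking the file before mounting it. Below is a small hypothetical helper for doing that locally; it is not part of the setup scripts, and it assumes the simple colon-separated format, so passwords containing `:` are not supported.

```python
def parse_restore_users(text):
    """Validate a restore-users file of 'username:password:database:permissions'
    lines, returning the parsed entries or raising ValueError.

    Hypothetical local sanity check, not part of the setup scripts.
    """
    entries = []
    for i, line in enumerate(text.splitlines(), start=1):
        if not line.strip():
            # Blank lines (including trailing ones) cause issues during setup.
            raise ValueError(f"line {i} is empty")
        parts = line.split(":")
        if len(parts) != 4 or not all(parts):
            raise ValueError(f"line {i} is malformed: {line!r}")
        user, password, database, permissions = parts
        entries.append({"user": user, "password": password,
                        "database": database, "permissions": permissions})
    return entries
```

Running it over the example line from above returns one entry; feeding it a file with a trailing blank line raises `ValueError` before docker ever sees it.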
- backup: Use `mongodump`.
  ex: (will back up the orderbook&trades database to a file named `orderbooktrades.dump`)

  ```
  docker exec mongoDB-financialData sh -c 'mongodump --authenticationDatabase admin -u mongo-root-username -p mongo-root-password --db orderbook&trades --archive' > orderbooktrades.dump
  ```
- restore: Supply a mysql dump created using `mysqldump` to `/restore` on a new instance.
- Like the mongodb restore, if you have a list of users, databases, and permissions for certain databases, add them to `/restore/restore-users` and they will be applied.
- Different username/database/permission combos must be placed on separate lines with the following format:
  `username:password:database:db permissions (select, update, etc.)`
  ex: an admin user which uses the password `password` and is given ALL PRIVILEGES on the database `users`:
  `admin:password:users:ALL PRIVILEGES`
- Currently there is no way to limit users to specific hosts using this method; they all use the wildcard (`%`) as their host. This is intended to change in the future.
- You also can't limit a user's permissions to individual tables; permissions are applied to the entire database specified.
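To make those two limitations concrete, each line of `restore-users` is conceptually translated into SQL along the lines of the sketch below. This is a hypothetical illustration, not the actual setup script; note the hard-coded wildcard host and database-wide grant.

```python
def grants_for(line):
    """Sketch the MySQL statements one restore-users line conceptually maps to.

    Hypothetical illustration only; the real setup script may differ.
    The host is always the wildcard '%', and the grant always covers the
    whole database -- the two current limitations of this mechanism.
    """
    user, password, database, privileges = line.split(":")
    return [
        f"CREATE USER IF NOT EXISTS '{user}'@'%' IDENTIFIED BY '{password}';",
        f"GRANT {privileges} ON `{database}`.* TO '{user}'@'%';",
    ]
```

For the `admin:password:users:ALL PRIVILEGES` example above, this yields a `CREATE USER` for `'admin'@'%'` and a `GRANT ALL PRIVILEGES ON \`users\`.*`.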
- backup: Use `mysqldump`.
  ex. backs up the database `orderbooktechanal` to a file named `orderbooktechanal.dump` (use `--all-databases` in place of `--databases orderbooktechanal` to back up everything):

  ```
  docker exec orderbookTechAnalysisUsers mysqldump -uroot -psecret --databases orderbooktechanal > orderbooktechanal.dump
  ```
A service that runs continuously to monitor a level 2/3 financial data stream, consolidating and logging the information into a database.
The REST API backend that supplies the frontend website with (financial) data for display, as well as user data for login, etc.
The frontend website, responsible for displaying the data, the technical analysis tools, etc.