- Open source by default.
- No paternalism: we are not a paternalistic medium, and we do not judge or rank the relative worth of issues.
- Non-government public information may only be included if it meets one of the following requirements:
  - all citizens can collaboratively edit the data, or
  - the data is published by the candidates, councilors, or parties themselves.
- crawler: crawlers for each county/city council (doc)
- data: raw JSON for each county/city produced by the crawlers above
  - hashlist_meeting_minutes-v141001.json: links map; stores details of the binaries fetched by the meeting_minutes crawler
  - candidates_2014.xlsx: councilor candidates announced by the Central Election Commission (CEC)
  - cand-moi-direct-control-2018.json: special municipality councilors
  - cand-moi-county-control-2018.json: county and city councilors
  - T1.csv: data released by the CEC for the 2014 election
  - 議員選後消失去哪了 (Where did the councilors go after the election?)
- parser: normalizes the JSON under data above and loads it into the database (doc)
- voter_guide: Web application using Django

Environment Setup
0.1 install basic tools
sudo apt-get update
sudo apt-get upgrade
sudo reboot
sudo apt-get install git python-pip python-dev python-setuptools postgresql libpq-dev
sudo easy_install virtualenv
0.2 set a password for your database (if you already have one, just skip this step)
sudo -u <username> psql -c "ALTER USER <username> with encrypted PASSWORD 'put_your_password_here';"
e.g.
sudo -u postgres psql -c "ALTER USER postgres with encrypted PASSWORD 'my_password';"
The repository is quite large now, so please be patient; don't use options like git --depth.
git clone https://github.com/g0v/councilor-voter-guide.git
cd councilor-voter-guide/voter_guide/
(If you don't mind packages being installed into your local environment, just run pip install -r requirements.txt.)
virtualenv --no-site-packages venv
. venv/bin/activate
pip install -r requirements.txt
We use Postgres 9.5. Please set your database config in voter_guide/local_settings.py.
Please create a database (e.g. voter_guide); the examples below use voter_guide.
createdb -h localhost -U <username> voter_guide
pg_restore --verbose --clean --no-acl --no-owner -h localhost -U <username> -d voter_guide local_db.dump
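As a minimal sketch, a voter_guide/local_settings.py matching the steps above could use a standard Django DATABASES setting like the following; the database name, user, and password are placeholders — use whatever you created in step 0.2:

    # voter_guide/local_settings.py -- minimal example, adjust to your own environment
    DATABASES = {
        'default': {
            'ENGINE': 'django.db.backends.postgresql_psycopg2',
            'NAME': 'voter_guide',           # the database created with createdb above
            'USER': 'postgres',              # assumption: the user from step 0.2
            'PASSWORD': 'my_password',       # assumption: the password set in step 0.2
            'HOST': 'localhost',
            'PORT': '5432',
        }
    }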
python manage.py runserver
Now you should be able to see the web page at http://localhost:8000
To dump the current database contents to a JSON fixture (excluding permissions and content types), run:
python manage.py dumpdata --exclude auth.permission --exclude contenttypes > db.json
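The resulting db.json can later be loaded into a fresh database with Django's loaddata management command.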
Some Python packages, such as lxml, are written in C or C++, so a compiler is required. On macOS you can install one via the following command:
xcode-select --install
You can install the packaged Postgres.app here. Put the app in your Applications folder and click it to start.
And please add the following line to your ~/.bash_profile
export PATH=/Applications/Postgres.app/Contents/Versions/9.3/bin/:$PATH
Please change the version number 9.3 if you downloaded a different version of PostgreSQL.
After you add the PATH environment variable, source it:
source ~/.bash_profile
If you don't add the PATH variable, the installation of psycopg2 will not succeed.
Web Docker c3h3 / g0v-cvg-web
git clone https://github.com/c3h3/g0v-cvg-pgdata.git && cd g0v-cvg-pgdata && tar xfzv 47821274c242ce68f2d8d18d4bb0d050d6481311.tar.gz
- After that, you will get the pgdata dir.
- Assume pgdata's absolute path is "your_pgdata"
docker run --name pgdb -v your_pgdata:/var/lib/postgresql/data postgres:9.3
If you want to connect to your db with pgAdmin, you can also forward the port out with the following command:
docker run --name pgdb -p 5432:5432 -v your_pgdata:/var/lib/postgresql/data postgres:9.3
- "your_pgdata" is pgdata's absolute path in previous step.
docker run --name g0v-cvg-web --link pgdb:postgres -p port_on_host:8000 -d c3h3/g0v-cvg-web
- "port_on_host" is the port forwarding out to your host, which you could find your web on http://localhost:port_on_host
Crawler Docker c3h3 / g0v-cvg-crawler
docker run --name g0v -p forward_port:6800 -v outside_items:/items -v outside_logs:/logs -d c3h3/g0v-cvg-crawler
- "forward_port" is the port you want to forward into docker image (EXPOSE 6800)
- "outside_items" is the directory you want to mount into docker image as /items
- "outside_logs" is the directory you want to mount into docker image as /logs
docker run --link g0v:g0v -it c3h3/g0v-cvg-crawler /bin/bash
In a running Docker instance linked with g0v (the Scrapy server), you can use the following command to deploy the tcc crawler to the server:
cd /tmp/g0v-cvg/crawler/tcc && python deploy.py
In a running Docker instance linked with g0v (the Scrapy server), you can use the following command to start crawling tcc bills:
cd /tmp/g0v-cvg/crawler/bin && python crawl_tcc_bills.py
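Assuming the crawler container runs Scrapyd (the standard Scrapy deployment daemon, which listens on port 6800), you can also check deployed projects and schedule a spider over its HTTP API from the host; the project and spider names below are assumptions, replace them with the real ones:

    # sketch: talk to the Scrapyd HTTP API on the forwarded port (names are assumptions)
    import requests

    base = 'http://localhost:6800'   # use the forward_port you chose above
    print(requests.get(base + '/listprojects.json').json())

    # schedule a spider run; replace 'tcc' / 'bills' with an actual project/spider name
    resp = requests.post(base + '/schedule.json',
                         data={'project': 'tcc', 'spider': 'bills'})
    print(resp.json())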
CC0 1.0 Universal
This work is published from Taiwan.