
2.2.3 (#98)
* feat: add option to override file owner setting (#94)

* Merge changes

* Change beta updater

* Dev branch changes

* Add check for SQLite3

* SQLite message formatting

* sqlite

* repo checks

* sudoers file

* add sudo file

* sudoer auto

* overwrite not append

* sudo docs

* sudo instructions

* SUDO instructions

Co-authored-by: Dylan Praul <[email protected]>
Co-authored-by: Michael Stanclift <[email protected]>
3 people authored Oct 2, 2020
1 parent 3a2eed9 commit d23ae53
Showing 7 changed files with 133 additions and 39 deletions.
14 changes: 4 additions & 10 deletions ADVANCED.md
@@ -18,8 +18,8 @@ Download the latest release from [GitHub](https://github.com/vmstan/gravity-sync

```bash
cd ~
wget https://github.com/vmstan/gravity-sync/archive/v2.2.2.zip
unzip v2.2.1.zip -d gravity-sync
wget https://github.com/vmstan/gravity-sync/archive/v2.2.3.zip
unzip v2.2.3.zip -d gravity-sync
cd gravity-sync
```

@@ -220,15 +220,9 @@ At the very least, I would recommend backing up your existing `gravity-sync` fol

### Development Builds

Starting in v1.7.2, you can easily flag if you want to receive the development branch of Gravity Sync when running the built-in `./gravity-sync.sh update` function. Beginning in v1.7.4, `./gravity-sync.sh dev` will toggle the dev flag on/off. No `touch` required, although it still works that way under the covers.
Starting in v1.7.2, you can easily flag if you want to receive the development branch of Gravity Sync when running the built-in `./gravity-sync.sh update` function. Beginning in v1.7.4, `./gravity-sync.sh dev` will toggle the dev flag on/off. Starting in v2.2.3, it will prompt you to select the development branch you want to use.

To manually adjust the flag, create an empty file in the `gravity-sync` folder called `dev` and afterwards the standard `./gravity-sync.sh update` function will apply the correct updates.

```bash
cd gravity-sync
touch dev
./gravity-sync.sh update
```
To manually adjust the flag, create a file in the `gravity-sync` folder called `dev` containing the single line `BRANCH='origin/x.x.x'` (where x.x.x is the development version you want to use). Afterwards, the standard `./gravity-sync.sh update` function will apply the correct updates.
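For example, a minimal sketch of the new workflow (the branch name `origin/development` is only an illustration; use whichever branch `git branch -r` lists):

```bash
cd ~/gravity-sync
echo "BRANCH='origin/development'" > dev
./gravity-sync.sh update
```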

Delete the `dev` file and update again to revert to the stable/master branch.
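For example, assuming the default install folder:

```bash
cd ~/gravity-sync
rm dev
./gravity-sync.sh update
```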

17 changes: 13 additions & 4 deletions CHANGELOG.md
@@ -13,16 +13,25 @@ This release also adds the `./gravity-sync.sh purge` function that will totally
- I found a markdown spellcheck utility for Visual Studio Code, and ran it against all my markdown files. I'm sorry, I don't spell good. 🤷‍♂️
- New Star Trek references.

#### 2.2.2

- Corrects another logical problem that prevented `custom.list` from being backed up and replicated, if it didn't already exist on the local Pi-hole.

#### 2.2.1

- Corrects issue with Smart Sync where it would fail if there was no `custom.list` already present on the local Pi-hole.
- Adds Pi-hole default directories to `gravity-sync.conf.example` file.
- Adds `RIHOLE_BIN` variable to specify different Pi-hole binary location on remote server.

#### 2.2.2

- Corrects another logical problem that prevented `custom.list` from being backed up and replicated, if it didn't already exist on the local Pi-hole.

#### 2.2.3

- Adds variable to easily override database/binary file owners, useful for container deployments. (Thanks @dpraul)
- Adds variable to easily override Pi-hole binary directory for remote host, separate from local host. (Thanks @dpraul)
- Rewritten `dev` option now lets you select the branch to pull code against, allowing for more flexibility in updating against test versions of the code. The `beta` function introduced in 2.1.5 has now been removed.
- Validates the existence of a SQLite installation on the local Pi-hole.
- Adds Gravity Sync permissions for the running user to a local `/etc/sudoers.d` file during the `config` operation.
- Adds a `./gravity-sync.sh sudo` function to create the above file for existing setups, or to configure the remote Pi-hole by placing the installer files on that system. This is not required for existing functional installs, but it should negate the need to give the Gravity Sync user NOPASSWD permissions to the entire system.
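For an existing install, the new task can be run directly from the install folder (a short example of the invocation described above):

```bash
cd ~/gravity-sync
./gravity-sync.sh sudo
```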

## 2.1

### The Backup Release
35 changes: 31 additions & 4 deletions README.md
@@ -37,8 +37,9 @@ Gravity Sync is not developed by or affiliated with the Pi-hole project. This is

- Pi-hole 5.0 (or higher) must already be installed on at least two systems, using any of the Linux distributions that Pi-hole is [certified to run on](https://docs.pi-hole.net/main/prerequesites/#supported-operating-systems).
- While it is possible to leverage container/Docker deployments of Pi-hole and Gravity Sync, this configuration is currently not officially supported. Instructions here assume a "native" installation of Pi-hole.
- You will need to make sure that you have password-less `SUDO` enabled for the user accounts on both the primary and secondary Pi-hole. Most of the pre-built images available for the Raspberry Pi already have this configured, but if you have your Pi-hole running in a virtual machine built from a generic ISO, you may need to [adjust this manually](https://linuxize.com/post/how-to-run-sudo-command-without-password/).
- You will need to make sure that you have `SUDO` enabled for the user accounts on both the primary and secondary Pi-hole. Most of the pre-built images available for the Raspberry Pi already have this configured. During configuration you will be prompted to enable this for your Gravity Sync user.
- Make sure `SSH` and `RSYNC` are installed on both the primary and secondary Pi-hole prior to installation. These two binaries are what does the heavy lifting between your Pi-hole nodes. In the past, Dropbear was supported but this has proven problematic. If you're using an ultra-lightweight Pi distribution (such as DietPi) that uses Dropbear by default, you will need to convert to OpenSSH as of Gravity Sync version 2.2.
- You will need to make sure that `SQLite3` is installed on both Pi-hole systems in order for the backup and restore functions against the databases to complete successfully. This should be covered by the installation of Pi-hole, or already be installed on most Linux distros.
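A quick way to confirm these prerequisites on each node (an informal check, not part of Gravity Sync itself):

```bash
# Informal prerequisite check -- run on both the primary and secondary Pi-hole
for bin in ssh rsync sqlite3 sudo pihole; do
  command -v "$bin" >/dev/null 2>&1 && echo "found:   $bin" || echo "missing: $bin"
done
```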

### Pi-hole Architecture

@@ -53,7 +54,31 @@ Starting with version 2.0, Gravity Sync will attempt to sync the Adlist database

## Installation

Login to your *secondary* Pi-hole, and run:
### Primary Pi-Hole

Minimal preparation is required (as of version 2.2.3) on your primary Pi-hole.

Log in to your *primary* Pi-hole, and install a copy of the software there:

```bash
git clone https://github.com/vmstan/gravity-sync.git $HOME/gravity-sync
```

From your home directory, you should run `./gravity-sync/gravity-sync.sh sudo`.
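Based on the `task_sudo` function added in this commit, this installs a scoped NOPASSWD rule into `/etc/sudoers.d`. Assuming the running user is `pi` and the Pi-hole directory is the usual `/etc/pihole` (both are assumptions for illustration), the generated file would look roughly like:

```bash
# /etc/sudoers.d/gs-nopasswd -- illustrative; generated from the gs-nopasswd.sudo template
pi ALL=(ALL) NOPASSWD: /etc/pihole
```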

**Once this process has completed, you can remove the entire `gravity-sync` directory from the primary Pi-hole.**

```bash
rm -rf $HOME/gravity-sync
```

After you have completed this step, log out of the *primary* Pi-hole.

### Secondary Pi-Hole

From this point forward, all operations will take place on your secondary Pi-hole.

Log in to your *secondary* Pi-hole, and install a copy of the software there:

```bash
git clone https://github.com/vmstan/gravity-sync.git $HOME/gravity-sync
@@ -65,20 +90,22 @@ Proceed to the Configuration section.

## Configuration

After you install Gravity Sync to your server you will need to create a configuration file.
After you install Gravity Sync to your *secondary Pi-hole*, you will need to create a configuration file.

```bash
cd $HOME/gravity-sync
./gravity-sync.sh config
```

This will guide you through the process of:

- Specifying the IP or DNS name of your primary Pi-hole.
- Specifying the SSH username to connect to your primary Pi-hole.
- Selecting the SSH authentication mechanism (key-pair or password.)
- Configuring your key-pair and applying it to your primary Pi-hole.
- Testing your authentication method, and testing RSYNC to the primary.
- Performing a backup of the existing Pi-hole database.
- Adding your Gravity Sync user to the local SUDO configuration to run passwordless.

The configuration will be saved as `gravity-sync.conf` in the same folder as the script. If you need to make adjustments to your settings in the future, you can edit this file or run the configuration tool to generate a new one.
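For reference, a hand-edited `gravity-sync.conf` might look like the sketch below. The `REMOTE_HOST`/`REMOTE_USER` names and all values shown are illustrative assumptions (the `config` task fills in the real ones); the commented overrides mirror `gravity-sync.conf.example`:

```bash
# gravity-sync.conf -- illustrative sketch only; values are placeholders
REMOTE_HOST='192.168.1.10'          # primary Pi-hole IP or DNS name (assumed variable name)
REMOTE_USER='pi'                    # SSH user on the primary (assumed variable name)
# Optional overrides, as listed in gravity-sync.conf.example:
# PIHOLE_BIN='/usr/local/bin/pihole'
# RIHOLE_BIN='/usr/local/bin/pihole'
# FILE_OWNER='pihole:pihole'
# REMOTE_FILE_OWNER='pihole:pihole'
```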

2 changes: 1 addition & 1 deletion VERSION
@@ -1 +1 @@
2.2.2
2.2.3
4 changes: 3 additions & 1 deletion gravity-sync.conf.example
@@ -45,4 +45,6 @@ REMOTE_PASS=''
# GRAVITY_FI=''
# CUSTOM_DNS=''
# PIHOLE_BIN=''
# RIHOLE_BIN=''
# RIHOLE_BIN=''
# FILE_OWNER=''
# REMOTE_FILE_OWNER=''
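For instance, a containerized deployment whose database files are owned by a non-default UID/GID could override both values (a hypothetical example; `999:999` is arbitrary):

```bash
# gravity-sync.conf -- hypothetical container deployment
FILE_OWNER='999:999'
REMOTE_FILE_OWNER='999:999'
```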
99 changes: 80 additions & 19 deletions gravity-sync.sh
@@ -3,7 +3,7 @@ SCRIPT_START=$SECONDS

# GRAVITY SYNC BY VMSTAN #####################
PROGRAM='Gravity Sync'
VERSION='2.2.2'
VERSION='2.2.3'

# Execute from the home folder of the user who owns it (ex: 'cd ~/gravity-sync')
# For documentation or downloading updates visit https://github.com/vmstan/gravity-sync
@@ -42,6 +42,8 @@ GRAVITY_FI='gravity.db' # default Pi-hole database file
CUSTOM_DNS='custom.list' # default Pi-hole local DNS lookups
PIHOLE_BIN='/usr/local/bin/pihole' # default Pi-hole binary directory (local)
RIHOLE_BIN='/usr/local/bin/pihole' # default Pi-hole binary directory (remote)
FILE_OWNER='pihole:pihole' # default Pi-hole file owner and group (local)
REMOTE_FILE_OWNER='pihole:pihole' # default Pi-hole file owner and group (remote)

# OS Settings
BASH_PATH='/bin/bash' # default OS bash path
@@ -114,17 +116,14 @@ function show_target {
function update_gs {
if [ -f "$HOME/${LOCAL_FOLDR}/dev" ]
then
BRANCH='development'
elif [ -f "$HOME/${LOCAL_FOLDR}/beta" ]
then
BRANCH='beta'
source $HOME/${LOCAL_FOLDR}/dev
else
BRANCH='master'
BRANCH='origin/master'
fi

if [ "$BRANCH" = "development" ]
if [ "$BRANCH" != "origin/master" ]
then
MESSAGE="Pulling from origin/${BRANCH}"
MESSAGE="Pulling from ${BRANCH}"
echo_info
fi

@@ -141,7 +140,7 @@ function update_gs {
error_validate
MESSAGE="Applying Update"
echo_stat
git reset --hard origin/${BRANCH} >/dev/null 2>&1
git reset --hard ${BRANCH} >/dev/null 2>&1
error_validate
fi
}
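Taken together, the hunks above reduce the branch selection in `update_gs` to roughly the following (a simplified sketch, not the complete function):

```bash
# Simplified sketch of the new update_gs branch selection
if [ -f "$HOME/${LOCAL_FOLDR}/dev" ]; then
    source "$HOME/${LOCAL_FOLDR}/dev"    # dev file supplies e.g. BRANCH='origin/development'
else
    BRANCH='origin/master'
fi
git reset --hard "${BRANCH}"             # applied during './gravity-sync.sh update'
```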
@@ -168,8 +167,8 @@ function pull_gs_grav {
MESSAGE="Validating Settings of ${GRAVITY_FI}"
echo_stat

GRAVDB_OWN=$(ls -ld ${PIHOLE_DIR}/${GRAVITY_FI} | awk '{print $3 $4}')
if [ "$GRAVDB_OWN" != "piholepihole" ]
GRAVDB_OWN=$(ls -ld ${PIHOLE_DIR}/${GRAVITY_FI} | awk 'OFS=":" {print $3,$4}')
if [ "$GRAVDB_OWN" != "$FILE_OWNER" ]
then
MESSAGE="Validating Ownership on ${GRAVITY_FI}"
echo_fail
@@ -179,7 +178,7 @@

MESSAGE="Setting Ownership on ${GRAVITY_FI}"
echo_stat
sudo chown pihole:pihole ${PIHOLE_DIR}/${GRAVITY_FI} >/dev/null 2>&1
sudo chown ${FILE_OWNER} ${PIHOLE_DIR}/${GRAVITY_FI} >/dev/null 2>&1
error_validate

MESSAGE="Continuing Validation of ${GRAVITY_FI}"
@@ -332,7 +331,7 @@ function push_gs_grav {
MESSAGE="Setting Ownership on ${GRAVITY_FI}"
echo_stat
CMD_TIMEOUT='15'
CMD_REQUESTED="sudo chown pihole:pihole ${PIHOLE_DIR}/${GRAVITY_FI}"
CMD_REQUESTED="sudo chown ${RFILE_OWNER} ${PIHOLE_DIR}/${GRAVITY_FI}"
create_sshcmd
}

@@ -626,8 +625,8 @@ function restore_gs {
MESSAGE="Validating Ownership on ${GRAVITY_FI}"
echo_stat

GRAVDB_OWN=$(ls -ld ${PIHOLE_DIR}/${GRAVITY_FI} | awk '{print $3 $4}')
if [ "$GRAVDB_OWN" == "piholepihole" ]
GRAVDB_OWN=$(ls -ld ${PIHOLE_DIR}/${GRAVITY_FI} | awk 'OFS=":" {print $3,$4}')
if [ "$GRAVDB_OWN" == "$FILE_OWNER" ]
then
echo_good
else
@@ -638,7 +637,7 @@

MESSAGE="Setting Ownership on ${GRAVITY_FI}"
echo_stat
sudo chown pihole:pihole ${PIHOLE_DIR}/${GRAVITY_FI} >/dev/null 2>&1
sudo chown ${FILE_OWNER} ${PIHOLE_DIR}/${GRAVITY_FI} >/dev/null 2>&1
error_validate
fi

@@ -872,6 +871,26 @@ function validate_ph_folders {
echo_good
}

## Validate SQLite3
function validate_sqlite3 {
MESSAGE="Validating SQLITE Installed on $HOSTNAME"
echo_stat
if hash sqlite3 2>/dev/null
then
MESSAGE="SQLITE3 Utility Detected"
echo_good
else
MESSAGE="SQLITE3 Utility Missing"
echo_warn

MESSAGE="Installing SQLLITE3 with ${PKG_MANAGER}"
echo_stat

${PKG_INSTALL} sqllite3 >/dev/null 2>&1
error_validate
fi
}
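For anyone checking a node by hand, a rough manual equivalent of this validation (the `apt-get` line assumes a Debian/Ubuntu-based host; the script itself uses whatever `${PKG_INSTALL}` is configured to):

```bash
# Manual equivalent of validate_sqlite3 (illustrative)
if hash sqlite3 2>/dev/null; then
    sqlite3 --version
else
    sudo apt-get install -y sqlite3     # assumption: Debian/Ubuntu package manager
fi
```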

## Validate SSHPASS
function validate_os_sshpass {
SSHPASSWORD=''
@@ -1376,6 +1395,8 @@ function intent_validate {
# Configuration Management
## Generate New Configuration
function config_generate {
task_sudo

detect_ssh

MESSAGE="Creating ${CONFIG_FILE} from Template"
@@ -1492,6 +1513,7 @@ function config_generate {
echo_info

validate_os_sshpass
validate_sqlite3

detect_remotersync
}
@@ -1768,6 +1790,14 @@ function task_devmode {
echo_stat
touch $HOME/${LOCAL_FOLDR}/dev
error_validate

git branch -r

MESSAGE="Select Branch to Update Against"
echo_need
read INPUT_BRANCH

echo -e "BRANCH='${INPUT_BRANCH}'" >> $HOME/${LOCAL_FOLDR}/dev
fi

MESSAGE="Run UPDATE to apply changes"
@@ -1944,6 +1974,26 @@ function task_purge {
update_gs
}

## Sudo Creation Task
function task_sudo {
TASKTYPE='SUDO'
MESSAGE="${MESSAGE}: ${TASKTYPE} Requested"
echo_good

MESSAGE="Creating Sudoer.d Template"
echo_stat

NEW_SUDO_USER=$(whoami)
echo -e "${NEW_SUDO_USER} ALL=(ALL) NOPASSWD: ${PIHOLE_DIR}" > gs-nopasswd.sudo
error_validate

MESSAGE="Installing Sudoer.d File"
echo_stat

sudo install -m 0440 gs-nopasswd.sudo /etc/sudoers.d/gs-nopasswd
error_validate
}
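To inspect and sanity-check the installed policy afterwards (an optional verification step, not part of the script):

```bash
# Verify the sudoers fragment created by task_sudo
sudo cat /etc/sudoers.d/gs-nopasswd
sudo visudo -c -f /etc/sudoers.d/gs-nopasswd    # syntax check of just this file
```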

## Backup Task
function task_backup {
TASKTYPE='BACKUP'
@@ -2086,6 +2136,7 @@ case $# in
show_target
validate_gs_folders
validate_ph_folders
validate_sqlite3
validate_os_sshpass

smart_gs
@@ -2103,6 +2154,7 @@
show_target
validate_gs_folders
validate_ph_folders
validate_sqlite3
validate_os_sshpass

smart_gs
@@ -2118,6 +2170,7 @@
show_target
validate_gs_folders
validate_ph_folders
validate_sqlite3
validate_os_sshpass

smart_gs
@@ -2133,6 +2186,7 @@
show_target
validate_gs_folders
validate_ph_folders
validate_sqlite3
validate_os_sshpass

pull_gs
@@ -2148,6 +2202,7 @@
show_target
validate_gs_folders
validate_ph_folders
validate_sqlite3
validate_os_sshpass

push_gs
@@ -2163,6 +2218,7 @@
show_target
validate_gs_folders
validate_ph_folders
validate_sqlite3

restore_gs
exit
@@ -2184,9 +2240,9 @@
task_devmode
;;

beta)
task_betamode
;;
# beta)
# task_betamode
# ;;

devmode)
task_devmode
@@ -2232,6 +2288,11 @@ case $# in
task_purge
;;

sudo)
task_sudo
exit_withchange
;;

*)
task_invalid
;;