Initial commit
rogerfraser committed Apr 9, 2024
1 parent 3c64621 commit adff8e4
Showing 547 changed files with 1,197,378 additions and 1 deletion.
39 changes: 39 additions & 0 deletions .github/workflows/pancadastre.yaml
@@ -0,0 +1,39 @@
name: Generate visualisations and parse/validate

on:
  workflow_dispatch:

jobs:
  build:
    runs-on: ubuntu-latest

    steps:
      - name: Checkout
        uses: "actions/checkout@v3"
      - name: Setup Python 3.11
        uses: "actions/setup-python@v4"
        with:
          python-version: "3.11"

      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          pip install rdflib
          pip install pyproj
          pip install numpy
          pip install scipy
      - name: Fetch PanCadastre
        run: git clone https://github.com/openwork-nz/pancadastre

      - name: Parse extended_example.json
        run: |
          find build/tests -name '*.jsonld' | while read JSON_FILE; do python pancadastre/pancadastre.py -C "${JSON_FILE}" --interpolate -j "${JSON_FILE}-geojson.json" -k "${JSON_FILE}-parcels.json" -s "${JSON_FILE}-summary.txt" -e "${JSON_FILE}-errors.log"
          done
          rm -r pancadastre
      # Commit new files back to repository
      - name: Add and commit
        uses: EndBug/add-and-commit@v9
        with:
          message: "Generate GeoJSON & summary files."
          default_author: github_actions
63 changes: 63 additions & 0 deletions .github/workflows/process-bblocks.yml
@@ -0,0 +1,63 @@
name: Validate and process Building Block
on:
  workflow_dispatch:
  #push:
  #  branches:
  #    - master
  #    - main

permissions:
  contents: write
  pages: write
  id-token: write

jobs:
  validate-and-process:
    uses: opengeospatial/bblocks-postprocess/.github/workflows/validate-and-process.yml@master
    with:
      skip-pages: true

  build:
    runs-on: ubuntu-latest
    needs: validate-and-process

    steps:
      - name: Checkout
        uses: "actions/checkout@v3"
        with:
          ref: ${{ github.ref_name }}
      - name: Setup Python 3.11
        uses: "actions/setup-python@v4"
        with:
          python-version: "3.11"

      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          pip install rdflib
          pip install pyproj
      - name: Fetch PanCadastre
        run: git clone https://github.com/openwork-nz/pancadastre

      - name: Parse all jsonld in build/tests output
        run: |
          find build/tests -name '*.jsonld' | while read JSON_FILE; do python pancadastre/pancadastre.py -C "${JSON_FILE}" -j "${JSON_FILE}-geojson.json" -s "${JSON_FILE}-summary.txt" -e "${JSON_FILE}-errors.log"
          done
          rm -r pancadastre
      - name: Git pull before push
        run: git pull

      # Commit new files back to repository
      - name: Add and commit
        uses: EndBug/add-and-commit@v9
        with:
          message: "Generate GeoJSON & summary files."
          default_author: github_actions

  pages:
    uses: opengeospatial/bblocks-postprocess/.github/workflows/validate-and-process.yml@master
    needs: build
    with:
      ref: ${{ github.ref_name }}
      skip-build: true
15 changes: 15 additions & 0 deletions .github/workflows/push-to-io.yml
@@ -0,0 +1,15 @@
name: Push static to io

on:
  workflow_dispatch:

permissions:
  id-token: write
  contents: write
  pages: write

jobs:
  deploy:
    uses: opengeospatial/bblocks-postprocess/.github/workflows/validate-and-process.yml@master
    with:
      skip-build: true
33 changes: 33 additions & 0 deletions .github/workflows/uplift-all.yaml
@@ -0,0 +1,33 @@
name: Uplift all vocabulary files

on:
  workflow_dispatch:

jobs:
  uplift-all:
    runs-on: ubuntu-latest
    permissions:
      contents: write

    steps:
      - uses: actions/checkout@v3
      - name: Setup Python
        uses: actions/setup-python@v3
        with:
          python-version: '3.x'
      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          pip install git+https://github.com/opengeospatial/ogc-na-tools.git
      - name: Process files
        run: |
          #find . -name '*.csv' | while read CSV_FILE; do python -m ogc.na.ingest_json \
          #  --skip-on-missing-context --json-ld --context csv2python.yml "${CSV_FILE}" > "${CSV_FILE}.json"
          #  done
          python -m ogc.na.ingest_json --batch --all --skip-on-missing-context \
            --json-ld --ttl --work-dir . --domain-config .ogc/catalog.ttl
      - name: Add and commit
        uses: EndBug/add-and-commit@v9
        with:
          message: "Semantic uplift of all vocabulary source files"
          default_author: github_actions
47 changes: 47 additions & 0 deletions .github/workflows/uplift-on-push.yml
@@ -0,0 +1,47 @@
name: Semantic uplift on push

on:
  workflow_dispatch:
    inputs:
      changed-files:
        description: Changed files for processing
        type: string
#  push:
#    branches:
#      - main

jobs:
  uplift:
    runs-on: ubuntu-latest
    permissions:
      contents: write

    steps:
      - uses: actions/checkout@v3
        with:
          fetch-depth: 0
      - name: Get changed files
        id: changed-files
        uses: tj-actions/changed-files@v35
        with:
          since_last_remote_commit: true
          separator: ','
      - name: Setup Python
        uses: actions/setup-python@v3
        with:
          python-version: '3.x'
      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          pip install git+https://github.com/opengeospatial/ogc-na-tools.git
      - name: Process files
        run: |
          MOD_FILES=${{ steps.changed-files.outputs.all_changed_files }},${{ inputs.changed-files }}
          echo Changed files: ${MOD_FILES}
          python -m ogc.na.ingest_json --batch --skip-on-missing-context \
            --json-ld --ttl --work-dir . --domain-config .ogc/catalog.ttl ${MOD_FILES}
      - name: Add and commit
        uses: EndBug/add-and-commit@v9
        with:
          message: "Semantic uplift on push"
          default_author: github_actions
4 changes: 4 additions & 0 deletions .gitignore
@@ -0,0 +1,4 @@
.idea/
build-local/
profiles/wa-vocab-bindings.csv~
profiles/wa-vocabs-todo.txt
50 changes: 50 additions & 0 deletions .ogc/catalog.ttl
@@ -0,0 +1,50 @@
@prefix dcfg: <http://www.example.org/ogc/domain-cfg#> .
@prefix dcat: <http://www.w3.org/ns/dcat#> .
@prefix dct: <http://purl.org/dc/terms/> .
@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .
@prefix profiles: <http://www.opengis.net/def/metamodel/profiles/> .

_:iso19157-3-sample a dcat:Catalog ;
  dct:title "Profile vocabularies" ;
  dcat:dataset _:vocabs-bindings , _:vocabs-labels , _:vocabs-uplift , _:examples ;
  dcfg:breakme_hasProfileSource "sparql:http://defs-dev.opengis.net:8080/rdf4j-server/repositories/profiles" ,
    ".ogc/profiles.ttl" ;
  dcfg:ignoreProfileArtifactErrors true ;
.

_:vocabs-uplift a dcat:Dataset , dcfg:UpliftConfiguration ;
  dct:description "Profile vocabularies uplift" ;
  dcfg:glob "vocabularies/**/*.csv" ;
  dcfg:hasUpliftDefinition [
    dcfg:order 1 ;
    dcfg:file "vocabs-uplift.yml" ;
  ] ;
.

_:vocabs-bindings a dcat:Dataset , dcfg:UpliftConfiguration ;
  dct:description "Profile vocabularies bindings to QB and SHACL rules" ;
  dcfg:glob "profiles/*-vocab-bindings.csv" ;
  dcfg:hasUpliftDefinition [
    dcfg:order 1 ;
    dcfg:file "vocabs-bindings.yml" ;
  ] ;
.

_:vocabs-labels a dcat:Dataset , dcfg:UpliftConfiguration ;
  dct:description "Profile vocabularies Labels graph" ;
  dcfg:glob "profiles/*-vocab-labels.csv" ;
  dcfg:hasUpliftDefinition [
    dcfg:order 1 ;
    dcfg:file "vocabs-labels.yml" ;
  ] ;
.

_:examples a dcat:Dataset , dcfg:DomainConfiguration ;
  dct:identifier "examples" ;
  dct:description "Entailment and validation for examples" ;
  dcfg:glob "build/tests/**/*.ttl" ;
  dct:conformsTo profiles:skos_shared , profiles:skos_conceptscheme , profiles:skos_conceptscheme_ogc , profiles:vocprez_ogc ;
.
4 changes: 4 additions & 0 deletions .ogc/config.yml
@@ -0,0 +1,4 @@
json-downloads:
  - url: https://docs.google.com/spreadsheets/d/1c2JY2J9oUvLeKe1Y-yWadY1pZYKe_7evHQ1SR41I_c0/export?format=csv
    dest: tests/downloaded.csv
    object-diff: false
29 changes: 29 additions & 0 deletions .ogc/profiles.ttl
@@ -0,0 +1,29 @@
@prefix prof: <http://www.w3.org/ns/dx/prof/> .
@prefix profiles: <http://www.opengis.net/def/metamodel/profiles/> .
@prefix role: <http://www.w3.org/ns/dx/prof/role/> .
@prefix xsd: <http://www.w3.org/2001/XMLSchema#> .
@prefix shacl: <http://www.w3.org/ns/shacl#> .
@prefix dct: <http://purl.org/dc/terms/> .
@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .

profiles:vocprez_ogc a prof:Profile ;
  prof:hasToken "vocprez_ogc"^^xsd:token ;
  dct:description "OGC VocPrez SKOS profile" ;
  rdfs:label "VocPrez OGC" ;
  prof:isProfileOf profiles:skos_conceptscheme_ogc ;
  prof:hasResource [
    a prof:ResourceDescriptor ;
    rdfs:label "SHACL Entailment" ;
    dct:conformsTo shacl: ;
    dct:format "text/turtle" ;
    prof:hasArtifact <http://defs-dev.opengis.net/ogc-na/scripts/skos_vocprez.shapes.ttl> ;
    prof:hasRole role:entailment
  ] , [
    a prof:ResourceDescriptor ;
    rdfs:label "SHACL Validation" ;
    dct:conformsTo shacl: ;
    dct:format "text/turtle" ;
    prof:hasArtifact <http://defs-dev.opengis.net/ogc-na/scripts/vocprez.shapes.ttl> ;
    prof:hasRole role:validation
  ] ;
.
53 changes: 53 additions & 0 deletions PROFILES.md
@@ -0,0 +1,53 @@
# Profile patterns


**Profiles** allow all the underlying details of base standards to be automatically included in testing and validation - this _encapsulates_ the complexity of the base specifications.

This dramatically **simplifies** profile development and usage, and ensures **consistency** and conformance of profiles with their base specifications.

Profiles are typically layered from general to specific, with benefits accruing at each layer, ranging from reuse of software through to reuse of application data.
![](https://lucid.app/publicSegments/view/5bebb494-e12f-46c6-a633-9c4cb3f0ba56/image.png)


In particular, if base specifications use the OGC BuildingBlocks, profiles can _leverage_ all the effort already invested in their design, testing and validation capabilities.

Profiles themselves use the same structures, so they can be profiled in turn.

## What is a profile?

A profile defines a set of constraints on a base specification. Implementations of a profile therefore also conform to the base specification.

Because many technologies such as JSON and RDF are permissive (by default) about additional information being present, defining an *extension* is effectively defining a *constraint* on how that additional information should be represented.
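
For instance - a minimal sketch only, using hypothetical IRIs (`ex:Parcel`, `ex:surveyDate`) that are not defined by this repository - a SHACL shape can express such a constraint on an extension property:

```ttl
@prefix sh:  <http://www.w3.org/ns/shacl#> .
@prefix xsd: <http://www.w3.org/2001/XMLSchema#> .
@prefix ex:  <http://example.org/profile#> .

# Hypothetical constraint: if the extension property ex:surveyDate is present
# on an ex:Parcel, it must be a single literal of type xsd:date.
ex:ParcelExtensionShape a sh:NodeShape ;
  sh:targetClass ex:Parcel ;
  sh:property [
    sh:path ex:surveyDate ;
    sh:datatype xsd:date ;
    sh:maxCount 1 ;
  ] .
```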

## Profiles of profiles...

Profiles can be designed as separate, re-usable sets of constraints - for example, a time-series of water-quality monitoring observations could be specified as a profile of both a time-series profile of Observations and a water-quality profile for the results of such observations.
In turn, the time-series profile could be defined as a data structure using GeoJSON or CoverageJSON, while the water-quality content requirements could be described using constraints independent of the data structure.
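
Expressed with the PROF vocabulary already used in `.ogc/profiles.ttl`, such a composition might look like the following sketch (the profile IRIs here are illustrative only, not defined by this repository):

```ttl
@prefix prof: <http://www.w3.org/ns/dx/prof/> .
@prefix xsd:  <http://www.w3.org/2001/XMLSchema#> .
@prefix ex:   <http://example.org/profiles/> .

# Hypothetical water-quality time-series profile composed from two re-usable profiles.
ex:wq_timeseries a prof:Profile ;
  prof:hasToken "wq_timeseries"^^xsd:token ;
  prof:isProfileOf ex:observation_timeseries , ex:water_quality_results .
```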

## What forms of constraints are possible?

The **OGC BuildingBlock** model supports a range of possible constraint approaches. The goal is to make such constraints **_machine-readable_** to the extent possible.

Constraints SHOULD be defined in a form that allows for **_validation_** of test cases and examples.

Built-in support is provided for automatic validation of the following forms:
- project metadata (description)
- well-formed example encoding (JSON, TTL)
- JSON schema (for JSON examples) for **structure**
- SHACL (Shapes Constraint Language for RDF) for **content** and **logical consistency**

In addition [custom validators](VALIDATORS) can be added to the validation workflow.

Using a JSON-LD context, "semantic uplift" of JSON to RDF supports the use of SHACL and other forms of validators to check **content** and **logical consistency** in addition to structural (JSON schema) validation.

## Testing

Test cases should be provided for each component part of a specification. This requires a minimal conformant **base example** that the specific test case can be added to.

(Note: consideration is being given to making such a baseline example reusable by reference instead of duplication, and potentially deriving it automatically from the declared schema.)

Testing should start by validating that the **base example** passes all declared constraints, then for each profile constraint:
- identifying a set of valid cases that should conform to the constraint, testing each aspect
- creating a copy of the base example under the **/tests/** folder with a name indicating which constraint and case is being tested - e.g. **my-building-block/tests/mything-property-b-number.json**
- adding the specific test content to that copy of the base example
- creating one or more failure tests with **-fail** file name endings - e.g. **my-building-block/tests/mything-property-b-number-fail.json** (a failing case is sketched below)
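
For illustration only - reusing the hypothetical `ex:ParcelExtensionShape` sketched earlier, and assuming a Turtle test file rather than JSON - a failure test could contain:

```ttl
@prefix ex:  <http://example.org/profile#> .
@prefix xsd: <http://www.w3.org/2001/XMLSchema#> .

# Hypothetical content of a *-fail test: two ex:surveyDate values violate
# the sh:maxCount 1 constraint, so SHACL validation is expected to report an error.
ex:parcel1 a ex:Parcel ;
  ex:surveyDate "2024-04-09"^^xsd:date , "2024-04-10"^^xsd:date .
```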