Create OntologiesReviewWorkflow #2437

Merged: 3 commits, Sep 14, 2023

Changes from 1 commit
42 changes: 42 additions & 0 deletions docs/OntologiesReviewWorkflow
@@ -0,0 +1,42 @@
---
layout: doc
title: Ontology Review Process
---

This page deals with the process, policies, and guidelines for manually reviewing newly-submitted ontologies, with particular focus on operational aspects such as:

- which ontologies are reviewed, and in what order?
- how are reviewers chosen?
- what is the workflow for conducting a review?
- what are the criteria used?

Ontologies are reviewed manually after completing the automated evaluation. Though the automated evaluation checks for adherence to [OBO Foundry principles](http://www.obofoundry.org/principles/fp-000-summary.html), it cannot capture certain nuances outlined in the principles nor evaluate ontology quality. The purpose of the manual review is to check the ontology for issues that the NOR Dashboard does not cover.
Contributor

> Ontologies are reviewed manually after completing the automated evaluation.

I guess "completing" implies "passing" but would it be good to state that explicitly? If the submitted ontology completes the automated evaluation and fails anything, it is not reviewed manually yet, right?

Contributor

Might also be good to clarify by saying "Newly-submitted ontologies" (even though it does say that in the first sentence).

Contributor

> The purpose of the manual review is to check the ontology for issues that the NOR Dashboard does not cover.

This document doesn't seem to cover what happens after an ontology passes (or fails) the review. I assume that is intentional, but it affects things like my comment below that a passing ontology needs to say yes to (as currently worded) some of the criteria.

Contributor Author

Yes, completing was intended to mean passing. Fixed.
Also clarified that this applies to new ontologies as suggested.
I did indeed intentionally leave out what happens upon passing or failing, as this is covered elsewhere and is technically not part of the review process. This doc is intended to stay focused on the review process itself.


# Steps prior to manual review

Once an ontology is submitted to the OBO Foundry, it is added to the New Ontology Dashboard for automated evaluation. The steps involved in ontology submission and subsequent dashboard evaluation are described in detail in this [FAQ answer](http://obofoundry.org/faq/how-do-i-register-my-ontology.html) and links therein.
Contributor

> it is added to the New Ontology Dashboard

As a reader, I would be curious how this happens -- is it completely automatic, or does a human have to add it?

Contributor Author

I suspect it is added by a person, but comments to this effect are outside the scope of this document. I only added a very brief synopsis of 'prior' steps because the dashboard is referred to later in the document, and I don't like bringing things up without prior introduction.


# Review priority

All submitted ontologies will be manually reviewed based on the order in which the ontology becomes compliant with the automated evaluation (that is, when the [New Ontology Request (NOR) Dashboard](http://obofoundry.org/obo-nor.github.io/dashboard/index.html) status is set to 'pass').
Contributor

"is set to" suggests misleadingly that someone sets it. Maybe say "...status is 'pass'"?

Contributor Author

I don't see it the same way (as a program can change statuses too) but I changed it anyway.


# Choosing reviewers

Reviewers are chosen on a rotating basis from the members of the [OBO Foundry Operations Committee](http://obofoundry.org/docs/Membership.html).

# Ontology review workflow

The manual process begins once a reviewer is assigned. The reviewer will assess key aspects of the ontology (see below) and report the findings both to the submitter (via the issue tracker ticket used for submission) and to the OBO Operations Committee. The latter will discuss the findings and make a recommendation for acceptance, revision, or rejection.

# Manual review criteria

Criteria for review include (but are not limited to):

1. Ontology scope - Do the terms fall within the ontology's stated target domain of knowledge? Was the ontology developed for a very specific purpose or community?
Contributor

> Was the ontology developed for a very specific purpose or community?

But that's ok, right? That just makes it a domain ontology or whatever we ended up calling it?

Contributor Author

Correct, it is okay. The question is evaluated because the answer impacts certain other aspects of the review.

2. Terms with the new ontology prefix - Do the terms follow the OBO identifier scheme? Are there terms with the same <i>meaning</i> available in another OBO Foundry ontology? Is there another OBO Foundry ontology whose scope covers any of the new terms?
Contributor

For some of these criteria (e.g. the first one here), the answer should be "yes", whereas for others (the second two), the answer should be "no". I know this because of my insider knowledge, but I think it would be better to word all the criteria so that they are criteria for inclusion, and should be answered "yes".

Contributor Author

This part was lifted from the SOPs and thus was indeed meant for insiders. However, I agree that in this document the answers would be useful to users. Asking some of these questions in a manner that gives a 'yes' answer would make the wording more difficult to follow, so I opted instead to provide the hoped-for answer.

3. Correct use of imported terms - Does the ontology accurately reuse terms from other OBO ontologies? Are imported terms in appropriate hierarchies, and do they preserve the term's upper-level alignment? Are any additional axioms used for these terms correct in both a technical sense (e.g. the ontology passes reasoning) and a substantive sense?
4. Basic review of axiomatic patterns - Are axioms generally highly complex? Are existential restrictions used correctly? (A typical mistake is writing “R some (A and B and C)” to mean “(R some A) and (R some B) and (R some C)”; see the sketch after this list.)
5. Appropriate use of object properties - Are object properties used in a manner consistent with their definitions, domains, and ranges? (Examples of incorrect usage include uses based on an interpretation of the object property's label that does not fit the property's actual definition, domain, or range.)
6. Responsiveness to suggested changes - Have the developers been willing to fix any identified issues during the review?
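The scoping mistake called out in criterion 4 is easier to see in a concrete form. Below is a minimal sketch, not part of the review tooling, assuming the owlready2 Python library; the ontology IRI, class names, and property are hypothetical placeholders.

```python
# Minimal sketch of the existential-restriction pitfall from criterion 4.
# Assumes the owlready2 library; all names (IRI, classes, property) are
# hypothetical and chosen only for illustration.
from owlready2 import Thing, ObjectProperty, get_ontology

onto = get_ontology("http://example.org/demo.owl")

with onto:
    class Cell(Thing): pass
    class Nucleus(Thing): pass
    class Membrane(Thing): pass
    class has_part(ObjectProperty): pass

    # Mistaken form: "has_part some (Nucleus and Membrane)" requires a single
    # part that is simultaneously a Nucleus and a Membrane.
    class MistakenCell(Cell):
        equivalent_to = [Cell & has_part.some(Nucleus & Membrane)]

    # Intended form: "(has_part some Nucleus) and (has_part some Membrane)"
    # requires a Nucleus part and a Membrane part, which may be distinct parts.
    class IntendedCell(Cell):
        equivalent_to = [Cell & has_part.some(Nucleus) & has_part.some(Membrane)]
```

Under the mistaken form, a reasoner only classifies individuals that have some part which is both a Nucleus and a Membrane at once, which is rarely what the ontology authors intended.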


Contributor

I know this is kind of a sore point, but how about putting some indication of how long it typically takes for these reviews to be completed? (2 months?)

Contributor Author

I'd rather not. This information is given in the response to submitters and can vary. It would be bad IMO to say '2 months' if it would be substantially shorter (or longer), or say something like 'we hope to have these done within 2 weeks but it might take longer' because people only remember the numbers, not the caveats.
