Merge remote-tracking branch 'IQSS/develop' into multipid
qqmyers committed Jan 25, 2024
2 parents af03493 + e9215e3 commit 3a48834
Showing 27 changed files with 339 additions and 52 deletions.
4 changes: 2 additions & 2 deletions CONTRIBUTING.md
@@ -56,12 +56,12 @@ If you are interested in working on the main Dataverse code, great! Before you s

Please read http://guides.dataverse.org/en/latest/developers/version-control.html to understand how we use the "git flow" model of development and how we will encourage you to create a GitHub issue (if it doesn't exist already) to associate with your pull request. That page also includes tips on making a pull request.

After making your pull request, your goal should be to help it advance through our kanban board at https://github.com/orgs/IQSS/projects/2 . If no one has moved your pull request to the code review column in a timely manner, please reach out. Note that once a pull request is created for an issue, we'll remove the issue from the board so that we only track one card (the pull request).
After making your pull request, your goal should be to help it advance through our kanban board at https://github.com/orgs/IQSS/projects/34 . If no one has moved your pull request to the code review column in a timely manner, please reach out. Note that once a pull request is created for an issue, we'll remove the issue from the board so that we only track one card (the pull request).

Thanks for your contribution!

[dataverse-community Google Group]: https://groups.google.com/group/dataverse-community
[Community Call]: https://dataverse.org/community-calls
[dataverse-dev Google Group]: https://groups.google.com/group/dataverse-dev
[community contributors]: https://docs.google.com/spreadsheets/d/1o9DD-MQ0WkrYaEFTD5rF_NtyL8aUISgURsAXSL7Budk/edit?usp=sharing
[dev efforts]: https://github.com/orgs/IQSS/projects/2#column-5298405
[dev efforts]: https://github.com/orgs/IQSS/projects/34/views/6
5 changes: 5 additions & 0 deletions doc/release-notes/10216-metadatablocks.md
@@ -0,0 +1,5 @@
The API endpoint `/api/metadatablocks/{block_id}` has been extended to include the following fields:

- `isRequired`: Whether or not this field is required
- `displayOrder`: The display order of the field in create/edit forms
- `typeClass`: The type class of this field ("controlledVocabulary", "compound", or "primitive")
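For example (a sketch against the demo server; the `title` field and `jq` filter are illustrative, and the exact response shape may vary by version):

```bash
# Fetch the citation block and show the new attributes for one field.
curl -s "https://demo.dataverse.org/api/metadatablocks/citation" \
  | jq '.data.fields.title | {isRequired, displayOrder, typeClass}'
```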
4 changes: 4 additions & 0 deletions doc/release-notes/9275-harvest-invalid-query-params.md
@@ -0,0 +1,4 @@
OAI-PMH error handling has been improved to display a machine-readable error in XML rather than a 500 error with no further information.

- /oai?foo=bar will show "No argument 'verb' found"
- /oai?verb=foo&verb=bar will show "Verb must be singular, given: '[foo, bar]'"
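For example, the first case renders as a standard OAI-PMH error element (a sketch against a local installation; the `badVerb` code follows the OAI-PMH spec, but exact output may vary):

```bash
curl -s "http://localhost:8080/oai?foo=bar"
# Expected body, abridged:
# <OAI-PMH xmlns="http://www.openarchives.org/OAI/2.0/">
#   ...
#   <error code="badVerb">No argument 'verb' found</error>
# </OAI-PMH>
```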
1 change: 1 addition & 0 deletions doc/release-notes/9728-universe-variablemetadata.md
@@ -0,0 +1 @@
The `universe` field in the `variablemetadata` table was changed from `varchar(255)` to `text`. The change was made to support longer strings in the "universe" metadata field, in line with the rest of the text fields in the `variablemetadata` table.
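For reference, the change is equivalent to the following SQL (a sketch; upgrades normally apply this automatically, and `dvndb` is only the conventional database name):

```bash
psql -d dvndb -c 'ALTER TABLE variablemetadata ALTER COLUMN universe TYPE text;'
```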
@@ -0,0 +1 @@
Listing collection/dataverse role assignments via API still requires ManageDataversePermissions, but listing dataset role assignments via API now requires only ManageDatasetPermissions.
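A sketch of the affected calls (`$SERVER_URL`, `$API_TOKEN`, and the identifiers are placeholders):

```bash
# Listing a dataset's role assignments now needs only ManageDatasetPermissions:
curl -H "X-Dataverse-key: $API_TOKEN" "$SERVER_URL/api/datasets/24/assignments"
# Listing a collection's role assignments still needs ManageDataversePermissions:
curl -H "X-Dataverse-key: $API_TOKEN" "$SERVER_URL/api/dataverses/root/assignments"
```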
2 changes: 1 addition & 1 deletion doc/sphinx-guides/source/admin/integrations.rst
@@ -245,7 +245,7 @@ Future Integrations

The `Dataverse Project Roadmap <https://www.iq.harvard.edu/roadmap-dataverse-project>`_ is a good place to see integrations that the core Dataverse Project team is working on.

The `Community Dev <https://github.com/orgs/IQSS/projects/2#column-5298405>`_ column of our project board is a good way to track integrations that are being worked on by the Dataverse Community but many are not listed and if you have an idea for an integration, please ask on the `dataverse-community <https://groups.google.com/forum/#!forum/dataverse-community>`_ mailing list if someone is already working on it.
If you have an idea for an integration, please ask on the `dataverse-community <https://groups.google.com/forum/#!forum/dataverse-community>`_ mailing list if someone is already working on it.

Many integrations take the form of "external tools". See the :doc:`external-tools` section for details. External tool makers should check out the :doc:`/api/external-tools` section of the API Guide.

4 changes: 2 additions & 2 deletions doc/sphinx-guides/source/api/native-api.rst
@@ -1572,8 +1572,8 @@ The fully expanded example above (without environment variables) looks like this
Set Citation Date Field Type for a Dataset
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Sets the dataset citation date field type for a given dataset. ``:publicationDate`` is the default.
Note that the dataset citation date field type must be a date field.
Sets the dataset citation date field type for a given dataset. ``:publicationDate`` is the default.
Note that the dataset citation date field type must be a date field. This change applies to all versions of the dataset that have an entry for the new date field. It also applies to all file citations in the dataset.

.. code-block:: bash
2 changes: 1 addition & 1 deletion doc/sphinx-guides/source/developers/documentation.rst
@@ -18,7 +18,7 @@ If you find a typo or a small error in the documentation you can fix it using Gi
- Under the **Write** tab, delete the long welcome message and write a few words about what you fixed.
- Click **Create Pull Request**.

That's it! Thank you for your contribution! Your pull request will be added manually to the main Dataverse Project board at https://github.com/orgs/IQSS/projects/2 and will go through code review and QA before it is merged into the "develop" branch. Along the way, developers might suggest changes or make them on your behalf. Once your pull request has been merged you will be listed as a contributor at https://github.com/IQSS/dataverse/graphs/contributors
That's it! Thank you for your contribution! Your pull request will be added manually to the main Dataverse Project board at https://github.com/orgs/IQSS/projects/34 and will go through code review and QA before it is merged into the "develop" branch. Along the way, developers might suggest changes or make them on your behalf. Once your pull request has been merged you will be listed as a contributor at https://github.com/IQSS/dataverse/graphs/contributors

Please see https://github.com/IQSS/dataverse/pull/5857 for an example of a quick fix that was merged (the "Files changed" tab shows how a typo was fixed).

2 changes: 1 addition & 1 deletion doc/sphinx-guides/source/developers/version-control.rst
@@ -142,7 +142,7 @@ Feedback on the pull request template we use is welcome! Here's an example of a
Make Sure Your Pull Request Has Been Advanced to Code Review
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Now that you've made your pull request, your goal is to make sure it appears in the "Code Review" column at https://github.com/orgs/IQSS/projects/2.
Now that you've made your pull request, your goal is to make sure it appears in the "Code Review" column at https://github.com/orgs/IQSS/projects/34.

Look at https://github.com/IQSS/dataverse/blob/master/CONTRIBUTING.md for various ways to reach out to developers who have enough access to the GitHub repo to move your issue and pull request to the "Code Review" column.

5 changes: 5 additions & 0 deletions docker-compose-dev.yml
@@ -19,6 +19,9 @@ services:
DATAVERSE_AUTH_OIDC_CLIENT_SECRET: 94XHrfNRwXsjqTqApRrwWmhDLDHpIYV8
DATAVERSE_AUTH_OIDC_AUTH_SERVER_URL: http://keycloak.mydomain.com:8090/realms/test
DATAVERSE_JSF_REFRESH_PERIOD: "1"
# These two OAI settings are here to get HarvestingServerIT to pass
dataverse_oai_server_maxidentifiers: "2"
dataverse_oai_server_maxrecords: "2"
JVM_ARGS: -Ddataverse.files.storage-driver-id=file1
-Ddataverse.files.file1.type=file
-Ddataverse.files.file1.label=Filesystem
@@ -57,6 +60,8 @@ services:
volumes:
- ./docker-dev-volumes/app/data:/dv
- ./docker-dev-volumes/app/secrets:/secrets
# Uncomment to map the glassfish applications folder so that we can update webapp resources using scripts/intellij/cpwebapp.sh
# - ./docker-dev-volumes/glassfish/applications:/opt/payara/appserver/glassfish/domains/domain1/applications
# Uncomment for changes to xhtml to be deployed immediately (if supported by your IDE or toolchain).
# Replace 6.0 with the current version.
# - ./target/dataverse-6.0:/opt/payara/deployments/dataverse
33 changes: 33 additions & 0 deletions scripts/intellij/cpwebapp.sh
@@ -0,0 +1,33 @@
#!/usr/bin/env bash
#
# cpwebapp <project dir> <file in webapp>
#
# Usage:
#
# Add a File watcher by importing watchers.xml into IntelliJ IDEA, and let it do the copying whenever you save a
# file under webapp.
#
# https://www.jetbrains.com/help/idea/settings-tools-file-watchers.html
#
# Alternatively, you can add an External tool and trigger via menu or shortcut to do the copying manually:
#
# https://www.jetbrains.com/help/idea/configuring-third-party-tools.html
#

PROJECT_DIR=$1
FILE_TO_COPY=$2
RELATIVE_PATH="${FILE_TO_COPY#$PROJECT_DIR/}"

# Check if RELATIVE_PATH starts with 'src/main/webapp', otherwise ignore
if [[ $RELATIVE_PATH == src/main/webapp* ]]; then
# Get current version. Any other way to do this? A simple VERSION file would help.
VERSION=`perl -ne 'print $1 if /<revision>(.*?)<\/revision>/' ./modules/dataverse-parent/pom.xml`
RELATIVE_PATH_WITHOUT_WEBAPP="${RELATIVE_PATH#src/main/webapp/}"
TARGET_DIR=./docker-dev-volumes/glassfish/applications/dataverse-$VERSION
TARGET_PATH="${TARGET_DIR}/${RELATIVE_PATH_WITHOUT_WEBAPP}"

mkdir -p "$(dirname "$TARGET_PATH")"
cp "$FILE_TO_COPY" "$TARGET_PATH"

echo "File $FILE_TO_COPY copied to $TARGET_PATH"
fi
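A manual invocation from the repository root might look like this (a sketch; the xhtml path is illustrative):

```bash
./scripts/intellij/cpwebapp.sh "$PWD" "$PWD/src/main/webapp/dataset.xhtml"
# -> File .../src/main/webapp/dataset.xhtml copied to
#    ./docker-dev-volumes/glassfish/applications/dataverse-<version>/dataset.xhtml
```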
22 changes: 22 additions & 0 deletions scripts/intellij/watchers.xml
@@ -0,0 +1,22 @@
<TaskOptions>
<TaskOptions>
<option name="arguments" value="$ProjectFileDir$ $FilePath$" />
<option name="checkSyntaxErrors" value="false" />
<option name="description" />
<option name="exitCodeBehavior" value="ERROR" />
<option name="fileExtension" value="*" />
<option name="immediateSync" value="false" />
<option name="name" value="Dataverse webapp file copy on save" />
<option name="output" value="" />
<option name="outputFilters">
<array />
</option>
<option name="outputFromStdout" value="false" />
<option name="program" value="$ProjectFileDir$/scripts/intellij/cpwebapp.sh" />
<option name="runOnExternalChanges" value="true" />
<option name="scopeName" value="Current File" />
<option name="trackOnlyRoot" value="false" />
<option name="workingDir" value="$ProjectFileDir$" />
<envs />
</TaskOptions>
</TaskOptions>
@@ -432,7 +432,7 @@ public Long findCountByGuestbookId(Long guestbookId, Long dataverseId) {
Query query = em.createNativeQuery(queryString);
return (Long) query.getSingleResult();
} else {
String queryString = "select count(o) from GuestbookResponse as o, Dataset d, DvObject obj where o.dataset_id = d.id and d.id = obj.id and obj.owner_id = " + dataverseId + "and o.guestbook_id = " + guestbookId;
String queryString = "select count(o) from GuestbookResponse as o, Dataset d, DvObject obj where o.dataset_id = d.id and d.id = obj.id and obj.owner_id = " + dataverseId + " and o.guestbook_id = " + guestbookId;
Query query = em.createNativeQuery(queryString);
return (Long) query.getSingleResult();
}
@@ -914,7 +914,7 @@ public void save(GuestbookResponse guestbookResponse) {

public Long getDownloadCountByDataFileId(Long dataFileId) {
// datafile id is null, will return 0
Query query = em.createNativeQuery("select count(o.id) from GuestbookResponse o where o.datafile_id = " + dataFileId + "and eventtype != '" + GuestbookResponse.ACCESS_REQUEST +"'");
Query query = em.createNativeQuery("select count(o.id) from GuestbookResponse o where o.datafile_id = " + dataFileId + " and eventtype != '" + GuestbookResponse.ACCESS_REQUEST +"'");
return (Long) query.getSingleResult();
}

@@ -928,7 +928,7 @@ public Long getDownloadCountByDatasetId(Long datasetId, LocalDate date) {
if(date != null) {
query = em.createNativeQuery("select count(o.id) from GuestbookResponse o where o.dataset_id = " + datasetId + " and responsetime < '" + date.toString() + "' and eventtype != '" + GuestbookResponse.ACCESS_REQUEST +"'");
}else {
query = em.createNativeQuery("select count(o.id) from GuestbookResponse o where o.dataset_id = " + datasetId+ "and eventtype != '" + GuestbookResponse.ACCESS_REQUEST +"'");
query = em.createNativeQuery("select count(o.id) from GuestbookResponse o where o.dataset_id = " + datasetId+ " and eventtype != '" + GuestbookResponse.ACCESS_REQUEST +"'");
}
return (Long) query.getSingleResult();
}
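Each of the three fixes above inserts a missing space before `and` in concatenated native SQL; previously the numeric id ran straight into the keyword (e.g. `o.datafile_id = 42and eventtype ...`). A sketch of the corrected query shape (`dvndb`, `42`, and `'AccessRequest'` are placeholder values):

```bash
psql -d dvndb -c "select count(o.id) from guestbookresponse o \
  where o.datafile_id = 42 and eventtype != 'AccessRequest'"
```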
60 changes: 55 additions & 5 deletions src/main/java/edu/harvard/iq/dataverse/HarvestingSetsPage.java
@@ -30,6 +30,8 @@
import jakarta.faces.view.ViewScoped;
import jakarta.inject.Inject;
import jakarta.inject.Named;
import java.util.HashMap;
import java.util.Map;
import org.apache.commons.lang3.StringUtils;

/**
@@ -430,44 +432,92 @@ public boolean isSessionUserAuthenticated() {
return false;
}

// The numbers of datasets and deleted/exported records below are used
// in rendering rules on the page. They absolutely need to be cached
// on the first lookup.

Map<String, Integer> cachedSetInfoNumDatasets = new HashMap<>();

public int getSetInfoNumOfDatasets(OAISet oaiSet) {
if (oaiSet.isDefaultSet()) {
return getSetInfoNumOfExported(oaiSet);
}

if (cachedSetInfoNumDatasets.get(oaiSet.getSpec()) != null) {
return cachedSetInfoNumDatasets.get(oaiSet.getSpec());
}

String query = oaiSet.getDefinition();

try {
int num = oaiSetService.validateDefinitionQuery(query);
if (num > -1) {
cachedSetInfoNumDatasets.put(oaiSet.getSpec(), num);
return num;
}
} catch (OaiSetException ose) {
// do notghin - will return zero.
// do nothing - will return zero.
}
cachedSetInfoNumDatasets.put(oaiSet.getSpec(), 0);
return 0;
}

Map<String, Integer> cachedSetInfoNumExported = new HashMap<>();
Integer defaultSetNumExported = null;

public int getSetInfoNumOfExported(OAISet oaiSet) {
if (oaiSet.isDefaultSet() && defaultSetNumExported != null) {
return defaultSetNumExported;
} else if (cachedSetInfoNumExported.get(oaiSet.getSpec()) != null) {
return cachedSetInfoNumExported.get(oaiSet.getSpec());
}

List<OAIRecord> records = oaiRecordService.findActiveOaiRecordsBySetName(oaiSet.getSpec());

int num;

if (records == null || records.isEmpty()) {
return 0;
num = 0;
} else {
num = records.size();
}

return records.size();
if (oaiSet.isDefaultSet()) {
defaultSetNumExported = num;
} else {
cachedSetInfoNumExported.put(oaiSet.getSpec(), num);
}

return num;
}

Map<String, Integer> cachedSetInfoNumDeleted = new HashMap<>();
Integer defaultSetNumDeleted = null;

public int getSetInfoNumOfDeleted(OAISet oaiSet) {
if (oaiSet.isDefaultSet() && defaultSetNumDeleted != null) {
return defaultSetNumDeleted;
} else if (cachedSetInfoNumDeleted.get(oaiSet.getSpec()) != null) {
return cachedSetInfoNumDeleted.get(oaiSet.getSpec());
}

List<OAIRecord> records = oaiRecordService.findDeletedOaiRecordsBySetName(oaiSet.getSpec());

int num;

if (records == null || records.isEmpty()) {
return 0;
num = 0;
} else {
num = records.size();
}

return records.size();
if (oaiSet.isDefaultSet()) {
defaultSetNumDeleted = num;
} else {
cachedSetInfoNumDeleted.put(oaiSet.getSpec(), num);
}

return num;
}

public void validateSetQuery() {
@@ -71,6 +71,7 @@ public class VariableMetadata implements Serializable {
/**
* universe: metadata variable field.
*/
@Column(columnDefinition="TEXT")
private String universe;

/**
@@ -6,16 +6,18 @@
import edu.harvard.iq.dataverse.engine.command.AbstractCommand;
import edu.harvard.iq.dataverse.engine.command.CommandContext;
import edu.harvard.iq.dataverse.engine.command.DataverseRequest;
import edu.harvard.iq.dataverse.engine.command.RequiredPermissions;
import edu.harvard.iq.dataverse.engine.command.exception.CommandException;
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.Set;
import java.util.Collections;

/**
*
* @author michael
*/
@RequiredPermissions( Permission.ManageDataversePermissions )
// no annotations here, since permissions are dynamically decided
public class ListRoleAssignments extends AbstractCommand<List<RoleAssignment>> {

private final DvObject definitionPoint;
@@ -34,5 +36,12 @@ public List<RoleAssignment> execute(CommandContext ctxt) throws CommandException
}
return ctxt.permissions().assignmentsOn(definitionPoint);
}

@Override
public Map<String, Set<Permission>> getRequiredPermissions() {
return Collections.singletonMap("",
definitionPoint.isInstanceofDataset() ? Collections.singleton(Permission.ManageDatasetPermissions)
: Collections.singleton(Permission.ManageDataversePermissions));
}

}
@@ -25,6 +25,7 @@
import jakarta.inject.Named;
import jakarta.persistence.EntityManager;
import jakarta.persistence.PersistenceContext;
import jakarta.persistence.Query;
import org.apache.solr.client.solrj.SolrQuery;
import org.apache.solr.client.solrj.SolrServerException;
import org.apache.solr.client.solrj.impl.BaseHttpSolrClient.RemoteSolrException;
@@ -121,6 +122,25 @@ public List<OAISet> findAllNamedSets() {
}
}

/**
* "Active" sets are the ones that have been successfully exported, and contain
* a non-zero number of records. (Although a set that contains a number of
* records that are all marked as "deleted" is still an active set!)
* @return list of OAISets
*/
public List<OAISet> findAllActiveNamedSets() {
String jpaQueryString = "select object(o) "
+ "from OAISet as o, OAIRecord as r "
+ "where r.setName = o.spec "
+ "and o.spec != '' "
+ "group by o order by o.spec";

Query query = em.createQuery(jpaQueryString);
List<OAISet> queryResults = query.getResultList();

return queryResults;
}

@Asynchronous
public void remove(Long setId) {
OAISet oaiSet = find(setId);
@@ -31,6 +31,7 @@
import edu.harvard.iq.dataverse.settings.SettingsServiceBean;
import edu.harvard.iq.dataverse.util.MailUtil;
import edu.harvard.iq.dataverse.util.SystemConfig;
import io.gdcc.xoai.exceptions.BadVerbException;
import io.gdcc.xoai.exceptions.OAIException;
import io.gdcc.xoai.model.oaipmh.Granularity;
import io.gdcc.xoai.services.impl.SimpleResumptionTokenFormat;
@@ -48,6 +49,7 @@
import jakarta.servlet.http.HttpServlet;
import jakarta.servlet.http.HttpServletRequest;
import jakarta.servlet.http.HttpServletResponse;
import java.util.Map;
import javax.xml.stream.XMLStreamException;
import org.eclipse.microprofile.config.Config;
import org.eclipse.microprofile.config.ConfigProvider;
@@ -256,10 +258,16 @@ private void processRequest(HttpServletRequest httpServletRequest, HttpServletRe
"Sorry. OAI Service is disabled on this Dataverse node.");
return;
}

RawRequest rawRequest = RequestBuilder.buildRawRequest(httpServletRequest.getParameterMap());

OAIPMH handle = dataProvider.handle(rawRequest);

Map<String, String[]> params = httpServletRequest.getParameterMap();
OAIPMH handle;
try {
RawRequest rawRequest = RequestBuilder.buildRawRequest(params);
handle = dataProvider.handle(rawRequest);
} catch (BadVerbException bve) {
handle = dataProvider.handle(params);
}

response.setContentType("text/xml;charset=UTF-8");

try (XmlWriter xmlWriter = new XmlWriter(response.getOutputStream(), repositoryConfiguration);) {
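A sketch of the resulting behavior for the repeated-verb case (local installation; the message text is taken from the release note above, exact formatting may vary):

```bash
# Requests that fail verb parsing now fall through to dataProvider.handle(params)
# and return an OAI-PMH <error> element instead of an HTTP 500:
curl -s "http://localhost:8080/oai?verb=foo&verb=bar"
# ...<error code="badVerb">Verb must be singular, given: '[foo, bar]'</error>...
```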