Merge branch 'develop' into 10519-dataset-types #10519
Conflicts: src/test/java/edu/harvard/iq/dataverse/api/UtilIT.java
Showing 86 changed files with 6,367 additions and 192 deletions.
CRUD endpoints for Collection Featured Items have been implemented:

- Create a featured item (POST /api/dataverses/<dataverse_id>/featuredItems)
- Update a featured item (PUT /api/dataverseFeaturedItems/<item_id>)
- Delete a featured item (DELETE /api/dataverseFeaturedItems/<item_id>)
- List all featured items in a collection (GET /api/dataverses/<dataverse_id>/featuredItems)
- Delete all featured items in a collection (DELETE /api/dataverses/<dataverse_id>/featuredItems)
- Update all featured items in a collection (PUT /api/dataverses/<dataverse_id>/featuredItems)

New settings:

- dataverse.files.featured-items.image-maxsize - Sets the maximum allowed size of an image that can be added to a featured item.
- dataverse.files.featured-items.image-uploads - Specifies the name of the subdirectory, within the docroot directory, in which featured item images are saved.

See also #10943 and #11124.
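As an illustrative sketch of calling the create and list endpoints above (the collection alias `root`, the multipart field names `content` and `displayOrder`, and the token value are assumptions, not confirmed parameter names — check the API Guide for the exact request shape):

```shell
# Illustrative sketch: the "root" alias and multipart field names
# (content, displayOrder) are assumptions; verify against the API Guide.
SERVER_URL="http://localhost:8080"
API_TOKEN="xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx"
FEATURED_URL="$SERVER_URL/api/dataverses/root/featuredItems"

# Create a featured item (run against a live installation):
curl -s -H "X-Dataverse-key:$API_TOKEN" -X POST "$FEATURED_URL" \
  -F "content=Welcome to our collection!" -F "displayOrder=0" \
  || echo "create request failed (is Dataverse running?)"

# List all featured items in the collection:
curl -s "$FEATURED_URL" || echo "list request failed (is Dataverse running?)"
```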
### New 3D Object Data Metadata Block

A new metadata block has been added for describing 3D object data. You can download it from the [guides](https://dataverse-guide--11167.org.readthedocs.build/en/11167/user/appendix.html). See also #11120 and #11167.

All new Dataverse installations will receive this metadata block by default. Existing installations can add it by following the upgrade instructions below.

## Upgrade Instructions

### For 6.6-Release-notes.md

6\. Restart Payara

7\. Update metadata blocks

These changes reflect incremental improvements made to the handling of core metadata fields.

```shell
wget https://raw.githubusercontent.com/IQSS/dataverse/v6.6/scripts/api/data/metadatablocks/citation.tsv

curl http://localhost:8080/api/admin/datasetfield/load -H "Content-type: text/tab-separated-values" -X POST --upload-file citation.tsv
```

```shell
wget https://raw.githubusercontent.com/IQSS/dataverse/v6.6/scripts/api/data/metadatablocks/3d_objects.tsv

curl http://localhost:8080/api/admin/datasetfield/load -H "Content-type: text/tab-separated-values" -X POST --upload-file 3d_objects.tsv
```

8\. Update the Solr schema.xml file. Start with the standard v6.6 schema.xml; then, if your installation uses any custom or experimental metadata blocks, update it to include the extra fields (step 8a).

Stop Solr (usually `service solr stop`, depending on your Solr installation/OS; see the [Installation Guide](https://guides.dataverse.org/en/6.6/installation/prerequisites.html#solr-init-script)).

```shell
service solr stop
```

Replace schema.xml:

```shell
wget https://raw.githubusercontent.com/IQSS/dataverse/v6.6/conf/solr/schema.xml
cp schema.xml /usr/local/solr/solr-9.4.1/server/solr/collection1/conf
```

Start Solr (but if you use any custom metadata blocks or are adding the 3D Objects block, perform the next step, 8a, first).

```shell
service solr start
```

8a\. For installations with custom or experimental metadata blocks:

Before starting Solr, update the schema to include all the extra metadata fields that your installation uses. We do this by collecting the output of the Dataverse schema API and feeding it to the `update-fields.sh` script that we supply, as in the example below (modify the command lines as needed to reflect the names of the directories, if different):

```shell
wget https://raw.githubusercontent.com/IQSS/dataverse/v6.6/conf/solr/update-fields.sh
chmod +x update-fields.sh
curl "http://localhost:8080/api/admin/index/solr/schema" | ./update-fields.sh /usr/local/solr/solr-9.4.1/server/solr/collection1/conf/schema.xml
```

Now start Solr.

9\. Reindex Solr
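The reindex in step 9 can typically be kicked off in place via the admin API; a minimal sketch (this is the standard full-reindex endpoint, but see the Admin Guide for alternatives such as clearing the index first):

```shell
SERVER_URL="http://localhost:8080"
# Start a full in-place reindex of all objects:
curl -s "$SERVER_URL/api/admin/index" || echo "reindex request failed (is Dataverse running?)"
```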
doc/release-notes/11127-update-search-api-to-show-dvobject-type-counts.md
### show_type_counts Behavior Changed in the Search API

In the Search API, if you set show_type_counts=true, the response will include all object types (Dataverses, Datasets, and Files), even if the search result count for any given type is 0.

See also the [guides](https://preview.guides.gdcc.io/en/develop/api/search.html#parameters), #11127 and #11138.
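A quick way to observe the new behavior (a sketch only; response field names are not shown here and should be checked against the linked guide):

```shell
SERVER_URL="http://localhost:8080"
# With show_type_counts=true, counts for Dataverses, Datasets, and Files
# are all present in the response, even when a type has zero hits.
curl -s "$SERVER_URL/api/search?q=*&show_type_counts=true" \
  || echo "search request failed (is Dataverse running?)"
```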
This feature adds a new API to send feedback to the contacts of a Collection, Dataset, or DataFile.

Similar to the "admin/feedback" API, the "sendfeedback" API sends an email to all the contacts listed for the object. The main differences for this feature are:

1. This API is not limited to admins.
2. For privacy reasons, this API does not return the email addresses in the "toEmail" and "ccEmail" elements.
3. This API can be rate limited to avoid spamming.
4. The body size limit can be configured.
5. The body will be stripped of any HTML code to prevent malicious scripts or links.
6. The fromEmail will be validated for correct format.

To set rate limiting for guest users (see Rate Limiting Configuration for more details; this example allows 1 send per hour for any guest):

``curl http://localhost:8080/api/admin/settings/:RateLimitingCapacityByTierAndAction -X PUT -d '[{"tier": 0, "limitPerHour": 1, "actions": ["CheckRateLimitForDatasetFeedbackCommand"]}]'``

To set the message size limit (example limit of 1080 characters):

``curl -X PUT -d 1080 http://localhost:8080/api/admin/settings/:ContactFeedbackMessageSizeLimit``
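A hypothetical invocation of the new API follows; the endpoint path and the JSON field names (`targetId`, `fromEmail`, `subject`, `body`) are assumptions modeled on the existing admin/feedback API, so verify them against the API Guide before use:

```shell
# Hypothetical sketch: field names modeled on the admin/feedback API.
PAYLOAD='{"targetId": 24, "fromEmail": "user@example.org", "subject": "Dataset question", "body": "How were these data collected?"}'

# Validate the payload locally before sending:
echo "$PAYLOAD" | python3 -m json.tool > /dev/null && echo "payload is valid JSON"

# Then, against a live installation:
curl -s -X POST -H "Content-Type: application/json" -d "$PAYLOAD" \
  "http://localhost:8080/api/sendfeedback" || echo "request failed (is Dataverse running?)"
```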
The file page version table now shows more detail, e.g. when there are metadata changes or whether a file has been replaced.
A bug that caused adding free-form provenance to a file to fail has been fixed.
See also #11142 and #11145.
### Preview URL Popup Updated

The Preview URL popup and related documentation have been updated to give more information about anonymous access, including the names of the dataset fields that will be withheld from the URL user, and to suggest how to review the URL before releasing it. See also #11159 and #11164.

### Bug Fix

A bug which caused users of the Anonymous Review URL to have some metadata of published datasets withheld has been fixed. See #11202.
Bugfix: the OpenAIRE implementation can now correctly process one or multiple productionPlaces as geolocation.
doc/sphinx-guides/source/_static/api/dataset-add-single-compound-field-metadata.json
{
  "fields": [
    {
      "typeName": "targetSampleSize",
      "value": {
        "targetSampleSizeFormula": {
          "typeName": "targetSampleSizeFormula",
          "value": "n = N*X / (X + N - 1)"
        }
      }
    }
  ]
}
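This file is presumably intended for the dataset metadata edit API; below is a sketch of validating the payload locally and sending it. The `editMetadata` endpoint and `replace=true` parameter exist in the Dataverse Native API, but the DOI and token here are placeholders:

```shell
# Write the example payload to a file; note that in Dataverse compound
# values, the child key matches the child field's typeName.
cat > compound-field.json <<'EOF'
{
  "fields": [
    {
      "typeName": "targetSampleSize",
      "value": {
        "targetSampleSizeFormula": {
          "typeName": "targetSampleSizeFormula",
          "value": "n = N*X / (X + N - 1)"
        }
      }
    }
  ]
}
EOF

# Validate locally:
python3 -m json.tool < compound-field.json > /dev/null && echo "payload is valid JSON"

# Against a live installation (placeholder DOI and token):
# curl -H "X-Dataverse-key:$API_TOKEN" -X PUT \
#   "http://localhost:8080/api/datasets/:persistentId/editMetadata?persistentId=doi:10.5072/FK2/EXAMPLE&replace=true" \
#   --upload-file compound-field.json
```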
doc/sphinx-guides/source/_static/api/dataset-add-single-cvoc-field-metadata.json
{
  "typeName": "journalArticleType",
  "value": "abstract"
}
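This smaller file shows the shape of a single primitive (controlled-vocabulary) field value, in contrast to the compound example above; a quick local validation sketch:

```shell
# A single primitive field: just a typeName and a flat value.
cat > cvoc-field.json <<'EOF'
{
  "typeName": "journalArticleType",
  "value": "abstract"
}
EOF
python3 -m json.tool < cvoc-field.json > /dev/null && echo "payload is valid JSON"
```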
doc/sphinx-guides/source/_static/api/transform-oai-ore-jsonld.xq
declare option output:method "json";

let $parameters := map { 'method': 'json' }
for $record in /json
let $metadata := $record/ore_003adescribes
let $json :=
  <json type="object">
    {$metadata/*}
    {$record/_0040context}
  </json>
return if ($metadata) then
  file:write("converted.json", $json, $parameters)
else ()