
Test Suite #52

Open · codello opened this issue Mar 9, 2024 · 4 comments
codello commented Mar 9, 2024

Suggestion

I'd like to suggest that, in addition to the formal specification, this repository include test files that give examples of spec-compliant and non-compliant files.

Use case

A comprehensive test suite makes it much easier for developers to build spec-compliant implementations. Writing tests takes a lot of time, and there is a high risk that edge cases of the spec go unnoticed. Being able to test an implementation against a set of known valid/invalid files helps to reduce misunderstandings and lowers the barrier to building implementations that are actually spec-compliant.

Extra info/examples/attachments

I'm not sure what the best way of implementing a test suite is; below I outline a first idea. I'm also not sure whether this is the right repository for a test suite or whether it should get its own repository.

Possible Implementation

This proposal is heavily inspired by the YAML test suite.

General considerations

I think there are two points of view on a test suite: that of the person writing the tests and that of the person using the suite to validate an implementation. For the test author it is advantageous to have the test input and the expected result close to each other (read: in the same file). From an implementor's point of view it is desirable to have the test input as an individual file that can be read as-is.

To satisfy both points of view I think we should have a build step for the test suite.

Structure of the test suite

The test files are placed in a test folder and written as YAML files. These files contain the test inputs, the expected outputs, and metadata about the tests (see below). This makes it easy to write tests.

During a build step these files are then transformed into a directory structure (sketched after the list) containing:

  • The raw test input (an UltraStar TXT file)
  • A JSON file containing the expected output, or
  • An error file, indicating that the input is expected to produce an error
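
As a rough illustration, the build step could look like the following. This is a minimal sketch in Python, assuming PyYAML, a build output folder, and test files shaped like the example in the next section; none of these names or details are fixed yet.

import json
import pathlib

import yaml  # assumes PyYAML is available

for path in pathlib.Path("test").glob("*.yaml"):
    cases = yaml.safe_load_all(path.read_text(encoding="utf-8"))
    for index, case in enumerate(cases):
        out = pathlib.Path("build") / f"{path.stem}-{index:03d}"
        out.mkdir(parents=True, exist_ok=True)
        # The raw test input, readable as-is by an implementation under test.
        (out / "input.txt").write_text(case["input"], encoding="utf-8")
        if case.get("fail"):
            # Marker file: parsing this input is expected to produce an error.
            (out / "error").touch()
        else:
            # Everything except the bookkeeping keys is the expected result.
            expected = {key: value for key, value in case.items()
                        if key not in ("name", "description", "input", "fail")}
            (out / "expected.json").write_text(
                json.dumps(expected, indent=2), encoding="utf-8")

The bookkeeping keys (name, description, input, fail) are stripped so that everything else in a test case is treated as the expected parse result.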

Example

A test file could look like this. The example includes two test cases (one expected success and one expected failure), separated as YAML documents.

name: Valid Song With 2 Notes
description: >-
    This is an optional description of the test case.
input: |
    #VERSION:1.0.0
    #title:Foobar
    #ARTIST:Barfoo
: 15 2 2 Hello
: 17 3 1  World
headers:
    VERSION: 1.0.0
    TITLE: Foobar
    ARTIST: Barfoo
P1:
- {type: ":", start: 15, duration: 2, pitch: 2, text: "Hello"}
    - {type: ":", start: 17, duration: 3, pitch: 1, text: " World"}

---

name: Invalid Note
description: >-
    This is an example of a failing test case.
The description could include helpful tips on why this is not considered valid input.
fail: true
input: |
    #TITLE:Foobar
    : 12 1 2
: 31 3 2  World
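
On the consumer side, an implementation could be validated against the generated layout with a small test runner. A minimal sketch, assuming a hypothetical parse_song function that raises ValueError on invalid input and returns a dict matching expected.json:

import json
import pathlib

from my_parser import parse_song  # hypothetical implementation under test

for case in sorted(pathlib.Path("build").iterdir()):
    source = (case / "input.txt").read_text(encoding="utf-8")
    if (case / "error").exists():
        try:
            parse_song(source)
        except ValueError:
            continue  # the expected failure occurred
        raise AssertionError(f"{case.name}: expected a parse error")
    expected = json.loads((case / "expected.json").read_text(encoding="utf-8"))
    assert parse_song(source) == expected, f"{case.name}: wrong parse result"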

Open Questions

I'm currently unsure about the following questions:

  • Should we include partial expected results (e.g. in the second case, should we include the expected headers)?
  • Is there a better way of encoding the expected parse results for notes? The current encoding seems quite verbose.
  • How can special characters in the input be encoded? I'm currently thinking that a replacement mechanism for \uXXXX sequences might be sensible to make test cases more understandable (see the sketch after this list).
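
For the third question, such a replacement mechanism could be a small pre-processing pass over the input value. A minimal sketch; the \uXXXX syntax and its exact semantics are still up for discussion:

import re

def expand_escapes(text: str) -> str:
    # Replace every \uXXXX sequence with the corresponding code point.
    return re.sub(r"\\u([0-9a-fA-F]{4})",
                  lambda match: chr(int(match.group(1), 16)),
                  text)

# "\u00DF" written in the YAML source becomes the literal character "ß".
assert expand_escapes(r"#TITLE:Fu\u00DFball") == "#TITLE:Fußball"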

If there is interest in this feature I'm happy to submit a PR containing the build system and some first test cases. Subsequent cases can be added as the details of the spec are decided.

marwin89 (Collaborator) commented
Hi @codello, this sounds good to me, please go ahead. I'm not a test engineer or anything, but we really need a proper and standardized test suite, so I highly appreciate any reasonable efforts.
Let's put the test files in this repo for a start.

marwin89 moved this to In Discussion in UltraStar Song Format - Roadmap Mar 11, 2024
marwin89 added this to the Technical Fine-tuning milestone Mar 11, 2024
marwin89 added the in-discussion label May 6, 2024
basisbit (Member) commented May 7, 2024

There already exists https://github.com/UltraStar-Deluxe/songs - why not use that for "good" samples and adjust them as needed?

codello (Contributor, Author) commented May 7, 2024

That's a really good resource, thank you.

I'm actually thinking more about edge cases that are relevant when implementing parsers for the format. Consider these two examples:

# VERSION  : 1.0.0
#RELATIVE: yes
* 1 2 3 Foo
- 12

Note the following:

  • Here the #RELATIVE: yes must be ignored, because the header was removed in version 1.0.0. Because of this, - 12 is syntactically valid here.
  • There is extraneous whitespace around the #VERSION header, which must be ignored.
#VERSION:1.2.8
#title:Foo:Bar
#P1: Foo
#P01: Bar

Note the following:

  • The version 1.2.8 is not currently defined. Implementations supporting the 1.0.0 standard should still process this file correctly.
  • The #TITLE header is lowercase and its value contains a colon. Implementations should be able to parse this correctly.
  • The headers #P1 and #P01 are different. In particular, the value of #P01 must not overwrite the value of #P1 (a parsing sketch follows this list).
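
For illustration, here is a sketch of the header handling these cases exercise. This is my reading of the spec, not a normative implementation:

def parse_header(line: str) -> tuple[str, str]:
    # Split on the first colon only, so colons may appear in the value.
    name, _, value = line.removeprefix("#").partition(":")
    # Header names are case-insensitive; surrounding whitespace is ignored.
    return name.strip().upper(), value.strip()

headers: dict[str, str] = {}
for line in ("#VERSION:1.2.8", "#title:Foo:Bar", "#P1: Foo", "#P01: Bar"):
    name, value = parse_header(line)
    headers[name] = value

assert headers["TITLE"] == "Foo:Bar"  # colon preserved in the value
assert headers["P1"] == "Foo"         # P01 must not overwrite P1
assert headers["P01"] == "Bar"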

These are just some examples, but there are many more edge cases that aren't immediately obvious. I'd like to build a test suite covering them to make it easier for developers to test their implementations against the spec.
I realize that these edge cases are unlikely to appear in the wild, but I think this can be a valuable part of ensuring that implementations interpret the spec correctly. (This potentially also relates to #32.)

Baklap4 (Collaborator) commented May 8, 2024

Also relates to #18
