
The Big Cleanup #60

Open
jameslkingsley opened this issue Feb 28, 2018 · 3 comments
jameslkingsley commented Feb 28, 2018

At some point I think this whole codebase could do with a big cleanup.

  • Refine naming of controllers and switch to RESTful API approach
  • Remove jQuery and switch entirely to Vue
  • Switch to an SPA using Vue Router
  • Use Laravel's auth guards to handle permissions
  • Refactor controllers to use form request objects, avoiding huge controller methods
  • Update to latest dependencies
  • Overhaul design to use Tailwind
  • Add PHP tests for at least mission management
  • Nicely formatted emails instead of basic text
  • Improve mission uploading process
    • Catch more errors and more reliably
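The form-request item in the list above could look something like this minimal sketch; the class name, rules, and field names are assumptions, not existing code:

```php
<?php

// Hypothetical sketch of moving validation out of a controller into a
// Laravel form request. All names here are illustrative only.
namespace App\Http\Requests;

use Illuminate\Foundation\Http\FormRequest;

class StoreMissionRequest extends FormRequest
{
    public function authorize()
    {
        // Could defer to an auth guard or policy later on.
        return $this->user() !== null;
    }

    public function rules()
    {
        return [
            'name' => 'required|string|max:255',
            'file' => 'required|file',
        ];
    }
}
```

A controller method would then type-hint the request (`public function store(StoreMissionRequest $request)`) and receive already-validated input, keeping the controller itself small.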

jameslkingsley commented Feb 28, 2018

Mission Uploading & Updating

Currently, missions are uploaded to the server's local disk so the PBO can be extracted and checked for errors. If there are no errors, the mission files (PBO and ZIP) are deployed to the cloud disk (whatever that might be).

This approach works well because you can choose where to handle extraction and validation, and also where to ultimately store the mission files. A downside right now is that downloading a mission always calls on Google Cloud; this should be handled in a way that still allows local disk storage to work. We should also scrap signing the download when it comes from GCS; it's over the top.

The upload/update process is monolithic. This needs to be simplified down to three core parts:

  • Upload (save files to local disk)
  • Validation (check files for errors)
  • Save (save new record and files to publish disk)
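The three core parts above could be sketched as a small pipeline class; the class, method names, and signatures are assumptions, not existing code:

```php
<?php

// Hypothetical sketch of splitting the monolithic upload/update process
// into the three stages described above.
class MissionUploadPipeline
{
    public function handle($uploadedFile)
    {
        // 1. Upload: save the PBO to the local disk.
        $localPath = $this->upload($uploadedFile);

        // 2. Validation: extract the PBO and check the files for errors,
        //    throwing if a strict check fails.
        $this->validate($localPath);

        // 3. Save: create the new record and move the PBO and ZIP
        //    to the publish disk (local or cloud).
        return $this->save($localPath);
    }
}
```

Keeping the three stages as separate methods (or separate classes) would also make each one testable on its own, which ties in with the PHP-tests item in the checklist.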

The config parser is also not 100% reliable: it breaks on non-quoted strings in the mission.sqm and doesn't handle all pre-processor commands correctly. The loadout file check should also look for any reference to ACRE.

For mission validation there should be a class for each type of check. Once the files are on the local disk ready for validation, the process should attempt to decode the config.hpp, mission.sqm, and description.ext. If decoding succeeds, it should run through a set list of check classes, passing in the decoded objects; each class then performs its validation and throws an error if it fails.

Some validation classes that would be needed:

  • LoadoutsCannotContainACRE
  • NameCannotBeARCMF (Strict)
  • NoDuplicateIds
  • CoopMustHaveAI

It would also be useful if each validation class could choose to be strict. A strict check would fail the upload, while a non-strict check would let the upload continue but raise a warning for the mission testers.
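A per-check class with a strict flag could be sketched like this; the interface, the shape of the decoded objects, and the exception type are all assumptions:

```php
<?php

// Hypothetical sketch of the per-check validation classes described above.
interface MissionCheck
{
    // Strict checks fail the upload; non-strict checks only raise a
    // warning for the mission testers.
    public function isStrict();

    // Receives the decoded config objects and throws if the check fails.
    public function check($decoded);
}

class LoadoutsCannotContainACRE implements MissionCheck
{
    public function isStrict()
    {
        return false; // warn, but let the upload continue
    }

    public function check($decoded)
    {
        foreach ($decoded->loadouts as $loadout) {
            // Case-insensitive search for any reference to ACRE.
            if (stripos($loadout, 'acre') !== false) {
                throw new \Exception('Loadouts must not reference ACRE');
            }
        }
    }
}
```

The runner would loop over the list of checks, collecting warnings from non-strict failures and aborting on strict ones.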

@jameslkingsley

Notifications

Discord notifications should be condensed a bit to avoid the long URLs. Web notifications are non-existent at the moment, but should come back. Discord notifications can be handled by Laravel's notification system.
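Routing both web and Discord notifications through Laravel's notification system could look roughly like this sketch; it assumes a community Discord channel package (e.g. laravel-notification-channels/discord), and the class and channel names are illustrative:

```php
<?php

// Hypothetical sketch: one notification class feeding both the web
// (database) channel and a Discord channel.
use Illuminate\Notifications\Notification;

class MissionUpdated extends Notification
{
    private $missionName;

    public function __construct($missionName)
    {
        $this->missionName = $missionName;
    }

    public function via($notifiable)
    {
        // Database backs the web notifications; 'discord' assumes a
        // community channel package is installed and configured.
        return ['database', 'discord'];
    }

    public function toArray($notifiable)
    {
        return ['mission' => $this->missionName];
    }

    public function toDiscord($notifiable)
    {
        // Keep the message short instead of pasting a long raw URL.
        return DiscordMessage::create("Mission updated: {$this->missionName}");
    }
}
```

This would let the condensed Discord message and the restored web notifications share one class per event.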

@jameslkingsley

Design

  • Ability for user to change their colour scheme (should be easy with CSS variables)
    • Preset colours
    • Custom HEX value
    • Dark Mode
