Created unified text file tokenizer.
We have multiple formats with very similar syntax: rig def (truck), odef, tobj, character (see RigsOfRods#2942). This new parser supports all these formats and adds new features:
- Support for "quoted strings with spaces". I first implemented them in the new .character file format and decided I want them everywhere, notably in .tobj, so that procedural roads can have custom names and descriptions for a GPS-like navigation system.
- Editable memory representation. The document is a vector of Tokens (types: linebreak, comment, keyword, string, number, boolean) which keeps the exact sequence as in the file. Tokens can easily be added, modified, or removed without writing any extra custom code (see the sketch after this list).
- Serializability. Saving the (possibly modified) file is as simple as looping through the Token array and writing each token to a file. No custom code needs to be written for any file format.
- Ease of binding to AngelScript: a single API can modify any file format. The only operations the user needs are 1. insert token, 2. modify token, 3. delete token.

Code changes:
- TObjFileFormat.h: replaced char[] with std::string; renamed TObjFile to TObjDocument and TObjParser to TObjReader.
- TObjFileFormat.cpp: replaced scanf() with the new Reader API.
- TerrainObjectManager.h: replaced char* with std::string in function arguments.
- TerrainObjectManager.cpp: uses the new Document API.
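For illustration, here is a minimal sketch of what such a token-based document representation and its single-loop serializer could look like. The names (Token, TokenType, Document, WriteToStream) and the comment/quoting syntax are illustrative assumptions, not the actual Rigs of Rods API:

```cpp
// Sketch only -- Token/TokenType/Document/WriteToStream are illustrative names,
// not the actual Rigs of Rods classes.
#include <iostream>
#include <ostream>
#include <string>
#include <vector>

enum class TokenType { LINEBREAK, COMMENT, KEYWORD, STRING, NUMBER, BOOLEAN };

struct Token
{
    TokenType type;
    std::string text;    // verbatim text of comments, keywords and strings
    float number = 0.f;  // value of NUMBER tokens
    bool boolean = false;
};

// The document keeps the exact token sequence as it appears in the file.
using Document = std::vector<Token>;

// Serialization is a single loop over the token array -- no per-format code.
void WriteToStream(const Document& doc, std::ostream& os)
{
    for (const Token& tok : doc)
    {
        switch (tok.type)
        {
        case TokenType::LINEBREAK: os << '\n';                                break;
        case TokenType::COMMENT:   os << "// " << tok.text << ' ';            break; // comment syntax assumed
        case TokenType::KEYWORD:   os << tok.text << ' ';                     break;
        case TokenType::STRING:    os << '"' << tok.text << "\" ";            break; // quoted, may contain spaces
        case TokenType::NUMBER:    os << tok.number << ' ';                   break;
        case TokenType::BOOLEAN:   os << (tok.boolean ? "true " : "false ");  break;
        }
    }
}

int main()
{
    // Editing is plain vector manipulation: insert, modify, or erase tokens.
    Document doc = {
        { TokenType::KEYWORD, "road" },
        { TokenType::STRING,  "Main street with spaces" },
        { TokenType::NUMBER,  "", 42.f },
        { TokenType::LINEBREAK }
    };
    doc[2].number = 43.f; // modify a token in place

    WriteToStream(doc, std::cout);
}
```

Because the whole document is just this flat token array, the same three operations (insert, modify, delete) cover every supported file format, which is what makes a single AngelScript binding possible.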