Conversation

pavithra-suresh

Summary
Adds an exporter that reads a .dat file and writes one JSON file per recording to a specified output directory. The JSON shape is designed for readability, diffs, Mermaid conversion, and downstream tooling.

Why
Make recordings easy to share, review, and automate
Enable Mermaid diagram generation from exported JSON

What’s included
Command entry point to export:
Input: a single .dat file or a directory of .dat files
Output: recording-<id>.json files in the chosen directory
Pretty-printed JSON with stable field ordering
Iterative/streaming traversal to avoid large in-memory trees (see the traversal sketch below)
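
A minimal sketch of what such a traversal could look like, using Jackson's streaming JsonGenerator and an explicit stack instead of recursion. CallNode, its fields, and writeCallTree are illustrative placeholders rather than the actual ulyp types; only the "method" and "children" fields from the example format below are shown.

import com.fasterxml.jackson.core.JsonEncoding
import com.fasterxml.jackson.core.JsonFactory
import com.fasterxml.jackson.core.JsonGenerator
import java.io.File

// Hypothetical call-tree node; the real ulyp types will differ.
data class CallNode(
    val method: String,
    val children: List<CallNode>,
)

// Writes one call tree as nested JSON without materializing the whole
// document in memory and without recursion (explicit stack instead).
fun writeCallTree(root: CallNode, out: File) {
    JsonFactory().createGenerator(out, JsonEncoding.UTF8).use { gen ->
        gen.useDefaultPrettyPrinter()
        // Each stack frame is an iterator over a node's remaining children.
        val stack = ArrayDeque<Iterator<CallNode>>()
        writeNodeStart(gen, root)
        stack.addLast(root.children.iterator())
        while (stack.isNotEmpty()) {
            val children = stack.last()
            if (children.hasNext()) {
                val child = children.next()
                writeNodeStart(gen, child)
                stack.addLast(child.children.iterator())
            } else {
                // No children left: close this node's "children" array and object.
                gen.writeEndArray()
                gen.writeEndObject()
                stack.removeLast()
            }
        }
    }
}

private fun writeNodeStart(gen: JsonGenerator, node: CallNode) {
    gen.writeStartObject()
    gen.writeStringField("method", node.method)
    gen.writeArrayFieldStart("children")
}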

Usage

Single file → multiple JSONs (one per recording)

ulyp-export --in /path/to/recording.dat --out /path/to/out
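
Directory of .dat files → JSONs for each recording in each file (same flags, assuming directory input works as described under "What's included")

ulyp-export --in /path/to/dat-dir --out /path/to/out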

Example JSON format
{
  "recordingId": "0001",
  "startTimeEpochMs": 1757186574142,
  "durationMs": 842,
  "threadName": "http-nio-0.0.0.0-8080-exec-1",
  "totalCalls": 20,
  "root": {
    "method": "com.example.App#run",
    "args": [{"name": "input", "type": "String", "value": "foo"}],
    "return": {"type": "int", "value": 42},
    "thrown": null,
    "children": [
      {
        "method": "com.example.Service#compute",
        "args": [{"type": "int", "value": 7}],
        "return": {"type": "int", "value": 49},
        "children": []
      }
    ]
  }
}
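
For JVM-side downstream tooling, the exported shape maps onto a small set of data classes. A minimal sketch with jackson-module-kotlin; the class and property names mirror the JSON fields above but are otherwise hypothetical, and value is typed as Any? because the example mixes strings and numbers.

import com.fasterxml.jackson.annotation.JsonProperty
import com.fasterxml.jackson.module.kotlin.jacksonObjectMapper
import com.fasterxml.jackson.module.kotlin.readValue
import java.io.File

// Sketch of the exported shape; not part of ulyp itself.
data class ExportedValue(
    val name: String? = null,
    val type: String,
    val value: Any? = null,
)

data class ExportedCall(
    val method: String,
    val args: List<ExportedValue> = emptyList(),
    @JsonProperty("return") val returnValue: ExportedValue? = null,
    val thrown: ExportedValue? = null,
    val children: List<ExportedCall> = emptyList(),
)

data class ExportedRecording(
    val recordingId: String,
    val startTimeEpochMs: Long,
    val durationMs: Long,
    val threadName: String,
    val totalCalls: Int,
    val root: ExportedCall,
)

fun readRecording(file: File): ExportedRecording =
    jacksonObjectMapper().readValue(file)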

Example downstream uses
Generate Mermaid sequence/class diagrams (see the sketch after this list)
Text diffs in PRs to spot behavior changes
Grep/jq for hotspot analysis, error patterns, or timing
Feed BI/reporting pipelines or flaky-test triage scripts
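
As a concrete example of the Mermaid use case above, a rough sketch that turns an exported recording into a Mermaid flowchart, reusing the hypothetical ExportedRecording/ExportedCall classes from the previous sketch. A sequence diagram could be built the same way by emitting caller/callee pairs instead of graph edges.

// One Mermaid node per call, one edge per caller -> callee relation.
fun toMermaid(recording: ExportedRecording): String {
    val lines = mutableListOf("graph TD")
    var nextId = 0
    // Pair each call with the node id Mermaid will know it by.
    val stack = ArrayDeque<Pair<Int, ExportedCall>>()
    stack.addLast(nextId++ to recording.root)
    while (stack.isNotEmpty()) {
        val (id, call) = stack.removeLast()
        // '#' is replaced defensively since Mermaid uses it for entity escapes.
        lines += "  n$id[\"${call.method.replace('#', '.')}\"]"
        for (child in call.children) {
            val childId = nextId++
            lines += "  n$id --> n$childId"
            stack.addLast(childId to child)
        }
    }
    return lines.joinToString("\n")
}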

Tests
Golden test: small .dat → matches expected.json
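
A minimal sketch of that golden test, assuming JUnit 5, fixtures under src/test/resources, and a hypothetical Exporter entry point; the real test depends on whatever API the exporter actually exposes.

import org.junit.jupiter.api.Assertions.assertEquals
import org.junit.jupiter.api.Test
import java.io.File
import java.nio.file.Files

class JsonExportGoldenTest {

    @Test
    fun `small dat file exports to the expected JSON`() {
        val input = File(javaClass.getResource("/fixtures/small.dat")!!.toURI())
        val expected = javaClass.getResource("/fixtures/expected.json")!!.readText()

        val outDir = Files.createTempDirectory("ulyp-export-test").toFile()
        // Exporter stands in for whatever class this PR actually adds.
        Exporter().export(input, outDir)

        val actual = outDir.listFiles()!!.single { it.extension == "json" }.readText()
        assertEquals(expected, actual)
    }
}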

No changes to existing runtime behavior
Export runs only when invoked

@cheb0 (Owner) left a comment

Exporting to JSON is a great idea! However, I'd propose a different way to implement it.

First, the current implementation will create a recording JSON file for every call tree. Not everyone will want that.

Second, there is a caveat here: onRecordingUpdated might be called several times for a particular recording if it's big enough. This was made on purpose so the UI can slowly "digest" a huge (multi-gigabyte) recording file and update the view incrementally.

So, to properly implement JSON export I suggest the following (a rough sketch follows the list):

  • Add a new menu item in PrimaryView.fxml (e.g. "File" -> "Export to JSON")
  • PrimaryView.kt will have a new method to handle the menu event
  • This method will be similar to openRecordingFile: it should create a CallRecordTree instance, which triggers reading the recording file in a background thread
  • A listener (an anonymous class implementing RecordingListener) should capture all Recording instances
  • When reading is complete (there is a CompletableFuture to indicate that: callRecordTree.completeFuture), we can take the call tree root and export it to JSON
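
A rough Kotlin sketch of that flow, for illustration only: CallRecordTree, RecordingListener, Recording and completeFuture are the names used above, but their exact constructors, signatures and fields (id, root) are assumptions here, as is the exportRecordingToJson helper.

import java.io.File
import java.util.concurrent.ConcurrentHashMap

// Would live in PrimaryView.kt as the handler for "File" -> "Export to JSON".
fun onExportToJsonClicked(recordingFile: File, outputDir: File) {
    val recordings = ConcurrentHashMap<Long, Recording>()

    // Capture every Recording instance reported by the background reader.
    // onRecordingUpdated may fire several times per recording, so keying
    // by id keeps only the latest state of each one.
    val listener = object : RecordingListener {
        override fun onRecordingUpdated(recording: Recording) {
            recordings[recording.id] = recording
        }
    }

    // Like openRecordingFile: creating the tree starts reading the .dat
    // file in a background thread and feeds the listener incrementally.
    val callRecordTree = CallRecordTree(recordingFile, listener)

    // Export only once reading has fully completed.
    callRecordTree.completeFuture.thenRun {
        for (recording in recordings.values) {
            // Assumed helper that serializes one call tree root to a JSON
            // file in the shape described in the PR.
            exportRecordingToJson(recording.root, outputDir.resolve("recording-${recording.id}.json"))
        }
    }
}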
