Migrate Asn1EncodedDataRouter to use Spring Kafka #138
Conversation
Generated by sending a TIM to the deposit-TIM endpoint and grabbing the value produced to the `topic.Asn1EncoderOutput` topic via kafka-ui
…TimUnsecured_depositsToSdxTopicAndTimTmcFiltered
…ve DI and testability
…tImpl
These can be used to wrap the calls to security services and allow easier mocking of responses. It also allows us to better shield ourselves from API contract changes: users of the client don't need to be aware of changes to the external API, as all interactions can be encapsulated within the client.
This moves the calls to security services from Asn1CommandManager to SecurityServicesClientImpl, which allows for more testability, flexibility (mocking client calls), and modularity.
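The wrapper pattern described above can be sketched roughly as below. The interface and method names here are illustrative stand-ins, not the actual jpo-ode API:

```java
// Illustrative sketch of hiding an external security service behind a small
// client interface so callers can be unit-tested with a fake implementation.
// Names (SecurityServicesClient, signMessage) are hypothetical stand-ins.
interface SecurityServicesClient {
  String signMessage(String hexEncodedTim);
}

// A production implementation would call the jpo-security-svcs REST API here;
// contract changes in that API stay contained inside this one class.
class SecurityServicesClientImpl implements SecurityServicesClient {
  @Override
  public String signMessage(String hexEncodedTim) {
    throw new UnsupportedOperationException("requires a running security service");
  }
}

public class ClientSketch {
  public static void main(String[] args) {
    // In tests, swap in a fake so no external service is needed.
    SecurityServicesClient fake = hex -> "signed:" + hex;
    System.out.println(fake.signMessage("0012AB")); // prints "signed:0012AB"
  }
}
```

Because the interface has a single method, tests can supply a lambda instead of pulling in a mocking framework.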
…ether (make data dynamic)
…h refactor nor do they confirm behavior
The responsibility to submit data to topics is already contained within Asn1EncodedDataRouter, and Asn1CommandManager is not otherwise responsible for any Kafka interactions, so the responsibility was moved to Asn1EncodedDataRouter.
Updated method signatures and instance usage to specify ResponseEvent with Address as the generic type. This change improves type safety and aligns with best practices for clearer code readability and maintainability.
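The type-safety benefit of parameterizing the event type can be illustrated with simplified stand-in classes (these are not the real `ResponseEvent`/`Address` types, just a sketch of the pattern):

```java
// Simplified stand-ins illustrating why ResponseEvent<Address> beats a raw
// ResponseEvent: the compiler knows the concrete address type, so call sites
// need no cast and type errors surface at compile time.
interface Address { String render(); }

class UdpAddress implements Address {
  private final String hostPort;
  UdpAddress(String hostPort) { this.hostPort = hostPort; }
  public String render() { return "udp:" + hostPort; }
}

class ResponseEvent<A extends Address> {
  private final A peerAddress;
  ResponseEvent(A peerAddress) { this.peerAddress = peerAddress; }
  A getPeerAddress() { return peerAddress; }
}

public class GenericsSketch {
  public static void main(String[] args) {
    ResponseEvent<UdpAddress> event = new ResponseEvent<>(new UdpAddress("127.0.0.1/161"));
    // No cast needed: getPeerAddress() is statically typed as UdpAddress.
    System.out.println(event.getPeerAddress().render()); // prints "udp:127.0.0.1/161"
  }
}
```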
…ests - not working well with other tests
Renamed methods and variables for clarity in message encoding workflows. Updated references in test cases and adjusted test resources to align with naming changes. This improves code readability and better reflects the encoding stages.
This looks good to me & the unit tests pass! Great job on this refactor!
This looks good to me!
```java
/**
 * Sign the encoded TIM message and write to Kafka with an expiration time.
 * Constructs an XML representation of an Advisory Situation Data (ASD) message containing a
 * signed Traveler Information Message (TIM). Processes the provided service request and signed
```
suggestion: the ASD doesn't always have a signed TIM. We should update documentation to reflect it's just a TIM, and ensure our code paths are capable of delivering unsigned
Updated the javadoc comment to remove the "signed" portion in 3a07370. I also read through the execution path and can confirm that the documentation was inaccurate & the code supports both signed and unsigned TIMs.
```java
private String packageSignedTimIntoAsd(ServiceRequest request, String signedMsg) throws JsonProcessingException, ParseException {
  SDW sdw = request.getSdw();
  SNMP snmp = request.getSnmp();
```
suggestion: We're only packing into ASD if we're depositing into the SDX, so we should ignore the SNMP portion here and just take the SDW.
updated: efd5f3d
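The shape of the suggested change, sketched with trivial stand-in model classes (these are not the jpo-ode `ServiceRequest`/`SDW` classes, and the XML output is made up for illustration):

```java
public class AsdPackagingSketch {
  // Trivial stand-ins for the jpo-ode model classes.
  record Sdw(String recordId) {}
  record ServiceRequest(Sdw sdw) {}

  // After the review change: only the SDW portion is consulted when the
  // message is packaged into an ASD for SDX deposit; SNMP is ignored here
  // because ASD packaging only happens on the SDX path.
  static String packageTimIntoAsd(ServiceRequest request, String msg) {
    Sdw sdw = request.sdw();
    return "<asd recordId=\"" + sdw.recordId() + "\">" + msg + "</asd>";
  }

  public static void main(String[] args) {
    ServiceRequest req = new ServiceRequest(new Sdw("r-1"));
    System.out.println(packageTimIntoAsd(req, "TIM")); // prints <asd recordId="r-1">TIM</asd>
  }
}
```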
```java
if (dataSigningEnabledSDW && request.getSdw() != null) {
  hexEncodedTim = signTIMAndProduceToExpireTopic(hexEncodedTim, consumedObj);
}
// no SDW in metadata (SNMP deposit only) -> sign MF -> send to RSU
```
suggestion: this comment is a little misleading, we certainly could have SDW in metadata here and frequently do.
Good callout. I've removed the comment in favor of "self-documenting" code 20a7b0a
```java
hexEncodedTim = mfObj.getString(BYTES);
JSONObject deposit = new JSONObject();
```
question(non-blocking): would it be worth making a POJO for this object that gets sent to the SdwDepositor and on to the SDX?
yeah, it was simple enough to implement so I defined one in 7c03cce
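A minimal version of such a POJO could look like the record below. The field names are illustrative guesses, not necessarily the ones used in commit 7c03cce:

```java
public class DepositSketch {
  // Illustrative POJO replacing an ad-hoc JSONObject for the SDX deposit
  // payload. Field names are hypothetical; in the real code a serializer
  // (e.g. Jackson) would turn this into the JSON the SdwDepositor expects.
  record DepositRequest(String encodedMsg, String estimatedRemovalDate) {}

  public static void main(String[] args) {
    DepositRequest deposit = new DepositRequest("0012AB", "2030-01-01T00:00:00Z");
    // Records give us equals/hashCode/toString and immutability for free,
    // which makes test assertions simpler than comparing raw JSON strings.
    System.out.println(deposit.encodedMsg()); // prints "0012AB"
  }
}
```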
…/spring-kafka/asn1-encoded-router
…sn1EncodedDataRouter
…eSignedTimIntoAsd
Everything passes tests and works for the most part. The one thing that seems to have changed is that TIM messages are no longer being written to the `topic.OdeTimJsonTMCFiltered` topic. This happens no matter how you configure the jpo-ode to sign TIMs, which seems incorrect. Am I missing something?
Re-tested with the security module running and everything is working correctly. My mistake on not running the sec module! Having both RSU and SDW set to unsigned still results in this behavior, but this is apparently how the existing functionality works too.
Everything looks to be good to go.
…h from Asn1EncodedDataRouter" This reverts commit a00ab06.
…sitTim when dataSigningEnabledSDW is false
Eliminated processing logic and tests related to unsigned TIM messages, as well as the conditional ASD element addition when dataSigningEnabledSDW is false. This streamlines the codebase by ensuring all TIM messages follow the secured flow, improving maintainability and alignment with production recommendations.
Added/replaced logic to handle RSU XML-to-JSON array normalization in `processRsusIfPresent`. Refactored metadata handling in `processUnsignedMessage` and enhanced test cases to include RSU-related input/output scenarios. This ensures proper handling of RSU data while improving test robustness.
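The pitfall behind that normalization: most XML-to-JSON converters turn a single repeated child element into a lone object and several into an array, so downstream code must coerce both shapes to a list. A stdlib-only sketch of the idea (the real `processRsusIfPresent` operates on org.json objects):

```java
import java.util.List;
import java.util.Map;

public class RsuNormalizeSketch {
  // If the converted value is a single object (Map), wrap it in a list;
  // if it is already a list, pass it through unchanged.
  @SuppressWarnings("unchecked")
  static List<Map<String, Object>> normalizeRsus(Object converted) {
    if (converted instanceof Map) {
      return List.of((Map<String, Object>) converted);
    }
    return (List<Map<String, Object>>) converted;
  }

  public static void main(String[] args) {
    Object single = Map.of("rsuTarget", "10.0.0.1");
    Object many = List.of(Map.of("rsuTarget", "10.0.0.1"), Map.of("rsuTarget", "10.0.0.2"));
    System.out.println(normalizeRsus(single).size()); // prints 1
    System.out.println(normalizeRsus(many).size());   // prints 2
  }
}
```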
Looks really good! I like the updates here. One other question: I had to update the jpo-security-svcs module reference in the `.gitmodules` file. Should this PR modify that reference so it is in line with what the current ODE expects?
Passed Checks:
- Tests pass
- Log files process properly
- All services start up through docker-compose
- REST-created TIMs for SDW deposit function properly
- REST-created TIMs for SNMP deposit function properly
…/spring-kafka/asn1-encoded-router
Co-authored-by: Michael7371 <[email protected]>
PR Details
Description
`Date` class replaced with usages of `LocalDateTime` (only in Asn1EncodedDataRouter) to allow for explicit usage of the UTC timezone instead of relying on the system settings (they aren't consistent between local and CI, so it's reasonable to suspect there is inconsistency between hosting environments).

NOTE: The interactions with the security service rely on the changes contained within CDOT-CV/jpo-security-svcs#16. The jpo-security-svcs submodule points to the CDOT-CV dev branch in this PR.
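The UTC-vs-system-default pitfall is easy to demonstrate with `java.time`: injecting a `Clock` pinned to UTC makes timestamps independent of the host's zone settings, and a fixed clock makes tests deterministic. A minimal sketch (method names are illustrative):

```java
import java.time.Clock;
import java.time.Instant;
import java.time.ZoneOffset;
import java.time.ZonedDateTime;

public class UtcClockSketch {
  // Format "now" from an injected clock instead of the system default zone,
  // so local machines and CI agree regardless of their TZ settings.
  static String isoTimestamp(Clock clock) {
    return ZonedDateTime.now(clock).toInstant().toString();
  }

  public static void main(String[] args) {
    // A fixed UTC clock makes the result fully deterministic for tests;
    // production code would pass Clock.systemUTC() instead.
    Clock fixed = Clock.fixed(Instant.parse("2024-01-02T03:04:05Z"), ZoneOffset.UTC);
    System.out.println(isoTimestamp(fixed)); // prints 2024-01-02T03:04:05Z
  }
}
```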
Related Issue
No related USDOT issue
Motivation and Context
Implementing Spring Kafka gives us better lifecycle management of producers and consumers, more reusable producer/consumer code, easier testability, and a more robust production-ready Kafka library. This is part of a more significant effort to replace our hand-rolled Kafka implementation with Spring Kafka. The previous changesets related to this effort are:
How Has This Been Tested?
For the live testing, I set the following environment variables on my system to enable signing and confirm all pieces played well together
To test the signing flow, I produced the test XML to the `topic.Asn1EncoderOutput` topic via the kafka-ui.

Snippet from the locals during local testing:
Types of changes
Checklist:
ODE Contributing Guide
Helpful Documentation