
feat(uptime): Propagate checker configs to redis as well as kafka #84151

Merged · 2 commits merged into master on Jan 28, 2025

Conversation

wedamija (Member)

This propagates our checker configs to our new Redis config store. For the moment we're dual-writing; we'll remove the Kafka code path once the Redis store is stable and running in prod.
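
For orientation, a rough sketch of what a dual write could look like is below. Everything in it (the helper name, the Kafka topic, and the Redis key layout) is an illustrative assumption rather than the code in this PR, except get_partition_keys, which appears in the diff further down.

# Illustrative sketch only: names, topic, and key layout are assumptions.
from uuid import UUID

import msgpack


def publish_checker_config(
    kafka_producer,  # e.g. a confluent_kafka.Producer (assumed)
    redis_cluster,   # e.g. a redis-py client/cluster (assumed)
    subscription_id: UUID,
    config: dict,
) -> None:
    payload = msgpack.packb(config)

    # Existing path: keep producing to Kafka until the Redis store is
    # stable and running in prod.
    kafka_producer.produce("uptime-configs", key=subscription_id.hex, value=payload)

    # New path: also write into the Redis config store. Configs live in a
    # per-partition hash; an update entry signals checkers that something changed.
    config_key, update_key = get_partition_keys(subscription_id)
    pipe = redis_cluster.pipeline()
    pipe.hset(config_key, subscription_id.hex, payload)
    pipe.hset(update_key, subscription_id.hex, msgpack.packb(
        {"action": "upsert", "subscription_id": subscription_id.hex}
    ))
    pipe.execute()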

@wedamija wedamija requested a review from a team as a code owner January 28, 2025 00:59
@github-actions github-actions bot added the Scope: Backend Automatically applied to PRs that change backend components label Jan 28, 2025

codecov bot commented Jan 28, 2025

❌ 1 test failed:

Tests completed: 23685 | Failed: 1 | Passed: 23684 | Skipped: 292

Top failed test (by shortest run time):

tests.sentry.api.test_api_owners.APIOwnersTestCase::test_api_owner_owns_api
Stack trace (0.041s run time):

.../sentry/api/test_api_owners.py:22: in test_api_owner_owns_api
    assert owner.value in self.teams
E   AssertionError: assert 'open-source' in {'alerts-create-issues', 'alerts-notifications', 'core-product-foundations', 'crons', 'dashboards', 'data', ...}
E    +  where 'open-source' = <ApiOwner.OPEN_SOURCE: 'open-source'>.value
E    +  and   {'alerts-create-issues', 'alerts-notifications', 'core-product-foundations', 'crons', 'dashboards', 'data', ...} = <tests.sentry.api.test_api_owners.APIOwnersTestCase testMethod=test_api_owner_owns_api>.teams


Comment on lines +82 to +84
def get_partition_keys(subscription_id: UUID) -> tuple[str, str]:
partition = get_partition_from_subscription_id(subscription_id)
return f"uptime:configs:{partition}", f"uptime:updates:{partition}"
Member

nit: I'd sort of prefer this to be two functions, just so the naming is a bit clearer.

Though you can pretty much tell what the unwrapped values are from the return strings and from the call site.
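
A sketch of the two-function split the reviewer has in mind might read as follows; the key formats are taken from the snippet above, and the function names are just one possible choice.

# Sketch of the reviewer's suggestion: two single-purpose helpers instead of
# one function returning a tuple. Key formats come from the snippet above.
from uuid import UUID


def get_partition_config_key(subscription_id: UUID) -> str:
    partition = get_partition_from_subscription_id(subscription_id)
    return f"uptime:configs:{partition}"


def get_partition_update_key(subscription_id: UUID) -> str:
    partition = get_partition_from_subscription_id(subscription_id)
    return f"uptime:updates:{partition}"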

pipe = cluster.pipeline()
if value is None:
pipe.hdel(config_key, key)
action = "delete"
Member

It would be nice to have a type for this, maybe a dataclass that knows how to serialize itself:

from dataclasses import dataclass
from typing import Literal
from uuid import UUID

import msgpack


@dataclass
class ConfigUpdate:
    action: Literal["delete", "upsert"]
    subscription_id: UUID

    def as_msgpack(self) -> bytes:
        return msgpack.packb(
            {
                "action": self.action,
                "subscription_id": self.subscription_id.hex,
            }
        )

Member

It could potentially be Redis-aware as well:

@property
def redis_update_key(self) -> str:
    # i.e. another member of the ConfigUpdate dataclass sketched above
    partition = get_partition_from_subscription_id(self.subscription_id)
    return f"uptime:updates:{partition}"

Just a thought
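
Pulling the two suggestions together, the write path in the diff above might then look roughly like this. This is a sketch only: it assumes the ConfigUpdate dataclass and redis_update_key property from the comments above, and the hash layout for the updates key is a guess.

# Sketch combining the suggestions above; `cluster`, `config_key`, `key`,
# `value`, and `subscription_id` are assumed to be in scope as in the diff.
update = ConfigUpdate(
    action="delete" if value is None else "upsert",
    subscription_id=subscription_id,
)

pipe = cluster.pipeline()
if value is None:
    pipe.hdel(config_key, key)
else:
    pipe.hset(config_key, key, value)
# Record the update next to the config so checkers can pick up the change
# (the actual data structure used for updates may differ).
pipe.hset(update.redis_update_key, key, update.as_msgpack())
pipe.execute()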

@evanpurkhiser evanpurkhiser force-pushed the danf/uptime-config-propagate-redis branch from 11594aa to 4676e0a on January 28, 2025 21:13
@wedamija wedamija merged commit 6356183 into master Jan 28, 2025
48 checks passed
@wedamija wedamija deleted the danf/uptime-config-propagate-redis branch January 28, 2025 21:41
andrewshie-sentry pushed a commit that referenced this pull request Jan 29, 2025
c298lee pushed a commit that referenced this pull request Jan 29, 2025
Labels: Scope: Backend (automatically applied to PRs that change backend components)

2 participants