Fixing script bug #3651

Merged: wilmer05 merged 6 commits into master from u/wilmerrafael/PAASTA_17925_fix_paasta_bug_for_crds on Jul 18, 2023

Conversation

@wilmer05 (Contributor) commented on Jul 18, 2023:

Fixing a bug when updating more than one CRD for operators.
Updated Python dependencies due to a broken pipeline.

PAASTA-17925

reordering logic

Fixing types

updating setuptools

upgrading pyyaml

updating setuptools

Adding try catch for k8s 1.22

Trying new setup tools

Downgrading pyyaml again

going back to old setuptools version

using new yaml version

docker-compose as apt-get installed

docker-compose as apt-get installed

docker-compose as apt-get installed

docker-compose as apt-get installed

docker-compose as apt-get installed

docker-compose as apt-get installed

no pinned version docker-compose

pinning github

Pinning version
@wilmer05 force-pushed the u/wilmerrafael/PAASTA_17925_fix_paasta_bug_for_crds branch from b7f5fa7 to 9881911 on July 18, 2023 16:08
@nemacysts (Member) left a comment:

probably don't want to ship this until we've figured out how to get the docker-compose bits working internally

@@ -1203,7 +1203,7 @@ def paasta_spark_run(args):
document = POD_TEMPLATE.format(
spark_pod_label=limit_size_with_hash(f"exec-{app_base_name}"),
)
- parsed_pod_template = yaml.load(document)
+ parsed_pod_template = yaml.full_load(document)
Member:

this should probably be safe_load if anything :)

Contributor (Author):

Added
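
For reference, a minimal sketch (not the code in this PR) of the loader behavior being discussed, assuming PyYAML 5.1+ where both full_load and safe_load exist; the document here is a toy stand-in for POD_TEMPLATE:

import yaml

document = """
apiVersion: v1
kind: Pod
"""

# full_load resolves all standard YAML tags; safe_load restricts parsing to
# plain YAML types (mappings, sequences, scalars), which is the safer choice
# for a templated pod spec like this one.
parsed_pod_template = yaml.safe_load(document)
print(parsed_pod_template["kind"])  # -> Pod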

try:

if "beta" in desired_crd.api_version:
Member:

should we check for equality rather than substring? e.g., if "v1beta1" == desired_crd.api_version to make sure we don't accidentally leave future us a timebomb if we somehow forget to clean this up?

Contributor (Author):

I think that is not the full version; the version is a longer string if I'm not mistaken, which is why I used "in" instead of matching the whole thing

Member:

ack - my understanding is that this should be something like apiextensions/v1beta1 or apiextensions/v1, but as long as we're diligent about cleaning up once this is done then it's not really a big deal :)
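
For illustration, a hedged sketch of what an exact-match check could look like; the apiextensions.k8s.io group prefix is an assumption based on the note above, not taken from the PR diff:

# Illustrative only: compare the full apiVersion string instead of a
# substring, so a hypothetical future "v2beta1" cannot silently match.
def is_v1beta1(api_version: str) -> bool:
    return api_version == "apiextensions.k8s.io/v1beta1"

print(is_v1beta1("apiextensions.k8s.io/v1beta1"))  # True
print(is_v1beta1("apiextensions.k8s.io/v1"))       # False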

@@ -3901,9 +3907,9 @@ def update_crds(
f"status: {exc.status}, reason: {exc.reason}"
)
log.debug(exc.body)
- success = False
+ return False
Member:

this is a slight behavior change (previously, we'd attempt to apply all CRDs regardless of whether any of them failed) - do we potentially rely on the previous behavior anywhere?

Contributor (Author):

you're right, the reason is: we need every CRD to succeed now, since the desired_crd list now contains all of the CRDs that need to be updated (not ones that "may get updated"), because we now know the apiVersion of each
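
To make the behavior change concrete, a simplified sketch with a hypothetical apply_crd callable standing in for the real update call (this is not the code in update_crds):

from typing import Callable, Iterable

def update_all_crds_old(crds: Iterable, apply_crd: Callable[[object], bool]) -> bool:
    # Old behavior: attempt every CRD and only report the overall result at the end.
    success = True
    for crd in crds:
        if not apply_crd(crd):
            success = False
    return success

def update_all_crds_new(crds: Iterable, apply_crd: Callable[[object], bool]) -> bool:
    # New behavior: stop at the first failure, since every CRD in the desired
    # list is now expected to be updated.
    for crd in crds:
        if not apply_crd(crd):
            return False
    return True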

)
except ApiException as e:
    existing_crds_v1_beta1 = []
    log.debug(e)
Member:

will this debug statement be deleted, or do we want to reword this message to log.debug("Listing CRDs with apiextensions/v1beta1 not supported on this cluster, falling back to v1")?

Contributor (Author):

I like that log message, will change it now
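
A rough sketch of the fallback being discussed, using the reworded debug message; list_v1beta1_crds is a hypothetical stand-in for the real apiextensions list call, and the logger setup is assumed:

import logging

from kubernetes.client.rest import ApiException

log = logging.getLogger(__name__)

def get_existing_crds_v1_beta1(list_v1beta1_crds):
    try:
        return list_v1beta1_crds()
    except ApiException:
        # On clusters where the v1beta1 apiextensions API has been removed
        # (e.g. Kubernetes 1.22+), fall back to an empty list and rely on the
        # v1 code path instead.
        log.debug(
            "Listing CRDs with apiextensions/v1beta1 not supported on this "
            "cluster, falling back to v1"
        )
        return []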

metadata["labels"]["yelp.com/paasta_service"] = service
metadata["labels"][paasta_prefixed("service")] = service

if "beta" in crd_config["apiVersion"]:
Member:

same as above, should we maybe be checking for version equality?

Contributor (Author):

added now

paasta_tools/setup_kubernetes_crd.py (conversation resolved)
tox.ini (outdated diff):
@@ -90,7 +90,6 @@ basepython = python3.7
whitelist_externals = bash
deps =
urllib3<2.0
- docker-compose=={[tox]docker_compose_version}
Member:

we'll need to figure out what to do internally as well: we'll need the docker-compose deb installed on jenkins + devboxes before we merge this

Member:

alternatively, we can continue using docker-compose internally since our internal setup is guaranteed to only hit our pypi + we have things setup internally to grab wheels

...or maybe do the potentially painful thing and start removing the usage of pip-custom-platform in this repo so that we hopefully grab the pypi.org wheels and skip these shenanigans?

Contributor (Author):

how can I achieve the first option? I need to deploy these changes ASAP since this is blocking me from upgrading to k8s 1.22

@@ -1203,7 +1203,7 @@ def paasta_spark_run(args):
document = POD_TEMPLATE.format(
spark_pod_label=limit_size_with_hash(f"exec-{app_base_name}"),
)
- parsed_pod_template = yaml.load(document)
+ parsed_pod_template = yaml.safe(document)
Member:

Suggested change:
- parsed_pod_template = yaml.safe(document)
+ parsed_pod_template = yaml.safe_load(document)

Comment on lines 39 to 40:
- name: Install docker-compose
  run: sudo apt-get update && sudo apt-get install -y docker-compose=1.25.0-1
Member:

Suggested change: remove the two lines above.

Contributor (Author):

Removed, thanks for the catch

@wilmer05 merged commit c596569 into master on Jul 18, 2023 (1 of 6 checks passed).