Fixing script bug #3651
Conversation
Commits pushed:
- reordering logic
- Fixing types
- updating setuptools
- upgrading pyyaml
- updating setuptools
- Adding try catch for k8s 1.22
- Trying new setup tools
- Downgrading pyyaml again
- going back to old setuptools version
- using new yaml version
- docker-compose as apt-get installed (repeated across six commits)
- no pinned version docker-compose
- pinning github
- Pinning version
Force-pushed from b7f5fa7 to 9881911
probably don't want to ship this until we've figured out how to get the docker-compose bits working internally
paasta_tools/cli/cmds/spark_run.py
Outdated
@@ -1203,7 +1203,7 @@ def paasta_spark_run(args):
     document = POD_TEMPLATE.format(
         spark_pod_label=limit_size_with_hash(f"exec-{app_base_name}"),
     )
-    parsed_pod_template = yaml.load(document)
+    parsed_pod_template = yaml.full_load(document)
this should probably be safe_load
if anything :)
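For context on the suggestion: PyYAML's `safe_load` restricts parsing to standard YAML types, while `full_load` accepts a wider range of tags. A minimal sketch of the difference (the YAML snippets here are illustrative, not from the PR):

```python
import yaml  # PyYAML; "pip install pyyaml" if missing

# safe_load constructs only standard YAML types (dict, list, str, int, ...),
# so a document cannot smuggle in arbitrary Python object constructors.
assert yaml.safe_load("a: 1") == {"a": 1}

# A constructor tag like !!python/object/apply is rejected by safe_load:
try:
    yaml.safe_load("!!python/object/apply:os.getcwd []")
    raise AssertionError("expected safe_load to reject the python tag")
except yaml.YAMLError:
    pass
```

Since the pod template here is generated from a trusted format string, either loader works, but `safe_load` is the conservative default.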
Added
paasta_tools/kubernetes_tools.py
Outdated
     try:
+        if "beta" in desired_crd.api_version:
should we check for equality rather than substring? e.g., if "v1beta1" == desired_crd.api_version
to make sure we don't accidentally leave future us a timebomb if we somehow forget to clean this up?
I think that's not the whole version; if I'm not mistaken, the version is a longer string, which is why I used "in" instead of matching the whole thing
ack - my understanding is that this should be something like apiextensions/v1beta1 or apiextensions/v1, but as long as we're diligent about cleaning up once this is done then it's not really a big deal :)
@@ -3901,9 +3907,9 @@ def update_crds(
             f"status: {exc.status}, reason: {exc.reason}"
         )
         log.debug(exc.body)
-        success = False
+        return False
this is a slight behavior change (previously, we'd attempt to apply all CRDs regardless of whether any of them failed) - do we potentially rely on the previous behavior anywhere?
you're right. The reason is that we now need every CRD to succeed: the desired_crd list now contains all the CRDs that need to be updated, not the ones that "may get updated", since we now know the apiVersion of each one
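The behavior change under discussion can be sketched as two loop shapes (names are hypothetical, not PaaSTA's actual API):

```python
def update_all_continue(crds, apply_crd):
    """Old behavior: attempt every CRD, report overall success at the end."""
    success = True
    for crd in crds:
        try:
            apply_crd(crd)
        except Exception:
            success = False  # note the failure but keep applying the rest
    return success

def update_all_fail_fast(crds, apply_crd):
    """New behavior: bail out on the first CRD that fails to apply."""
    for crd in crds:
        try:
            apply_crd(crd)
        except Exception:
            return False
    return True
```

With the continue-on-error shape, a failure in the first CRD still lets later CRDs apply; the fail-fast shape stops immediately, which is only safe if every listed CRD is genuinely required to succeed, as the author argues here.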
paasta_tools/setup_kubernetes_crd.py
Outdated
         )
     except ApiException as e:
         existing_crds_v1_beta1 = []
         log.debug(e)
will this debug statement be deleted, or do we want to reword the message to log.debug("Listing CRDs with apiextensions/v1beta1 not supported on this cluster, falling back to v1")?
I like that log message, will change it now
paasta_tools/setup_kubernetes_crd.py
Outdated
         metadata["labels"]["yelp.com/paasta_service"] = service
         metadata["labels"][paasta_prefixed("service")] = service

+        if "beta" in crd_config["apiVersion"]:
same as above, should we maybe be checking for version equality?
added now
tox.ini
Outdated
@@ -90,7 +90,6 @@ basepython = python3.7
 whitelist_externals = bash
 deps =
     urllib3<2.0
-    docker-compose=={[tox]docker_compose_version}
we'll need to figure out what to do internally as well: we'll need the docker-compose deb installed on jenkins + devboxes before we merge this
alternatively, we can continue using docker-compose internally since our internal setup is guaranteed to only hit our pypi + we have things setup internally to grab wheels
...or maybe do the potentially painful thing and start removing the usage of pip-custom-platform in this repo so that we hopefully grab the pypi.org wheels and skip these shenanigans?
how can I achieve the first option? I need to deploy these changes ASAP since this is blocking me from upgrading to k8s 1.22
paasta_tools/cli/cmds/spark_run.py
Outdated
@@ -1203,7 +1203,7 @@ def paasta_spark_run(args):
     document = POD_TEMPLATE.format(
         spark_pod_label=limit_size_with_hash(f"exec-{app_base_name}"),
     )
-    parsed_pod_template = yaml.load(document)
+    parsed_pod_template = yaml.safe(document)
Suggested change:
-    parsed_pod_template = yaml.safe(document)
+    parsed_pod_template = yaml.safe_load(document)
.github/workflows/ci.yml
Outdated
       - name: Install docker-compose
         run: sudo apt-get update && sudo apt-get install -y docker-compose=1.25.0-1
Suggested change (remove these lines):
-      - name: Install docker-compose
-        run: sudo apt-get update && sudo apt-get install -y docker-compose=1.25.0-1
Removed, thanks for the catch
Fixing bug updating more than 1 CRD for operators
Updated python dependencies due to broken pipeline
PAASTA-17925