
migrate cluster info module from community #80

Conversation

mikemorency
Collaborator

SUMMARY

This migrates the vmware_cluster_info module from the community.vmware collection to this collection.
The inputs are mostly the same; one input was renamed to gather_tags to match the guest_info module.
The outputs are also mostly the same, but there are some changes worth noting:

  • I added DPM facts (they were not previously gathered)
  • I fixed the DPM and DRS rate outputs so they match what is shown in the UI. Previously, the opposite value was shown
  • I standardized the "enabled" outputs. They are now named <service>_enabled, where service can be dpm, drs, ha, etc. (see the sketch below)
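For illustration, a returned facts payload under the new naming could look roughly like this. This is only a sketch; key names beyond the <service>_enabled pattern and the two rate keys mentioned elsewhere in this PR are assumptions, not the module's exact output:

output_facts = {
    "dpm_enabled": False,
    "drs_enabled": True,
    "ha_enabled": True,
    # rates are flipped so they match the vSphere UI (see the rate discussion further down)
    "drs_vmotion_rate": 3,
    "dpm_host_power_action_rate": 3,
}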
ISSUE TYPE
  • Feature Pull Request
COMPONENT NAME

cluster_info


codecov bot commented Nov 6, 2024

Codecov Report

Attention: Patch coverage is 27.09677% with 113 lines in your changes missing coverage. Please review.

Project coverage is 27.19%. Comparing base (013afa4) to head (bd3d8c4).
Report is 30 commits behind head on main.

Files with missing lines                        Patch %   Lines
plugins/module_utils/_vmware_facts.py           18.60%    70 Missing ⚠️
plugins/modules/cluster_info.py                 36.66%    38 Missing ⚠️
plugins/module_utils/_vmware_rest_client.py     33.33%    2 Missing ⚠️
plugins/module_utils/_vmware_folder_paths.py    50.00%    1 Missing ⚠️
plugins/modules/cluster_dpm.py                  50.00%    1 Missing ⚠️
plugins/modules/cluster_drs.py                  50.00%    1 Missing ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##             main      #80      +/-   ##
==========================================
+ Coverage   26.09%   27.19%   +1.10%     
==========================================
  Files          19       23       +4     
  Lines        1667     2015     +348     
  Branches      331      375      +44     
==========================================
+ Hits          435      548     +113     
- Misses       1232     1467     +235     
Flag      Coverage                Δ
sanity    27.19% <27.09%>         (+1.10%) ⬆️

Collaborator

@anna-savina anna-savina left a comment


LGTM

@mikemorency mikemorency force-pushed the mm-feature/migrate-cluster-info branch from cdb2d81 to 5313bde on November 11, 2024 16:11
for host in self.cluster.host:
    hosts.append({
        'name': host.name,
        'folder': get_folder_path_of_vm(host),
Collaborator

The function is called get_folder_path_of_vm but it gets a host... why?

Collaborator Author
@mikemorency mikemorency Nov 19, 2024

I think when it was copied it just looked up VM paths, but now it's more generic.
I updated the name to get_folder_path_of_vsphere_object
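For context, a minimal sketch of what a generic folder-path helper can look like. This is illustrative only; the actual implementation lives in plugins/module_utils/_vmware_folder_paths.py and may differ in detail:

def get_folder_path_of_vsphere_object(obj):
    # Walk the parent references that vSphere managed objects expose,
    # collecting folder names from the object up to the root, then
    # reverse them to build a path.
    parts = []
    parent = getattr(obj, "parent", None)
    while parent is not None:
        name = getattr(parent, "name", None)
        if name:
            parts.append(name)
        parent = getattr(parent, "parent", None)
    return "/" + "/".join(reversed(parts))

Because it only relies on the parent chain, the same helper works for hosts as well as VMs, which is what the rename reflects.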

try:
    output_facts["dpm_host_power_action_rate"] = 6 - int(dpm_config.hostPowerActionRate)
except (TypeError, AttributeError):
    output_facts["dpm_host_power_action_rate"] = 3
Collaborator

Can you use constants here (for the value 3 as a default, and for 6)?

Collaborator Author

I made a constant for the default value, and a static method for the adjustment (6 - whatever).
I updated the drs and dpm modules to use those instead of having their own copies.
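A minimal sketch of that refactor, with hypothetical names (the actual constant and static method in the PR may be named differently):

RATE_DEFAULT = 3  # assumed default when vCenter does not report a rate

def flip_rate(rate):
    # vCenter stores the DRS/DPM rate inverted relative to the UI
    # (an API value of 1 shows as 5 in the client and vice versa),
    # so the modules flip the value before returning or comparing it.
    try:
        return 6 - int(rate)
    except (TypeError, ValueError):
        return RATE_DEFAULT

# Example: an API value of 5 corresponds to 1 in the vSphere UI,
# and a missing value falls back to the default.
assert flip_rate(5) == 1
assert flip_rate(None) == RATE_DEFAULT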

    datacenter_name: datacenter
  register: _out

- name: Gather Specific Properties About a Cluster
Collaborator

If there are multiple datacenters, what will it return? The first one? Please explain.
I'm not sure why you would use it without a DC name...

Collaborator Author

If there are multiple datacenters and the user doesn't supply a datacenter name, this will search for clusters in every datacenter.
If the user does supply a name, there can only ever be one match, since each datacenter must have a unique name (per VMware).

The community version has the datacenter as optional. I guess you might want to gather info on every cluster you have, even if you have multiple datacenters?

- name: Check DRS Settings Were Applied
  ansible.builtin.assert:
    that:
      - _idempotence_check is not changed
      - _config.drs_default_vm_behavior == drs_default_vm_behavior
      - _config.drs_enable_vm_behavior_overrides == drs_enable_vm_behavior_overrides
      - _config.drs_vmotion_rate == (6 - drs_vmotion_rate)
      - _config.enabled_drs == drs_enable
      - _config.drs_vmotion_rate == drs_vmotion_rate
Collaborator

Why did you remove the calculation? I think it was a bug in the past that @shellymiron fixed.

Collaborator Author

It's no longer needed; the new cluster_info module does the calculation before returning the value.
