Round-tripping a VM definition using PowerShell 5.1 in Calm results in HTTP error 400 | Nutanix Community

Hi all!

This might not be a Nutanix Calm specific problem, but since I’m experiencing the problem within Calm I’m posting a question here (as well). :)

I'm grabbing a VM definition using GET https://${PCHostAddress}:9440/api/nutanix/v3/vms/$VMUUID. I convert the payload to an object using ConvertFrom-Json, manipulate it using .Remove() to remove the Status section (among other things), and convert it back to JSON with ConvertTo-Json.
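Roughly, the round trip described above looks like this (a hedged sketch, not the original code; $PCHostAddress, $VMUUID and $Cred are assumed to be set up elsewhere, and the exact .Remove() call depends on the object shape):

```powershell
$uri  = "https://${PCHostAddress}:9440/api/nutanix/v3/vms/$VMUUID"
$resp = Invoke-WebRequest -Uri $uri -Method Get -Credential $Cred
$vm   = $resp.Content | ConvertFrom-Json          # JSON text -> PSCustomObject
$vm.PSObject.Properties.Remove('status')          # drop the status section
$body = $vm | ConvertTo-Json -Depth 100           # back to JSON text
Invoke-RestMethod -Uri $uri -Method Put -Body $body -ContentType 'application/json'
```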

When I send the modified payload back using PUT https://${PCHostAddress}:9440/api/nutanix/v3/vms/$VMUUID, I get a "(400) Bad Request." Thinking I screwed something up when modifying the payload, I removed the part of the code that changes the object but still got the same error, even though I'm "just" converting the unmodified payload from and to JSON.

Sending back the payload without converting from and to JSON results in a "(422) UNPROCESSABLE ENTITY", as it should.

In this case the -Depth parameter isn't the culprit as far as I can tell (I've added -Depth 100 just to be sure). I also output the payload before the ConvertFrom-Json and after the ConvertTo-Json operations, and the output looks identical on screen.

If I copy the output after the modifications and conversions from and to JSON and then paste that as the body in a Postman call - where all the other parameters are the same as in my code - the API call works as expected.

The aplos.out log on the Prism Central server contains only this error: "BadRequest: 400 Bad Request: Failed to decode JSON object: 'utf8' codec can't decode byte 0xf6 in position 2976: invalid start byte"

This makes me think that the conversion changes the encoding of the string but I can't for the life of me figure out how that would be possible.
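For what it's worth, 0xf6 is exactly the single-byte Latin-1/Windows-1252 encoding of 'ö', which is not a valid start byte in UTF-8. A minimal sketch (not part of the original post) that shows the difference:

```powershell
# 'ö' as a single Latin-1 byte vs. the two-byte UTF-8 sequence
[System.Text.Encoding]::GetEncoding('ISO-8859-1').GetBytes('ö')   # 246 (0xF6)
[System.Text.Encoding]::UTF8.GetBytes('ö')                        # 195 182 (0xC3 0xB6)
```

So a lone 0xf6 in the request body suggests the string was sent in the default codepage rather than as UTF-8.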

Again, this might be a PowerShell 5.1 specific error but maybe this is something that someone else here might’ve stumbled upon previously.

Hi Martin,

I’m not very familiar with PoSH, but let’s see if I can help somehow.

When I see this kind of strange behaviour, where a call works from one place (Postman in your case) but not from another (PoSH), and the payload looks good when printed out, I move to an API mock interceptor to see what the system is actually receiving. For this sort of troubleshooting I use https://beeceptor.com/; there are others, but this one is simple and straightforward. Create an endpoint, test your API call from PoSH against it, and you’ll see exactly what is being received, so you can compare whether PoSH is adding or modifying something in the payload it sends.
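In PoSH terms that could look something like the following (the endpoint URL is hypothetical; create your own on beeceptor.com first, and $body is assumed to be the JSON string from the earlier conversion):

```powershell
# Point the same PUT at the mock endpoint and compare what arrives
$mock = 'https://my-test-endpoint.free.beeceptor.com/api/nutanix/v3/vms/test'
Invoke-RestMethod -Uri $mock -Method Put -Body $body -ContentType 'application/json'
# Then inspect the received body in the Beeceptor dashboard
```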


Hi Jose!

Thanks, I didn’t think of that option but I’ll absolutely give it a try and report back.


After using the Beeceptor mock API endpoint I found that PSH changes the character formatting: in this case it took an ‘ö’ and changed it to the escape sequence \u00f6, something that the V3 API can’t handle.

I can’t say if this is a bug or not as I’m not sure if the V3 API is meant to handle UTF-8 or not, or at least UTF-8 formatted this way.

Is there a way to get the V3 API to understand this formatting?


Hi Martin,

Send the call without UTF-8. I just did a test and it worked for me. Here’s a short snippet (the VM name includes an ö):

 

curl -X PUT --header "Content-Type: application/json" --header "Accept: application/json" -d "{
  \"spec\": {
    \"name\": \"MöSQL-211221-163006\",
    \"resources\": {
      \"num_threads_per_core\": 1,
      \"vnuma_config\": {
        \"num_vnuma_nodes\": 0
...

 


Hi Jose.

That would work but the problem is that the API gives me the definition like this:

"subnet_reference": {
  "kind": "subnet",
  "name": "HCS T2HC Testkund - Milj� A",
  "uuid": "66e4f9c5-b436-4408-956f-958e02a18f9c"
}

I can get rid of the Unicode escapes by using $body = [regex]::Unescape($($VM | ConvertTo-Json -Depth 100)).

The result of this is, as far as I can tell, the exact same JSON as the API gives me, but if I send it back the API still freaks out. So, my next task is to find a way to get rid of �. :)
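For reference, [regex]::Unescape does turn the JSON escape back into the literal character:

```powershell
# \u00f6 becomes a literal 'ö' in the resulting string
[regex]::Unescape('Milj\u00f6')   # -> Miljö
```

Which may be exactly why the PUT then fails: the literal ö gets sent in the default codepage as the lone 0xf6 byte seen in the aplos log (an inference from the thread, not a confirmed diagnosis).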


Hi Martin,

Are you sending the GET request with UTF-8? If I do a GET of the call I shared, I get the VM name back OK. If you are doing the GET with UTF-8, remove it too.


Hi.

I didn’t actually know you could specify the encoding for an API call. Where is that done?


Hi Martin,

Here’s an example of a working call. Make sure you keep the Unicode escapes from the GET call as-is, and then, when sending the PUT, set the Content-Type as in the example. Mine uses ö in the VM name as well as in the subnet name.

curl 'https://PC_IP:9440/api/nutanix/v3/vms/VM_UUID' \
--insecure \
-u admin:nutanix/4u \
-X 'PUT' \
-H 'accept: application/json' \
-H 'content-type: application/json' \
--data '{
  "spec": {
    "cluster_reference": {
      "kind": "cluster",
      "name": "Durham_AOS",
      "uuid": "ff7e6b2e-8238-459e-8a43-6f30c95e7efe"
    },
    "resources": {
      "num_threads_per_core": 2,
      "vnuma_config": {
        "num_vnuma_nodes": 0
      },
      "serial_port_list": [],
      "nic_list": [
        {
          "nic_type": "NORMAL_NIC",
          "uuid": "762e4611-96d0-47e1-8c71-4e25847a9cd7",
          "ip_endpoint_list": [],
          "vlan_mode": "ACCESS",
          "mac_address": "50:6b:8d:fb:60:b5",
          "subnet_reference": {
            "kind": "subnet",
            "name": "HCS T2HC Testkund - Milj\u00f6 A",
            "uuid": "59cee28b-28d5-4ba5-a0a2-9ce3d16a64fc"
          },
          "is_connected": true,
          "trunked_vlan_list": []
        }
      ],
      "num_vcpus_per_socket": 1,
      "num_sockets": 1,
      "gpu_list": [],
      "hardware_virtualization_enabled": false,
      "is_agent_vm": false,
      "memory_size_mib": 4096,
      "boot_config": {
        "boot_device_order_list": [
          "CDROM",
          "DISK",
          "NETWORK"
        ],
        "boot_type": "LEGACY"
      },
      "hardware_clock_timezone": "UTC",
      "power_state_mechanism": {
        "guest_transition_config": {
          "should_fail_on_script_failure": false,
          "enable_script_exec": false
        },
        "mechanism": "HARD"
      },
      "power_state": "OFF",
      "machine_type": "PC",
      "vga_console_enabled": true,
      "disk_list": [
        {
          "storage_config": {
            "storage_container_reference": {
              "kind": "storage_container",
              "uuid": "a8176b46-51c4-4db1-b3e3-d42f97d8dbbb",
              "name": "default-container-999"
            }
          },
          "device_properties": {
            "disk_address": {
              "device_index": 0,
              "adapter_type": "SCSI"
            },
            "device_type": "DISK"
          },
          "uuid": "659aa2fa-9c88-4780-be50-c87b34dd3686",
          "disk_size_bytes": 1073741824,
          "disk_size_mib": 1024
        }
      ]
    },
    "name": "test\u00f6"
  },
  "api_version": "3.1",
  "metadata": {
    "last_update_time": "2021-12-27T14:04:49Z",
    "kind": "vm",
    "uuid": "2befeb0f-d301-439e-8482-477da9e8b265",
    "project_reference": {
      "kind": "project",
      "name": "default",
      "uuid": "a925cde2-0676-4436-b6d6-61618308a0f0"
    },
    "creation_time": "2021-12-27T14:04:49Z",
    "spec_version": 2,
    "categories_mapping": {},
    "entity_version": "2",
    "owner_reference": {
      "kind": "user",
      "name": "admin",
      "uuid": "00000000-0000-0000-0000-000000000000"
    },
    "categories": {}
  }
}'

 


Thanks! I'll give this a try ASAP (I’m neck deep in another API now).


Hi Jose.

It seems I confused the before and after results in one of my posts above. The API does return the definition correctly escaped with Unicode, and it’s the PSH conversion from and to JSON that adds the “garbage character”, not the other way around.

In other words, the problem isn’t how I send the update to the API, it’s that PSH (5.1, at least) doesn’t know what to do with Unicode escapes when converting to JSON.
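A workaround sometimes suggested for staying on 5.1 is to hand the cmdlet a UTF-8 byte array so the literal ö cannot be re-encoded with the default codepage. A hedged, untested sketch ($uri and $body as in the earlier snippets):

```powershell
# Encode the JSON string to UTF-8 bytes ourselves before sending
$bytes = [System.Text.Encoding]::UTF8.GetBytes($body)
Invoke-RestMethod -Uri $uri -Method Put -Body $bytes `
    -ContentType 'application/json; charset=utf-8'
```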

I’ll do some more testing and then update this thread with some sort of answer.

A big thanks for all the help! :)


After a lot of attempts at workarounds and outright hacks I decided that the only sustainable way to solve this issue is to use a newer version of PowerShell (in our case version 7.2 since that's what’s available now).

From version 6.2 of PowerShell, the ConvertTo-Json cmdlet accepts the -EscapeHandling parameter, which, when set to EscapeNonAscii, escapes non-ASCII characters back to their \uXXXX Unicode sequences.
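For example, in PowerShell 7.x:

```powershell
# Default behaviour keeps the literal character...
'Miljö' | ConvertTo-Json                                    # "Miljö"
# ...while EscapeNonAscii restores the \uXXXX escape the API returned
'Miljö' | ConvertTo-Json -EscapeHandling EscapeNonAscii     # "Milj\u00f6"
```

With the body escaped down to pure ASCII, the transport encoding no longer matters.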

Not the solution I was hoping for, as I now have to get Calm to use PSH 7, but that’s another topic. :)