PR tedyu: Don't try to create VolumeSpec immediately after underlying PVC is being deleted
Result: FAILURE
Tests: 1 failed / 2610 succeeded
Started: 2020-01-14 17:25
Elapsed: 25m54s
Revision: 30a96d8cf6aec7a2ca117d8037bb0e65aaec5750
Refs: 86670

Test Failures


k8s.io/kubernetes/test/integration/client TestDynamicClient 7.09s

go test -v k8s.io/kubernetes/test/integration/client -run TestDynamicClient$
=== RUN   TestDynamicClient
I0114 17:41:51.221839  106273 controller.go:123] Shutting down OpenAPI controller
I0114 17:41:51.221859  106273 nonstructuralschema_controller.go:197] Shutting down NonStructuralSchemaConditionController
I0114 17:41:51.221875  106273 establishing_controller.go:85] Shutting down EstablishingController
I0114 17:41:51.221889  106273 apiapproval_controller.go:196] Shutting down KubernetesAPIApprovalPolicyConformantConditionController
I0114 17:41:51.221906  106273 customresource_discovery_controller.go:220] Shutting down DiscoveryController
I0114 17:41:51.221919  106273 naming_controller.go:300] Shutting down NamingConditionController
I0114 17:41:51.221933  106273 crdregistration_controller.go:142] Shutting down crd-autoregister controller
I0114 17:41:51.221954  106273 apiservice_controller.go:106] Shutting down APIServiceRegistrationController
I0114 17:41:51.221967  106273 available_controller.go:398] Shutting down AvailableConditionController
I0114 17:41:51.221989  106273 cluster_authentication_trust_controller.go:463] Shutting down cluster_authentication_trust_controller controller
I0114 17:41:51.222004  106273 crd_finalizer.go:276] Shutting down CRDFinalizer
I0114 17:41:51.222021  106273 autoregister_controller.go:164] Shutting down autoregister controller
I0114 17:41:51.222163  106273 dynamic_cafile_content.go:181] Shutting down request-header::/tmp/kubernetes-kube-apiserver723971254/proxy-ca.crt
I0114 17:41:51.222183  106273 dynamic_cafile_content.go:181] Shutting down client-ca-bundle::/tmp/kubernetes-kube-apiserver723971254/client-ca.crt
I0114 17:41:51.222367  106273 controller.go:87] Shutting down OpenAPI AggregationController
I0114 17:41:51.222387  106273 tlsconfig.go:256] Shutting down DynamicServingCertificateController
I0114 17:41:51.222477  106273 secure_serving.go:222] Stopped listening on 127.0.0.1:44607
I0114 17:41:51.222496  106273 dynamic_serving_content.go:144] Shutting down serving-cert::/tmp/kubernetes-kube-apiserver723971254/apiserver.crt::/tmp/kubernetes-kube-apiserver723971254/apiserver.key
I0114 17:41:51.222524  106273 dynamic_cafile_content.go:181] Shutting down client-ca-bundle::/tmp/kubernetes-kube-apiserver723971254/client-ca.crt
I0114 17:41:52.060830  106273 serving.go:307] Generated self-signed cert (/tmp/kubernetes-kube-apiserver226458781/apiserver.crt, /tmp/kubernetes-kube-apiserver226458781/apiserver.key)
I0114 17:41:52.060848  106273 server.go:596] external host was not specified, using 127.0.0.1
W0114 17:41:52.060859  106273 authentication.go:439] AnonymousAuth is not allowed with the AlwaysAllow authorizer. Resetting AnonymousAuth to false. You should use a different authorizer
W0114 17:41:52.728381  106273 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0114 17:41:52.728419  106273 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0114 17:41:52.728434  106273 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0114 17:41:52.728624  106273 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0114 17:41:52.729445  106273 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0114 17:41:52.729481  106273 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0114 17:41:52.729511  106273 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0114 17:41:52.729538  106273 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0114 17:41:52.729749  106273 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0114 17:41:52.729905  106273 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0114 17:41:52.729949  106273 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0114 17:41:52.730021  106273 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
I0114 17:41:52.730047  106273 plugins.go:158] Loaded 9 mutating admission controller(s) successfully in the following order: NamespaceLifecycle,LimitRanger,TaintNodesByCondition,Priority,DefaultTolerationSeconds,DefaultStorageClass,StorageObjectInUseProtection,MutatingAdmissionWebhook,RuntimeClass.
I0114 17:41:52.730056  106273 plugins.go:161] Loaded 6 validating admission controller(s) successfully in the following order: LimitRanger,Priority,PersistentVolumeClaimResize,ValidatingAdmissionWebhook,RuntimeClass,ResourceQuota.
I0114 17:41:52.731655  106273 plugins.go:158] Loaded 9 mutating admission controller(s) successfully in the following order: NamespaceLifecycle,LimitRanger,TaintNodesByCondition,Priority,DefaultTolerationSeconds,DefaultStorageClass,StorageObjectInUseProtection,MutatingAdmissionWebhook,RuntimeClass.
I0114 17:41:52.731681  106273 plugins.go:161] Loaded 6 validating admission controller(s) successfully in the following order: LimitRanger,Priority,PersistentVolumeClaimResize,ValidatingAdmissionWebhook,RuntimeClass,ResourceQuota.
I0114 17:41:52.733046  106273 client.go:361] parsed scheme: "endpoint"
I0114 17:41:52.733086  106273 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 17:41:52.734229  106273 client.go:361] parsed scheme: "endpoint"
I0114 17:41:52.734287  106273 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
W0114 17:41:52.767476  106273 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
I0114 17:41:52.768651  106273 master.go:264] Using reconciler: lease
I0114 17:41:52.768959  106273 client.go:361] parsed scheme: "endpoint"
I0114 17:41:52.769002  106273 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 17:41:52.772841  106273 client.go:361] parsed scheme: "endpoint"
I0114 17:41:52.772933  106273 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 17:41:52.774526  106273 client.go:361] parsed scheme: "endpoint"
I0114 17:41:52.774601  106273 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 17:41:52.775682  106273 client.go:361] parsed scheme: "endpoint"
I0114 17:41:52.775721  106273 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 17:41:52.777450  106273 client.go:361] parsed scheme: "endpoint"
I0114 17:41:52.777491  106273 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 17:41:52.778834  106273 client.go:361] parsed scheme: "endpoint"
I0114 17:41:52.778874  106273 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 17:41:52.780007  106273 client.go:361] parsed scheme: "endpoint"
I0114 17:41:52.780044  106273 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 17:41:52.783177  106273 client.go:361] parsed scheme: "endpoint"
I0114 17:41:52.783225  106273 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 17:41:52.785441  106273 client.go:361] parsed scheme: "endpoint"
I0114 17:41:52.785584  106273 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 17:41:52.786855  106273 client.go:361] parsed scheme: "endpoint"
I0114 17:41:52.786894  106273 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 17:41:52.788003  106273 client.go:361] parsed scheme: "endpoint"
I0114 17:41:52.788054  106273 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 17:41:52.788894  106273 client.go:361] parsed scheme: "endpoint"
I0114 17:41:52.788923  106273 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 17:41:52.790645  106273 client.go:361] parsed scheme: "endpoint"
I0114 17:41:52.790680  106273 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 17:41:52.791839  106273 client.go:361] parsed scheme: "endpoint"
I0114 17:41:52.791875  106273 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 17:41:52.796917  106273 client.go:361] parsed scheme: "endpoint"
I0114 17:41:52.796960  106273 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 17:41:52.798075  106273 client.go:361] parsed scheme: "endpoint"
I0114 17:41:52.798111  106273 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 17:41:52.799243  106273 client.go:361] parsed scheme: "endpoint"
I0114 17:41:52.799359  106273 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 17:41:52.800670  106273 client.go:361] parsed scheme: "endpoint"
I0114 17:41:52.800706  106273 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 17:41:52.801414  106273 rest.go:113] the default service ipfamily for this cluster is: IPv4
I0114 17:41:52.966262  106273 client.go:361] parsed scheme: "endpoint"
I0114 17:41:52.966316  106273 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 17:41:52.967698  106273 client.go:361] parsed scheme: "endpoint"
I0114 17:41:52.967744  106273 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 17:41:52.969186  106273 client.go:361] parsed scheme: "endpoint"
I0114 17:41:52.969218  106273 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 17:41:52.973139  106273 client.go:361] parsed scheme: "endpoint"
I0114 17:41:52.973183  106273 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 17:41:52.975942  106273 client.go:361] parsed scheme: "endpoint"
I0114 17:41:52.975982  106273 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 17:41:52.978723  106273 client.go:361] parsed scheme: "endpoint"
I0114 17:41:52.978763  106273 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 17:41:52.979794  106273 client.go:361] parsed scheme: "endpoint"
I0114 17:41:52.979831  106273 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 17:41:52.980860  106273 client.go:361] parsed scheme: "endpoint"
I0114 17:41:52.980893  106273 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 17:41:52.982545  106273 client.go:361] parsed scheme: "endpoint"
I0114 17:41:52.982627  106273 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 17:41:52.983802  106273 client.go:361] parsed scheme: "endpoint"
I0114 17:41:52.983837  106273 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 17:41:52.984908  106273 client.go:361] parsed scheme: "endpoint"
I0114 17:41:52.984937  106273 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 17:41:52.986412  106273 client.go:361] parsed scheme: "endpoint"
I0114 17:41:52.986447  106273 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 17:41:52.987487  106273 client.go:361] parsed scheme: "endpoint"
I0114 17:41:52.987519  106273 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 17:41:52.988743  106273 client.go:361] parsed scheme: "endpoint"
I0114 17:41:52.988769  106273 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 17:41:52.991158  106273 client.go:361] parsed scheme: "endpoint"
I0114 17:41:52.991181  106273 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 17:41:52.992718  106273 client.go:361] parsed scheme: "endpoint"
I0114 17:41:52.992753  106273 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 17:41:52.993820  106273 client.go:361] parsed scheme: "endpoint"
I0114 17:41:52.993852  106273 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 17:41:52.996145  106273 client.go:361] parsed scheme: "endpoint"
I0114 17:41:52.996187  106273 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 17:41:52.997983  106273 client.go:361] parsed scheme: "endpoint"
I0114 17:41:52.998016  106273 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 17:41:53.000032  106273 client.go:361] parsed scheme: "endpoint"
I0114 17:41:53.000208  106273 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 17:41:53.001859  106273 client.go:361] parsed scheme: "endpoint"
I0114 17:41:53.001898  106273 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 17:41:53.003051  106273 client.go:361] parsed scheme: "endpoint"
I0114 17:41:53.003086  106273 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 17:41:53.005576  106273 client.go:361] parsed scheme: "endpoint"
I0114 17:41:53.005615  106273 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 17:41:53.006895  106273 client.go:361] parsed scheme: "endpoint"
I0114 17:41:53.006954  106273 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 17:41:53.008866  106273 client.go:361] parsed scheme: "endpoint"
I0114 17:41:53.008901  106273 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 17:41:53.010088  106273 client.go:361] parsed scheme: "endpoint"
I0114 17:41:53.010231  106273 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 17:41:53.011307  106273 client.go:361] parsed scheme: "endpoint"
I0114 17:41:53.011346  106273 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 17:41:53.012536  106273 client.go:361] parsed scheme: "endpoint"
I0114 17:41:53.012577  106273 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 17:41:53.013398  106273 client.go:361] parsed scheme: "endpoint"
I0114 17:41:53.013427  106273 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 17:41:53.015201  106273 client.go:361] parsed scheme: "endpoint"
I0114 17:41:53.015233  106273 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 17:41:53.018600  106273 client.go:361] parsed scheme: "endpoint"
I0114 17:41:53.018922  106273 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 17:41:53.041705  106273 client.go:361] parsed scheme: "endpoint"
I0114 17:41:53.041756  106273 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 17:41:53.049561  106273 client.go:361] parsed scheme: "endpoint"
I0114 17:41:53.049612  106273 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 17:41:53.053128  106273 client.go:361] parsed scheme: "endpoint"
I0114 17:41:53.053169  106273 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 17:41:53.062273  106273 client.go:361] parsed scheme: "endpoint"
I0114 17:41:53.062319  106273 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 17:41:53.071774  106273 client.go:361] parsed scheme: "endpoint"
I0114 17:41:53.071818  106273 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 17:41:53.078273  106273 client.go:361] parsed scheme: "endpoint"
I0114 17:41:53.078335  106273 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 17:41:53.080396  106273 client.go:361] parsed scheme: "endpoint"
I0114 17:41:53.080434  106273 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 17:41:53.082019  106273 client.go:361] parsed scheme: "endpoint"
I0114 17:41:53.082053  106273 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 17:41:53.085576  106273 client.go:361] parsed scheme: "endpoint"
I0114 17:41:53.085617  106273 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 17:41:53.087371  106273 client.go:361] parsed scheme: "endpoint"
I0114 17:41:53.087427  106273 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 17:41:53.089139  106273 client.go:361] parsed scheme: "endpoint"
I0114 17:41:53.089170  106273 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 17:41:53.093734  106273 client.go:361] parsed scheme: "endpoint"
I0114 17:41:53.093782  106273 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 17:41:53.099083  106273 client.go:361] parsed scheme: "endpoint"
I0114 17:41:53.099126  106273 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 17:41:53.119960  106273 client.go:361] parsed scheme: "endpoint"
I0114 17:41:53.120012  106273 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 17:41:53.121342  106273 client.go:361] parsed scheme: "endpoint"
I0114 17:41:53.121383  106273 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 17:41:53.122473  106273 client.go:361] parsed scheme: "endpoint"
I0114 17:41:53.122514  106273 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 17:41:53.123895  106273 client.go:361] parsed scheme: "endpoint"
I0114 17:41:53.123929  106273 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 17:41:53.125430  106273 client.go:361] parsed scheme: "endpoint"
I0114 17:41:53.125472  106273 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 17:41:53.126906  106273 client.go:361] parsed scheme: "endpoint"
I0114 17:41:53.126946  106273 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 17:41:53.128281  106273 client.go:361] parsed scheme: "endpoint"
I0114 17:41:53.128317  106273 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 17:41:53.131968  106273 client.go:361] parsed scheme: "endpoint"
I0114 17:41:53.132011  106273 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 17:41:53.135626  106273 client.go:361] parsed scheme: "endpoint"
I0114 17:41:53.135743  106273 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 17:41:53.138377  106273 client.go:361] parsed scheme: "endpoint"
I0114 17:41:53.138420  106273 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
W0114 17:41:53.369640  106273 genericapiserver.go:404] Skipping API discovery.k8s.io/v1alpha1 because it has no resources.
W0114 17:41:53.547664  106273 genericapiserver.go:404] Skipping API apps/v1beta2 because it has no resources.
W0114 17:41:53.547706  106273 genericapiserver.go:404] Skipping API apps/v1beta1 because it has no resources.
I0114 17:41:53.570433  106273 plugins.go:158] Loaded 9 mutating admission controller(s) successfully in the following order: NamespaceLifecycle,LimitRanger,TaintNodesByCondition,Priority,DefaultTolerationSeconds,DefaultStorageClass,StorageObjectInUseProtection,MutatingAdmissionWebhook,RuntimeClass.
I0114 17:41:53.570542  106273 plugins.go:161] Loaded 6 validating admission controller(s) successfully in the following order: LimitRanger,Priority,PersistentVolumeClaimResize,ValidatingAdmissionWebhook,RuntimeClass,ResourceQuota.
W0114 17:41:53.572583  106273 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
I0114 17:41:53.572858  106273 client.go:361] parsed scheme: "endpoint"
I0114 17:41:53.572981  106273 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 17:41:53.574537  106273 client.go:361] parsed scheme: "endpoint"
I0114 17:41:53.574571  106273 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
W0114 17:41:53.578399  106273 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
I0114 17:41:53.579082  106273 aggregator.go:182] Skipping APIService creation for flowcontrol.apiserver.k8s.io/v1alpha1
I0114 17:41:53.728155  106273 client.go:361] parsed scheme: "endpoint"
I0114 17:41:53.728253  106273 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
E0114 17:41:53.772959  106273 controller.go:183] an error on the server ("") has prevented the request from succeeding (get endpoints kubernetes)
W0114 17:41:55.223092  106273 reflector.go:340] k8s.io/client-go/informers/factory.go:135: watch of *v1beta1.RuntimeClass ended with: very short watch: k8s.io/client-go/informers/factory.go:135: Unexpected watch close - watch lasted less than a second and no items received
W0114 17:41:55.223152  106273 reflector.go:340] k8s.io/client-go/informers/factory.go:135: watch of *v1.Namespace ended with: very short watch: k8s.io/client-go/informers/factory.go:135: Unexpected watch close - watch lasted less than a second and no items received
W0114 17:41:55.223153  106273 reflector.go:340] k8s.io/kubernetes/pkg/master/controller/clusterauthenticationtrust/cluster_authentication_trust_controller.go:444: watch of *v1.ConfigMap ended with: very short watch: k8s.io/kubernetes/pkg/master/controller/clusterauthenticationtrust/cluster_authentication_trust_controller.go:444: Unexpected watch close - watch lasted less than a second and no items received
W0114 17:41:55.223092  106273 reflector.go:340] k8s.io/client-go/informers/factory.go:135: watch of *v1.LimitRange ended with: very short watch: k8s.io/client-go/informers/factory.go:135: Unexpected watch close - watch lasted less than a second and no items received
I0114 17:41:57.125592  106273 dynamic_cafile_content.go:166] Starting request-header::/tmp/kubernetes-kube-apiserver226458781/proxy-ca.crt
I0114 17:41:57.125678  106273 dynamic_cafile_content.go:166] Starting client-ca-bundle::/tmp/kubernetes-kube-apiserver226458781/client-ca.crt
I0114 17:41:57.125874  106273 dynamic_serving_content.go:129] Starting serving-cert::/tmp/kubernetes-kube-apiserver226458781/apiserver.crt::/tmp/kubernetes-kube-apiserver226458781/apiserver.key
I0114 17:41:57.126620  106273 secure_serving.go:178] Serving securely on 127.0.0.1:33437
I0114 17:41:57.126691  106273 controller.go:81] Starting OpenAPI AggregationController
I0114 17:41:57.126809  106273 autoregister_controller.go:140] Starting autoregister controller
I0114 17:41:57.126827  106273 cache.go:32] Waiting for caches to sync for autoregister controller
I0114 17:41:57.126991  106273 crd_finalizer.go:264] Starting CRDFinalizer
I0114 17:41:57.127064  106273 controller.go:86] Starting OpenAPI controller
I0114 17:41:57.127085  106273 customresource_discovery_controller.go:209] Starting DiscoveryController
I0114 17:41:57.127145  106273 naming_controller.go:289] Starting NamingConditionController
I0114 17:41:57.127173  106273 establishing_controller.go:74] Starting EstablishingController
I0114 17:41:57.127189  106273 nonstructuralschema_controller.go:185] Starting NonStructuralSchemaConditionController
I0114 17:41:57.127205  106273 apiapproval_controller.go:184] Starting KubernetesAPIApprovalPolicyConformantConditionController
I0114 17:41:57.127372  106273 tlsconfig.go:241] Starting DynamicServingCertificateController
I0114 17:41:57.127384  106273 crdregistration_controller.go:111] Starting crd-autoregister controller
I0114 17:41:57.127396  106273 shared_informer.go:206] Waiting for caches to sync for crd-autoregister
W0114 17:41:57.129774  106273 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
I0114 17:41:57.129942  106273 cluster_authentication_trust_controller.go:440] Starting cluster_authentication_trust_controller controller
I0114 17:41:57.129951  106273 shared_informer.go:206] Waiting for caches to sync for cluster_authentication_trust_controller
I0114 17:41:57.129979  106273 available_controller.go:386] Starting AvailableConditionController
I0114 17:41:57.129986  106273 cache.go:32] Waiting for caches to sync for AvailableConditionController controller
I0114 17:41:57.130258  106273 apiservice_controller.go:94] Starting APIServiceRegistrationController
I0114 17:41:57.130292  106273 cache.go:32] Waiting for caches to sync for APIServiceRegistrationController controller
I0114 17:41:57.131109  106273 dynamic_cafile_content.go:166] Starting client-ca-bundle::/tmp/kubernetes-kube-apiserver226458781/client-ca.crt
I0114 17:41:57.131160  106273 dynamic_cafile_content.go:166] Starting request-header::/tmp/kubernetes-kube-apiserver226458781/proxy-ca.crt
E0114 17:41:57.148577  106273 controller.go:151] Unable to remove old endpoints from kubernetes service: StorageError: key not found, Code: 1, Key: /8200b750-a4ab-4fcb-96cd-771eedb4e2ec/registry/masterleases/127.0.0.1, ResourceVersion: 0, AdditionalErrorMsg: 
I0114 17:41:57.227022  106273 cache.go:39] Caches are synced for autoregister controller
I0114 17:41:57.230916  106273 shared_informer.go:213] Caches are synced for crd-autoregister 
I0114 17:41:57.231390  106273 cache.go:39] Caches are synced for AvailableConditionController controller
I0114 17:41:57.231479  106273 cache.go:39] Caches are synced for APIServiceRegistrationController controller
I0114 17:41:57.231603  106273 shared_informer.go:213] Caches are synced for cluster_authentication_trust_controller 
E0114 17:41:57.279355  106273 structuredmerge.go:102] [SHOULD NOT HAPPEN] failed to create typed new object: .spec.rules: element 0: associative list without keys has an element that's a map type
E0114 17:41:57.296392  106273 structuredmerge.go:102] [SHOULD NOT HAPPEN] failed to create typed new object: .spec.rules: element 0: associative list without keys has an element that's a map type
E0114 17:41:57.307442  106273 structuredmerge.go:102] [SHOULD NOT HAPPEN] failed to create typed new object: .spec.rules: element 0: associative list without keys has an element that's a map type
I0114 17:41:58.125697  106273 controller.go:107] OpenAPI AggregationController: Processing item 
I0114 17:41:58.125751  106273 controller.go:130] OpenAPI AggregationController: action for item : Nothing (removed from the queue).
I0114 17:41:58.125812  106273 controller.go:130] OpenAPI AggregationController: action for item k8s_internal_local_delegation_chain_0000000000: Nothing (removed from the queue).
I0114 17:41:58.134570  106273 storage_scheduling.go:133] created PriorityClass system-node-critical with value 2000001000
I0114 17:41:58.142694  106273 storage_scheduling.go:133] created PriorityClass system-cluster-critical with value 2000000000
I0114 17:41:58.142743  106273 storage_scheduling.go:142] all system priority classes are created successfully or already exist.
W0114 17:41:58.216910  106273 lease.go:224] Resetting endpoints for master service "kubernetes" to [127.0.0.1]
E0114 17:41:58.219548  106273 controller.go:222] unable to sync kubernetes service: Endpoints "kubernetes" is invalid: subsets[0].addresses[0].ip: Invalid value: "127.0.0.1": may not be in the loopback range (127.0.0.0/8)
W0114 17:41:58.301742  106273 cacher.go:162] Terminating all watchers from cacher *apiextensions.CustomResourceDefinition
W0114 17:41:58.302112  106273 cacher.go:162] Terminating all watchers from cacher *core.LimitRange
W0114 17:41:58.302369  106273 cacher.go:162] Terminating all watchers from cacher *core.ResourceQuota
W0114 17:41:58.302596  106273 cacher.go:162] Terminating all watchers from cacher *core.Secret
W0114 17:41:58.303014  106273 cacher.go:162] Terminating all watchers from cacher *core.ConfigMap
W0114 17:41:58.303319  106273 cacher.go:162] Terminating all watchers from cacher *core.Namespace
W0114 17:41:58.303389  106273 cacher.go:162] Terminating all watchers from cacher *core.Endpoints
W0114 17:41:58.303650  106273 cacher.go:162] Terminating all watchers from cacher *core.Pod
W0114 17:41:58.303747  106273 cacher.go:162] Terminating all watchers from cacher *core.ServiceAccount
W0114 17:41:58.303982  106273 cacher.go:162] Terminating all watchers from cacher *core.Service
W0114 17:41:58.305797  106273 cacher.go:162] Terminating all watchers from cacher *node.RuntimeClass
W0114 17:41:58.307425  106273 cacher.go:162] Terminating all watchers from cacher *scheduling.PriorityClass
W0114 17:41:58.308001  106273 cacher.go:162] Terminating all watchers from cacher *storage.StorageClass
W0114 17:41:58.309438  106273 cacher.go:162] Terminating all watchers from cacher *admissionregistration.ValidatingWebhookConfiguration
W0114 17:41:58.310428  106273 cacher.go:162] Terminating all watchers from cacher *admissionregistration.MutatingWebhookConfiguration
W0114 17:41:58.311420  106273 cacher.go:162] Terminating all watchers from cacher *apiregistration.APIService
--- FAIL: TestDynamicClient (7.09s)
    testserver.go:181: runtime-config=map[api/all:true]
    testserver.go:182: Starting kube-apiserver on port 33437...
    testserver.go:198: Waiting for /healthz to be ok...
    dynamic_client_test.go:88: unexpected pod in list. wanted &v1.Pod{TypeMeta:v1.TypeMeta{Kind:"", APIVersion:""}, ObjectMeta:v1.ObjectMeta{Name:"testhsfjw", GenerateName:"test", Namespace:"default", SelfLink:"/api/v1/namespaces/default/pods/testhsfjw", UID:"41f4187a-17ef-439d-a338-5d98abc78237", ResourceVersion:"8258", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63714620518, loc:(*time.Location)(0x753ec80)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string(nil), Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry{v1.ManagedFieldsEntry{Manager:"client.test", Operation:"Update", APIVersion:"v1", Time:(*v1.Time)(0xc0642fd7c0), FieldsType:"FieldsV1", FieldsV1:(*v1.FieldsV1)(0xc0642fd7e0)}}}, Spec:v1.PodSpec{Volumes:[]v1.Volume(nil), InitContainers:[]v1.Container(nil), Containers:[]v1.Container{v1.Container{Name:"test", Image:"test-image", Command:[]string(nil), Args:[]string(nil), WorkingDir:"", Ports:[]v1.ContainerPort(nil), EnvFrom:[]v1.EnvFromSource(nil), Env:[]v1.EnvVar(nil), Resources:v1.ResourceRequirements{Limits:v1.ResourceList(nil), Requests:v1.ResourceList(nil)}, VolumeMounts:[]v1.VolumeMount(nil), VolumeDevices:[]v1.VolumeDevice(nil), LivenessProbe:(*v1.Probe)(nil), ReadinessProbe:(*v1.Probe)(nil), StartupProbe:(*v1.Probe)(nil), Lifecycle:(*v1.Lifecycle)(nil), TerminationMessagePath:"/dev/termination-log", TerminationMessagePolicy:"File", ImagePullPolicy:"Always", SecurityContext:(*v1.SecurityContext)(nil), Stdin:false, StdinOnce:false, TTY:false}}, EphemeralContainers:[]v1.EphemeralContainer(nil), RestartPolicy:"Always", TerminationGracePeriodSeconds:(*int64)(0xc063db5218), ActiveDeadlineSeconds:(*int64)(nil), DNSPolicy:"ClusterFirst", NodeSelector:map[string]string(nil), ServiceAccountName:"", DeprecatedServiceAccount:"", AutomountServiceAccountToken:(*bool)(nil), 
NodeName:"", HostNetwork:false, HostPID:false, HostIPC:false, ShareProcessNamespace:(*bool)(nil), SecurityContext:(*v1.PodSecurityContext)(0xc0642f8e40), ImagePullSecrets:[]v1.LocalObjectReference(nil), Hostname:"", Subdomain:"", Affinity:(*v1.Affinity)(nil), SchedulerName:"default-scheduler", Tolerations:[]v1.Toleration{v1.Toleration{Key:"node.kubernetes.io/not-ready", Operator:"Exists", Value:"", Effect:"NoExecute", TolerationSeconds:(*int64)(0xc063db5240)}, v1.Toleration{Key:"node.kubernetes.io/unreachable", Operator:"Exists", Value:"", Effect:"NoExecute", TolerationSeconds:(*int64)(0xc063db5260)}}, HostAliases:[]v1.HostAlias(nil), PriorityClassName:"", Priority:(*int32)(0xc063db5268), DNSConfig:(*v1.PodDNSConfig)(nil), ReadinessGates:[]v1.PodReadinessGate(nil), RuntimeClassName:(*string)(nil), EnableServiceLinks:(*bool)(0xc063db526c), PreemptionPolicy:(*v1.PreemptionPolicy)(nil), Overhead:v1.ResourceList(nil), TopologySpreadConstraints:[]v1.TopologySpreadConstraint(nil)}, Status:v1.PodStatus{Phase:"Pending", Conditions:[]v1.PodCondition(nil), Message:"", Reason:"", NominatedNodeName:"", HostIP:"", PodIP:"", PodIPs:[]v1.PodIP(nil), StartTime:(*v1.Time)(nil), InitContainerStatuses:[]v1.ContainerStatus(nil), ContainerStatuses:[]v1.ContainerStatus(nil), QOSClass:"BestEffort", EphemeralContainerStatuses:[]v1.ContainerStatus(nil)}}, got &v1.Pod{TypeMeta:v1.TypeMeta{Kind:"", APIVersion:""}, ObjectMeta:v1.ObjectMeta{Name:"testhsfjw", GenerateName:"test", Namespace:"default", SelfLink:"/api/v1/namespaces/default/pods/testhsfjw", UID:"41f4187a-17ef-439d-a338-5d98abc78237", ResourceVersion:"8258", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63714620518, loc:(*time.Location)(0x753ec80)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string(nil), Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", 
ManagedFields:[]v1.ManagedFieldsEntry{v1.ManagedFieldsEntry{Manager:"client.test", Operation:"Update", APIVersion:"v1", Time:(*v1.Time)(0xc064875300), FieldsType:"FieldsV1", FieldsV1:(*v1.FieldsV1)(0xc0648752e0)}}}, Spec:v1.PodSpec{Volumes:[]v1.Volume(nil), InitContainers:[]v1.Container(nil), Containers:[]v1.Container{v1.Container{Name:"test", Image:"test-image", Command:[]string(nil), Args:[]string(nil), WorkingDir:"", Ports:[]v1.ContainerPort(nil), EnvFrom:[]v1.EnvFromSource(nil), Env:[]v1.EnvVar(nil), Resources:v1.ResourceRequirements{Limits:v1.ResourceList(nil), Requests:v1.ResourceList(nil)}, VolumeMounts:[]v1.VolumeMount(nil), VolumeDevices:[]v1.VolumeDevice(nil), LivenessProbe:(*v1.Probe)(nil), ReadinessProbe:(*v1.Probe)(nil), StartupProbe:(*v1.Probe)(nil), Lifecycle:(*v1.Lifecycle)(nil), TerminationMessagePath:"/dev/termination-log", TerminationMessagePolicy:"File", ImagePullPolicy:"Always", SecurityContext:(*v1.SecurityContext)(nil), Stdin:false, StdinOnce:false, TTY:false}}, EphemeralContainers:[]v1.EphemeralContainer(nil), RestartPolicy:"Always", TerminationGracePeriodSeconds:(*int64)(0xc0649465e8), ActiveDeadlineSeconds:(*int64)(nil), DNSPolicy:"ClusterFirst", NodeSelector:map[string]string(nil), ServiceAccountName:"", DeprecatedServiceAccount:"", AutomountServiceAccountToken:(*bool)(nil), NodeName:"", HostNetwork:false, HostPID:false, HostIPC:false, ShareProcessNamespace:(*bool)(nil), SecurityContext:(*v1.PodSecurityContext)(0xc064859e00), ImagePullSecrets:[]v1.LocalObjectReference(nil), Hostname:"", Subdomain:"", Affinity:(*v1.Affinity)(nil), SchedulerName:"default-scheduler", Tolerations:[]v1.Toleration{v1.Toleration{Key:"node.kubernetes.io/not-ready", Operator:"Exists", Value:"", Effect:"NoExecute", TolerationSeconds:(*int64)(0xc064946630)}, v1.Toleration{Key:"node.kubernetes.io/unreachable", Operator:"Exists", Value:"", Effect:"NoExecute", TolerationSeconds:(*int64)(0xc064946650)}}, HostAliases:[]v1.HostAlias(nil), PriorityClassName:"", 
Priority:(*int32)(0xc0649465c8), DNSConfig:(*v1.PodDNSConfig)(nil), ReadinessGates:[]v1.PodReadinessGate(nil), RuntimeClassName:(*string)(nil), EnableServiceLinks:(*bool)(0xc0649465a9), PreemptionPolicy:(*v1.PreemptionPolicy)(nil), Overhead:v1.ResourceList(nil), TopologySpreadConstraints:[]v1.TopologySpreadConstraint(nil)}, Status:v1.PodStatus{Phase:"Pending", Conditions:[]v1.PodCondition(nil), Message:"", Reason:"", NominatedNodeName:"", HostIP:"", PodIP:"", PodIPs:[]v1.PodIP(nil), StartTime:(*v1.Time)(nil), InitContainerStatuses:[]v1.ContainerStatus(nil), ContainerStatuses:[]v1.ContainerStatus(nil), QOSClass:"BestEffort", EphemeralContainerStatuses:[]v1.ContainerStatus(nil)}}

				from junit_20200114-173933.xml



Show 2610 Passed Tests

Show 4 Skipped Tests

Error lines from build-log.txt

... skipping 56 lines ...
Recording: record_command_canary
Running command: record_command_canary

+++ Running case: test-cmd.record_command_canary 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: record_command_canary
/home/prow/go/src/k8s.io/kubernetes/test/cmd/legacy-script.sh: line 155: bogus-expected-to-fail: command not found
!!! [0114 17:29:33] Call tree:
!!! [0114 17:29:33]  1: /home/prow/go/src/k8s.io/kubernetes/test/cmd/../../third_party/forked/shell2junit/sh2ju.sh:47 record_command_canary(...)
!!! [0114 17:29:33]  2: /home/prow/go/src/k8s.io/kubernetes/test/cmd/../../third_party/forked/shell2junit/sh2ju.sh:112 eVal(...)
!!! [0114 17:29:33]  3: /home/prow/go/src/k8s.io/kubernetes/test/cmd/legacy-script.sh:131 juLog(...)
!!! [0114 17:29:33]  4: /home/prow/go/src/k8s.io/kubernetes/test/cmd/legacy-script.sh:159 record_command(...)
!!! [0114 17:29:33]  5: hack/make-rules/test-cmd.sh:35 source(...)
+++ exit code: 1
+++ error: 1
+++ [0114 17:29:33] Running kubeadm tests
+++ [0114 17:29:38] Building go targets for linux/amd64:
    cmd/kubeadm
+++ [0114 17:30:23] Running tests without code coverage
{"Time":"2020-01-14T17:31:44.650128888Z","Action":"output","Package":"k8s.io/kubernetes/cmd/kubeadm/test/cmd","Output":"ok  \tk8s.io/kubernetes/cmd/kubeadm/test/cmd\t43.319s\n"}
✓  cmd/kubeadm/test/cmd (43.319s)
... skipping 302 lines ...
+++ [0114 17:33:33] Building kube-controller-manager
+++ [0114 17:33:38] Building go targets for linux/amd64:
    cmd/kube-controller-manager
+++ [0114 17:34:08] Starting controller-manager
Flag --port has been deprecated, see --secure-port instead.
I0114 17:34:09.509670   54630 serving.go:313] Generated self-signed cert in-memory
W0114 17:34:10.485420   54630 authentication.go:409] failed to read in-cluster kubeconfig for delegated authentication: open /var/run/secrets/kubernetes.io/serviceaccount/token: no such file or directory
W0114 17:34:10.485468   54630 authentication.go:267] No authentication-kubeconfig provided in order to lookup client-ca-file in configmap/extension-apiserver-authentication in kube-system, so client certificate authentication won't work.
W0114 17:34:10.485479   54630 authentication.go:291] No authentication-kubeconfig provided in order to lookup requestheader-client-ca-file in configmap/extension-apiserver-authentication in kube-system, so request-header client certificate authentication won't work.
W0114 17:34:10.485497   54630 authorization.go:177] failed to read in-cluster kubeconfig for delegated authorization: open /var/run/secrets/kubernetes.io/serviceaccount/token: no such file or directory
W0114 17:34:10.485512   54630 authorization.go:146] No authorization-kubeconfig provided, so SubjectAccessReview of authorization tokens won't work.
I0114 17:34:10.485540   54630 controllermanager.go:161] Version: v1.18.0-alpha.1.679+13ea65e4dae73f
I0114 17:34:10.486658   54630 secure_serving.go:178] Serving securely on [::]:10257
I0114 17:34:10.486693   54630 tlsconfig.go:241] Starting DynamicServingCertificateController
I0114 17:34:10.487111   54630 deprecated_insecure_serving.go:53] Serving insecurely on [::]:10252
I0114 17:34:10.487183   54630 leaderelection.go:242] attempting to acquire leader lease  kube-system/kube-controller-manager...
... skipping 134 lines ...
I0114 17:34:11.022230   54630 controllermanager.go:533] Started "persistentvolume-expander"
I0114 17:34:11.022365   54630 pv_controller_base.go:294] Starting persistent volume controller
I0114 17:34:11.022384   54630 shared_informer.go:206] Waiting for caches to sync for persistent volume
I0114 17:34:11.022422   54630 expand_controller.go:319] Starting expand controller
I0114 17:34:11.022443   54630 shared_informer.go:206] Waiting for caches to sync for expand
I0114 17:34:11.022956   54630 node_lifecycle_controller.go:77] Sending events to api server
E0114 17:34:11.023004   54630 core.go:231] failed to start cloud node lifecycle controller: no cloud provider provided
W0114 17:34:11.023015   54630 controllermanager.go:525] Skipping "cloud-node-lifecycle"
W0114 17:34:11.024338   54630 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
I0114 17:34:11.024439   54630 clusterroleaggregation_controller.go:148] Starting ClusterRoleAggregator
I0114 17:34:11.024464   54630 shared_informer.go:206] Waiting for caches to sync for ClusterRoleAggregator
I0114 17:34:11.024502   54630 controllermanager.go:533] Started "clusterrole-aggregation"
W0114 17:34:11.031812   54630 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
... skipping 26 lines ...
I0114 17:34:11.539634   54630 shared_informer.go:206] Waiting for caches to sync for garbage collector
I0114 17:34:11.539664   54630 controllermanager.go:533] Started "garbagecollector"
I0114 17:34:11.539710   54630 graph_builder.go:282] GraphBuilder running
I0114 17:34:11.541361   54630 controllermanager.go:533] Started "disruption"
I0114 17:34:11.541918   54630 disruption.go:330] Starting disruption controller
I0114 17:34:11.541933   54630 shared_informer.go:206] Waiting for caches to sync for disruption
E0114 17:34:11.542872   54630 core.go:90] Failed to start service controller: WARNING: no cloud provider provided, services of type LoadBalancer will fail
W0114 17:34:11.542906   54630 controllermanager.go:525] Skipping "service"
{
  "major": "1",
  "minor": "18+",
  "gitVersion": "v1.18.0-alpha.1.679+13ea65e4dae73f",
  "gitCommit": "13ea65e4dae73f7fb9735ac8b4549fa95f25a801",
  "gitTreeState": "clean",
  "buildDate": "2020-01-14T15:09:19Z",
  "goVersion": "go1.13.5",
  "compiler": "gc",
  "platform": "linux/amd64"
}
W0114 17:34:11.584042   54630 actual_state_of_world.go:506] Failed to update statusUpdateNeeded field in actual state of world: Failed to set statusUpdateNeeded to needed true, because nodeName="127.0.0.1" does not exist
I0114 17:34:11.615158   54630 shared_informer.go:213] Caches are synced for deployment 
I0114 17:34:11.615751   54630 shared_informer.go:213] Caches are synced for endpoint 
I0114 17:34:11.617801   54630 shared_informer.go:213] Caches are synced for HPA 
I0114 17:34:11.622673   54630 shared_informer.go:213] Caches are synced for expand 
I0114 17:34:11.622675   54630 shared_informer.go:213] Caches are synced for persistent volume 
I0114 17:34:11.628216   54630 shared_informer.go:213] Caches are synced for ReplicationController 
... skipping 2 lines ...
I0114 17:34:11.628339   54630 shared_informer.go:213] Caches are synced for TTL 
I0114 17:34:11.628361   54630 shared_informer.go:213] Caches are synced for ReplicaSet 
I0114 17:34:11.628530   54630 shared_informer.go:213] Caches are synced for PVC protection 
I0114 17:34:11.632561   54630 shared_informer.go:213] Caches are synced for service account 
I0114 17:34:11.634696   54630 shared_informer.go:213] Caches are synced for namespace 
I0114 17:34:11.634924   51184 controller.go:606] quota admission added evaluator for: serviceaccounts
E0114 17:34:11.639959   54630 clusterroleaggregation_controller.go:180] edit failed with : Operation cannot be fulfilled on clusterroles.rbac.authorization.k8s.io "edit": the object has been modified; please apply your changes to the latest version and try again
E0114 17:34:11.646949   54630 clusterroleaggregation_controller.go:180] admin failed with : Operation cannot be fulfilled on clusterroles.rbac.authorization.k8s.io "admin": the object has been modified; please apply your changes to the latest version and try again
E0114 17:34:11.649230   54630 clusterroleaggregation_controller.go:180] edit failed with : Operation cannot be fulfilled on clusterroles.rbac.authorization.k8s.io "edit": the object has been modified; please apply your changes to the latest version and try again
I0114 17:34:11.656819   54630 shared_informer.go:213] Caches are synced for GC 
I0114 17:34:11.657184   54630 shared_informer.go:213] Caches are synced for job 
E0114 17:34:11.657709   54630 clusterroleaggregation_controller.go:180] admin failed with : Operation cannot be fulfilled on clusterroles.rbac.authorization.k8s.io "admin": the object has been modified; please apply your changes to the latest version and try again
I0114 17:34:11.657914   54630 shared_informer.go:213] Caches are synced for PV protection 
I0114 17:34:11.714627   54630 shared_informer.go:213] Caches are synced for attach detach 
+++ [0114 17:34:11] Testing kubectl version: check client only output matches expected output
Successful: the flag '--client' shows correct client info
I0114 17:34:11.912472   54630 shared_informer.go:213] Caches are synced for taint 
I0114 17:34:11.912574   54630 node_lifecycle_controller.go:1443] Initializing eviction metric for zone: 
... skipping 68 lines ...
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_RESTMapper_evaluation_tests
+++ [0114 17:34:15] Creating namespace namespace-1579023255-25274
namespace/namespace-1579023255-25274 created
Context "test" modified.
+++ [0114 17:34:15] Testing RESTMapper
+++ [0114 17:34:15] "kubectl get unknownresourcetype" returns error as expected: error: the server doesn't have a resource type "unknownresourcetype"
+++ exit code: 0
NAME                              SHORTNAMES   APIGROUP                       NAMESPACED   KIND
bindings                                                                      true         Binding
componentstatuses                 cs                                          false        ComponentStatus
configmaps                        cm                                          true         ConfigMap
endpoints                         ep                                          true         Endpoints
... skipping 601 lines ...
has:valid-pod
Successful
message:NAME        READY   STATUS    RESTARTS   AGE
valid-pod   0/1     Pending   0          1s
has:valid-pod
core.sh:186: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
error: resource(s) were provided, but no name, label selector, or --all flag specified
core.sh:190: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
core.sh:194: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
error: setting 'all' parameter but found a non empty selector. 
core.sh:198: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
core.sh:202: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
pod "valid-pod" force deleted
core.sh:206: Successful get pods -l'name in (valid-pod)' {{range.items}}{{.metadata.name}}:{{end}}: 
core.sh:211: Successful get namespaces {{range.items}}{{ if eq .metadata.name \"test-kubectl-describe-pod\" }}found{{end}}{{end}}:: :
... skipping 12 lines ...
poddisruptionbudget.policy/test-pdb-2 created
core.sh:245: Successful get pdb/test-pdb-2 --namespace=test-kubectl-describe-pod {{.spec.minAvailable}}: 50%
poddisruptionbudget.policy/test-pdb-3 created
core.sh:251: Successful get pdb/test-pdb-3 --namespace=test-kubectl-describe-pod {{.spec.maxUnavailable}}: 2
poddisruptionbudget.policy/test-pdb-4 created
core.sh:255: Successful get pdb/test-pdb-4 --namespace=test-kubectl-describe-pod {{.spec.maxUnavailable}}: 50%
error: min-available and max-unavailable cannot be both specified
core.sh:261: Successful get pods --namespace=test-kubectl-describe-pod {{range.items}}{{.metadata.name}}:{{end}}: 
pod/env-test-pod created
matched TEST_CMD_1
matched <set to the key 'key-1' in secret 'test-secret'>
matched TEST_CMD_2
matched <set to the key 'key-2' of config map 'test-configmap'>
... skipping 188 lines ...
pod/valid-pod patched
core.sh:470: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: changed-with-yaml:
pod/valid-pod patched
core.sh:475: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:3.1:
pod/valid-pod patched
core.sh:491: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: nginx:
+++ [0114 17:34:55] "kubectl patch with resourceVersion 526" returns error as expected: Error from server (Conflict): Operation cannot be fulfilled on pods "valid-pod": the object has been modified; please apply your changes to the latest version and try again
pod "valid-pod" deleted
pod/valid-pod replaced
core.sh:515: Successful get pod valid-pod {{(index .spec.containers 0).name}}: replaced-k8s-serve-hostname
Successful
message:error: --grace-period must have --force specified
has:\-\-grace-period must have \-\-force specified
Successful
message:error: --timeout must have --force specified
has:\-\-timeout must have \-\-force specified
node/node-v1-test created
W0114 17:34:56.892960   54630 actual_state_of_world.go:506] Failed to update statusUpdateNeeded field in actual state of world: Failed to set statusUpdateNeeded to needed true, because nodeName="node-v1-test" does not exist
I0114 17:34:56.915430   54630 event.go:278] Event(v1.ObjectReference{Kind:"Node", Namespace:"", Name:"node-v1-test", UID:"1ac92f7b-63fb-4dae-8009-875714791e4a", APIVersion:"", ResourceVersion:"", FieldPath:""}): type: 'Normal' reason: 'RegisteredNode' Node node-v1-test event: Registered Node node-v1-test in Controller
node/node-v1-test replaced
core.sh:552: Successful get node node-v1-test {{.metadata.annotations.a}}: b
node "node-v1-test" deleted
core.sh:559: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: nginx:
core.sh:562: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: k8s.gcr.io/serve_hostname:
... skipping 23 lines ...
spec:
  containers:
  - image: k8s.gcr.io/pause:2.0
    name: kubernetes-pause
has:localonlyvalue
core.sh:585: Successful get pod valid-pod {{.metadata.labels.name}}: valid-pod
error: 'name' already has a value (valid-pod), and --overwrite is false
core.sh:589: Successful get pod valid-pod {{.metadata.labels.name}}: valid-pod
core.sh:593: Successful get pod valid-pod {{.metadata.labels.name}}: valid-pod
pod/valid-pod labeled
core.sh:597: Successful get pod valid-pod {{.metadata.labels.name}}: valid-pod-super-sayan
core.sh:601: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
... skipping 86 lines ...
+++ Running case: test-cmd.run_kubectl_create_error_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_kubectl_create_error_tests
+++ [0114 17:35:07] Creating namespace namespace-1579023307-21172
namespace/namespace-1579023307-21172 created
Context "test" modified.
+++ [0114 17:35:07] Testing kubectl create with error
Error: must specify one of -f and -k

Create a resource from a file or from stdin.

 JSON and YAML formats are accepted.

Examples:
... skipping 41 lines ...

Usage:
  kubectl create -f FILENAME [options]

Use "kubectl <command> --help" for more information about a given command.
Use "kubectl options" for a list of global command-line options (applies to all commands).
+++ [0114 17:35:07] "kubectl create with empty string list" returns error as expected: error: error validating "hack/testdata/invalid-rc-with-empty-args.yaml": error validating data: ValidationError(ReplicationController.spec.template.spec.containers[0].args): unknown object type "nil" in ReplicationController.spec.template.spec.containers[0].args[0]; if you choose to ignore these errors, turn validation off with --validate=false
kubectl convert is DEPRECATED and will be removed in a future version.
In order to convert, kubectl apply the object to the cluster, then kubectl get at the desired version.
+++ exit code: 0
Recording: run_kubectl_apply_tests
Running command: run_kubectl_apply_tests

... skipping 17 lines ...
pod "test-pod" deleted
customresourcedefinition.apiextensions.k8s.io/resources.mygroup.example.com created
I0114 17:35:10.940310   51184 client.go:361] parsed scheme: "endpoint"
I0114 17:35:10.940366   51184 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 17:35:10.944265   51184 controller.go:606] quota admission added evaluator for: resources.mygroup.example.com
kind.mygroup.example.com/myobj serverside-applied (server dry run)
Error from server (NotFound): resources.mygroup.example.com "myobj" not found
customresourcedefinition.apiextensions.k8s.io "resources.mygroup.example.com" deleted
+++ exit code: 0
Recording: run_kubectl_run_tests
Running command: run_kubectl_run_tests

+++ Running case: test-cmd.run_kubectl_run_tests 
... skipping 104 lines ...
Context "test" modified.
+++ [0114 17:35:15] Testing kubectl create filter
create.sh:30: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
pod/selector-test-pod created
create.sh:34: Successful get pods selector-test-pod {{.metadata.labels.name}}: selector-test-pod
Successful
message:Error from server (NotFound): pods "selector-test-pod-dont-apply" not found
has:pods "selector-test-pod-dont-apply" not found
pod "selector-test-pod" deleted
+++ exit code: 0
Recording: run_kubectl_apply_deployments_tests
Running command: run_kubectl_apply_deployments_tests

... skipping 18 lines ...
apps.sh:130: Successful get deployments my-depl {{.spec.template.metadata.labels.l1}}: l1
apps.sh:131: Successful get deployments my-depl {{.spec.selector.matchLabels.l1}}: l1
apps.sh:132: Successful get deployments my-depl {{.metadata.labels.l1}}: <no value>
deployment.apps "my-depl" deleted
replicaset.apps "my-depl-64b97f7d4d" deleted
pod "my-depl-64b97f7d4d-l9f92" deleted
E0114 17:35:17.417884   54630 replica_set.go:534] sync "namespace-1579023315-1245/my-depl-64b97f7d4d" failed with Operation cannot be fulfilled on replicasets.apps "my-depl-64b97f7d4d": StorageError: invalid object, Code: 4, Key: /registry/replicasets/namespace-1579023315-1245/my-depl-64b97f7d4d, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 9eff3ed8-8fea-4f7a-8922-51e27abf29e6, UID in object meta: 
apps.sh:138: Successful get deployments {{range.items}}{{.metadata.name}}:{{end}}: 
apps.sh:139: Successful get replicasets {{range.items}}{{.metadata.name}}:{{end}}: 
apps.sh:140: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
apps.sh:144: Successful get deployments {{range.items}}{{.metadata.name}}:{{end}}: 
deployment.apps/nginx created
I0114 17:35:17.993894   54630 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579023315-1245", Name:"nginx", UID:"334e61fe-5a23-4b04-834a-d7425259b8b5", APIVersion:"apps/v1", ResourceVersion:"621", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-8484dd655 to 3
I0114 17:35:17.999910   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023315-1245", Name:"nginx-8484dd655", UID:"99cd3ec6-2201-4686-b04a-ec27db21bf4e", APIVersion:"apps/v1", ResourceVersion:"622", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-8484dd655-g9szv
I0114 17:35:18.002622   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023315-1245", Name:"nginx-8484dd655", UID:"99cd3ec6-2201-4686-b04a-ec27db21bf4e", APIVersion:"apps/v1", ResourceVersion:"622", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-8484dd655-crbbt
I0114 17:35:18.004814   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023315-1245", Name:"nginx-8484dd655", UID:"99cd3ec6-2201-4686-b04a-ec27db21bf4e", APIVersion:"apps/v1", ResourceVersion:"622", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-8484dd655-dhmwt
apps.sh:148: Successful get deployment nginx {{.metadata.name}}: nginx
I0114 17:35:21.833634   54630 horizontal.go:353] Horizontal Pod Autoscaler frontend has been deleted in namespace-1579023304-30560
Successful
message:Error from server (Conflict): error when applying patch:
{"metadata":{"annotations":{"kubectl.kubernetes.io/last-applied-configuration":"{\"apiVersion\":\"apps/v1\",\"kind\":\"Deployment\",\"metadata\":{\"annotations\":{},\"labels\":{\"name\":\"nginx\"},\"name\":\"nginx\",\"namespace\":\"namespace-1579023315-1245\",\"resourceVersion\":\"99\"},\"spec\":{\"replicas\":3,\"selector\":{\"matchLabels\":{\"name\":\"nginx2\"}},\"template\":{\"metadata\":{\"labels\":{\"name\":\"nginx2\"}},\"spec\":{\"containers\":[{\"image\":\"k8s.gcr.io/nginx:test-cmd\",\"name\":\"nginx\",\"ports\":[{\"containerPort\":80}]}]}}}}\n"},"resourceVersion":"99"},"spec":{"selector":{"matchLabels":{"name":"nginx2"}},"template":{"metadata":{"labels":{"name":"nginx2"}}}}}
to:
Resource: "apps/v1, Resource=deployments", GroupVersionKind: "apps/v1, Kind=Deployment"
Name: "nginx", Namespace: "namespace-1579023315-1245"
for: "hack/testdata/deployment-label-change2.yaml": Operation cannot be fulfilled on deployments.apps "nginx": the object has been modified; please apply your changes to the latest version and try again
has:Error from server (Conflict)
deployment.apps/nginx configured
I0114 17:35:27.607037   54630 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579023315-1245", Name:"nginx", UID:"64d61947-5165-463c-bc23-9c57f88f2fbd", APIVersion:"apps/v1", ResourceVersion:"663", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-668b6c7744 to 3
I0114 17:35:27.610350   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023315-1245", Name:"nginx-668b6c7744", UID:"7e316cb0-367e-4b2d-8e43-15a5bf8b6e09", APIVersion:"apps/v1", ResourceVersion:"664", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-668b6c7744-47phg
I0114 17:35:27.614263   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023315-1245", Name:"nginx-668b6c7744", UID:"7e316cb0-367e-4b2d-8e43-15a5bf8b6e09", APIVersion:"apps/v1", ResourceVersion:"664", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-668b6c7744-2kfmm
I0114 17:35:27.615731   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023315-1245", Name:"nginx-668b6c7744", UID:"7e316cb0-367e-4b2d-8e43-15a5bf8b6e09", APIVersion:"apps/v1", ResourceVersion:"664", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-668b6c7744-tzh5z
Successful
... skipping 169 lines ...
+++ [0114 17:35:35] Creating namespace namespace-1579023335-879
namespace/namespace-1579023335-879 created
Context "test" modified.
+++ [0114 17:35:35] Testing kubectl get
get.sh:29: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:Error from server (NotFound): pods "abc" not found
has:pods "abc" not found
get.sh:37: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:Error from server (NotFound): pods "abc" not found
has:pods "abc" not found
get.sh:45: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:{
    "apiVersion": "v1",
    "items": [],
... skipping 23 lines ...
has not:No resources found
Successful
message:NAME
has not:No resources found
get.sh:73: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:error: the server doesn't have a resource type "foobar"
has not:No resources found
Successful
message:No resources found in namespace-1579023335-879 namespace.
has:No resources found
Successful
message:
has not:No resources found
Successful
message:No resources found in namespace-1579023335-879 namespace.
has:No resources found
get.sh:93: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:Error from server (NotFound): pods "abc" not found
has:pods "abc" not found
Successful
message:Error from server (NotFound): pods "abc" not found
has not:List
Successful
message:I0114 17:35:37.237734   65177 loader.go:375] Config loaded from file:  /tmp/tmp.VFlSiaN67c/.kube/config
I0114 17:35:37.239120   65177 round_trippers.go:443] GET http://127.0.0.1:8080/version?timeout=32s 200 OK in 1 milliseconds
I0114 17:35:37.263014   65177 round_trippers.go:443] GET http://127.0.0.1:8080/api/v1/namespaces/default/pods 200 OK in 1 milliseconds
I0114 17:35:37.264826   65177 round_trippers.go:443] GET http://127.0.0.1:8080/api/v1/namespaces/default/replicationcontrollers 200 OK in 1 milliseconds
... skipping 479 lines ...
Successful
message:NAME    DATA   AGE
one     0      0s
three   0      0s
two     0      0s
STATUS    REASON          MESSAGE
Failure   InternalError   an error on the server ("unable to decode an event from the watch stream: net/http: request canceled (Client.Timeout exceeded while reading body)") has prevented the request from succeeding
has not:watch is only supported on individual resources
Successful
message:STATUS    REASON          MESSAGE
Failure   InternalError   an error on the server ("unable to decode an event from the watch stream: net/http: request canceled (Client.Timeout exceeded while reading body)") has prevented the request from succeeding
has not:watch is only supported on individual resources
+++ [0114 17:35:43] Creating namespace namespace-1579023343-10495
namespace/namespace-1579023343-10495 created
Context "test" modified.
get.sh:153: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
pod/valid-pod created
... skipping 56 lines ...
}
get.sh:158: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
<no value>
Successful
message:valid-pod:
has:valid-pod:
Successful
message:error: error executing jsonpath "{.missing}": Error executing template: missing is not found. Printing more information for debugging the template:
	template was:
		{.missing}
	object given to jsonpath engine was:
		map[string]interface {}{"apiVersion":"v1", "kind":"Pod", "metadata":map[string]interface {}{"creationTimestamp":"2020-01-14T17:35:44Z", "labels":map[string]interface {}{"name":"valid-pod"}, "name":"valid-pod", "namespace":"namespace-1579023343-10495", "resourceVersion":"748", "selfLink":"/api/v1/namespaces/namespace-1579023343-10495/pods/valid-pod", "uid":"1485a6f8-c451-4c41-9fcc-16bd213c1792"}, "spec":map[string]interface {}{"containers":[]interface {}{map[string]interface {}{"image":"k8s.gcr.io/serve_hostname", "imagePullPolicy":"Always", "name":"kubernetes-serve-hostname", "resources":map[string]interface {}{"limits":map[string]interface {}{"cpu":"1", "memory":"512Mi"}, "requests":map[string]interface {}{"cpu":"1", "memory":"512Mi"}}, "terminationMessagePath":"/dev/termination-log", "terminationMessagePolicy":"File"}}, "dnsPolicy":"ClusterFirst", "enableServiceLinks":true, "priority":0, "restartPolicy":"Always", "schedulerName":"default-scheduler", "securityContext":map[string]interface {}{}, "terminationGracePeriodSeconds":30}, "status":map[string]interface {}{"phase":"Pending", "qosClass":"Guaranteed"}}
has:missing is not found
error: error executing template "{{.missing}}": template: output:1:2: executing "output" at <.missing>: map has no entry for key "missing"
Successful
message:Error executing template: template: output:1:2: executing "output" at <.missing>: map has no entry for key "missing". Printing more information for debugging the template:
	template was:
		{{.missing}}
	raw data was:
		{"apiVersion":"v1","kind":"Pod","metadata":{"creationTimestamp":"2020-01-14T17:35:44Z","labels":{"name":"valid-pod"},"name":"valid-pod","namespace":"namespace-1579023343-10495","resourceVersion":"748","selfLink":"/api/v1/namespaces/namespace-1579023343-10495/pods/valid-pod","uid":"1485a6f8-c451-4c41-9fcc-16bd213c1792"},"spec":{"containers":[{"image":"k8s.gcr.io/serve_hostname","imagePullPolicy":"Always","name":"kubernetes-serve-hostname","resources":{"limits":{"cpu":"1","memory":"512Mi"},"requests":{"cpu":"1","memory":"512Mi"}},"terminationMessagePath":"/dev/termination-log","terminationMessagePolicy":"File"}],"dnsPolicy":"ClusterFirst","enableServiceLinks":true,"priority":0,"restartPolicy":"Always","schedulerName":"default-scheduler","securityContext":{},"terminationGracePeriodSeconds":30},"status":{"phase":"Pending","qosClass":"Guaranteed"}}
	object given to template engine was:
		map[apiVersion:v1 kind:Pod metadata:map[creationTimestamp:2020-01-14T17:35:44Z labels:map[name:valid-pod] name:valid-pod namespace:namespace-1579023343-10495 resourceVersion:748 selfLink:/api/v1/namespaces/namespace-1579023343-10495/pods/valid-pod uid:1485a6f8-c451-4c41-9fcc-16bd213c1792] spec:map[containers:[map[image:k8s.gcr.io/serve_hostname imagePullPolicy:Always name:kubernetes-serve-hostname resources:map[limits:map[cpu:1 memory:512Mi] requests:map[cpu:1 memory:512Mi]] terminationMessagePath:/dev/termination-log terminationMessagePolicy:File]] dnsPolicy:ClusterFirst enableServiceLinks:true priority:0 restartPolicy:Always schedulerName:default-scheduler securityContext:map[] terminationGracePeriodSeconds:30] status:map[phase:Pending qosClass:Guaranteed]]
has:map has no entry for key "missing"
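The two failures above are the expected negative cases: kubectl's jsonpath engine rejects `{.missing}` with "missing is not found", and the Go-template path reports "map has no entry for key" for `{{.missing}}`. The shared behavior is a strict dotted-path lookup that errors instead of returning empty. A minimal Python sketch of that lookup (the helper `lookup_path` is hypothetical, not part of kubectl):

```python
def lookup_path(obj, path):
    """Walk a dotted path like '.metadata.name' through nested dicts,
    raising on a missing key the way a strict template engine does."""
    cur = obj
    for key in path.lstrip(".").split("."):
        if not isinstance(cur, dict) or key not in cur:
            raise KeyError(f"{key} is not found")
        cur = cur[key]
    return cur

pod = {"apiVersion": "v1", "kind": "Pod",
       "metadata": {"name": "valid-pod"}}

print(lookup_path(pod, ".metadata.name"))  # → valid-pod
try:
    lookup_path(pod, ".missing")
except KeyError as err:
    print(err)  # missing key reported instead of silent empty output
```

The tests above assert both directions: the happy path resolves, and the missing key raises rather than printing nothing.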
Successful
message:NAME        READY   STATUS    RESTARTS   AGE
valid-pod   0/1     Pending   0          1s
STATUS      REASON          MESSAGE
Failure     InternalError   an error on the server ("unable to decode an event from the watch stream: net/http: request canceled (Client.Timeout exceeded while reading body)") has prevented the request from succeeding
has:STATUS
Successful
message:NAME        READY   STATUS    RESTARTS   AGE
valid-pod   0/1     Pending   0          1s
STATUS      REASON          MESSAGE
Failure     InternalError   an error on the server ("unable to decode an event from the watch stream: net/http: request canceled (Client.Timeout exceeded while reading body)") has prevented the request from succeeding
has:valid-pod
Successful
message:pod/valid-pod
status/<unknown>
has not:STATUS
Successful
... skipping 45 lines ...
      (Client.Timeout exceeded while reading body)'
    reason: UnexpectedServerResponse
  - message: 'unable to decode an event from the watch stream: net/http: request canceled
      (Client.Timeout exceeded while reading body)'
    reason: ClientWatchDecoding
kind: Status
message: 'an error on the server ("unable to decode an event from the watch stream:
  net/http: request canceled (Client.Timeout exceeded while reading body)") has prevented
  the request from succeeding'
metadata: {}
reason: InternalError
status: Failure
has not:STATUS
... skipping 42 lines ...
      (Client.Timeout exceeded while reading body)'
    reason: UnexpectedServerResponse
  - message: 'unable to decode an event from the watch stream: net/http: request canceled
      (Client.Timeout exceeded while reading body)'
    reason: ClientWatchDecoding
kind: Status
message: 'an error on the server ("unable to decode an event from the watch stream:
  net/http: request canceled (Client.Timeout exceeded while reading body)") has prevented
  the request from succeeding'
metadata: {}
reason: InternalError
status: Failure
has:name: valid-pod
Successful
message:Error from server (NotFound): pods "invalid-pod" not found
has:"invalid-pod" not found
pod "valid-pod" deleted
get.sh:196: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
pod/redis-master created
pod/valid-pod created
Successful
... skipping 35 lines ...
+++ command: run_kubectl_exec_pod_tests
+++ [0114 17:35:50] Creating namespace namespace-1579023350-24616
namespace/namespace-1579023350-24616 created
Context "test" modified.
+++ [0114 17:35:50] Testing kubectl exec POD COMMAND
Successful
message:Error from server (NotFound): pods "abc" not found
has:pods "abc" not found
pod/test-pod created
Successful
message:Error from server (BadRequest): pod test-pod does not have a host assigned
has not:pods "test-pod" not found
Successful
message:Error from server (BadRequest): pod test-pod does not have a host assigned
has not:pod or type/name must be specified
pod "test-pod" deleted
+++ exit code: 0
Recording: run_kubectl_exec_resource_name_tests
Running command: run_kubectl_exec_resource_name_tests

... skipping 2 lines ...
+++ command: run_kubectl_exec_resource_name_tests
+++ [0114 17:35:50] Creating namespace namespace-1579023350-6318
namespace/namespace-1579023350-6318 created
Context "test" modified.
+++ [0114 17:35:51] Testing kubectl exec TYPE/NAME COMMAND
Successful
message:error: the server doesn't have a resource type "foo"
has:error:
Successful
message:Error from server (NotFound): deployments.apps "bar" not found
has:"bar" not found
pod/test-pod created
replicaset.apps/frontend created
I0114 17:35:51.779299   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023350-6318", Name:"frontend", UID:"d69cf41b-7a70-4487-9d4a-6425625444c0", APIVersion:"apps/v1", ResourceVersion:"805", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-hjmhc
I0114 17:35:51.781909   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023350-6318", Name:"frontend", UID:"d69cf41b-7a70-4487-9d4a-6425625444c0", APIVersion:"apps/v1", ResourceVersion:"805", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-9kr89
I0114 17:35:51.784945   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023350-6318", Name:"frontend", UID:"d69cf41b-7a70-4487-9d4a-6425625444c0", APIVersion:"apps/v1", ResourceVersion:"805", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-lwpc6
configmap/test-set-env-config created
Successful
message:error: cannot attach to *v1.ConfigMap: selector for *v1.ConfigMap not implemented
has:not implemented
Successful
message:Error from server (BadRequest): pod test-pod does not have a host assigned
has not:not found
Successful
message:Error from server (BadRequest): pod test-pod does not have a host assigned
has not:pod or type/name must be specified
Successful
message:Error from server (BadRequest): pod frontend-9kr89 does not have a host assigned
has not:not found
Successful
message:Error from server (BadRequest): pod frontend-9kr89 does not have a host assigned
has not:pod or type/name must be specified
pod "test-pod" deleted
replicaset.apps "frontend" deleted
configmap "test-set-env-config" deleted
+++ exit code: 0
Recording: run_create_secret_tests
Running command: run_create_secret_tests

+++ Running case: test-cmd.run_create_secret_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_create_secret_tests
Successful
message:Error from server (NotFound): secrets "mysecret" not found
has:secrets "mysecret" not found
Successful
message:Error from server (NotFound): secrets "mysecret" not found
has:secrets "mysecret" not found
Successful
message:user-specified
has:user-specified
Successful
{"kind":"ConfigMap","apiVersion":"v1","metadata":{"name":"tester-update-cm","namespace":"default","selfLink":"/api/v1/namespaces/default/configmaps/tester-update-cm","uid":"1b3649b5-f914-495d-ab0f-60b02d14fb0a","resourceVersion":"825","creationTimestamp":"2020-01-14T17:35:53Z"}}
... skipping 2 lines ...
has:uid
Successful
message:{"kind":"ConfigMap","apiVersion":"v1","metadata":{"name":"tester-update-cm","namespace":"default","selfLink":"/api/v1/namespaces/default/configmaps/tester-update-cm","uid":"1b3649b5-f914-495d-ab0f-60b02d14fb0a","resourceVersion":"829","creationTimestamp":"2020-01-14T17:35:53Z"},"data":{"key1":"config1"}}
has:config1
{"kind":"Status","apiVersion":"v1","metadata":{},"status":"Success","details":{"name":"tester-update-cm","kind":"configmaps","uid":"1b3649b5-f914-495d-ab0f-60b02d14fb0a"}}
Successful
message:Error from server (NotFound): configmaps "tester-update-cm" not found
has:configmaps "tester-update-cm" not found
+++ exit code: 0
Recording: run_kubectl_create_kustomization_directory_tests
Running command: run_kubectl_create_kustomization_directory_tests

+++ Running case: test-cmd.run_kubectl_create_kustomization_directory_tests 
... skipping 110 lines ...
valid-pod   0/1     Pending   0          0s
has:valid-pod
Successful
message:NAME        READY   STATUS    RESTARTS   AGE
valid-pod   0/1     Pending   0          0s
STATUS      REASON          MESSAGE
Failure     InternalError   an error on the server ("unable to decode an event from the watch stream: net/http: request canceled (Client.Timeout exceeded while reading body)") has prevented the request from succeeding
has:Timeout exceeded while reading body
Successful
message:NAME        READY   STATUS    RESTARTS   AGE
valid-pod   0/1     Pending   0          1s
has:valid-pod
Successful
message:error: Invalid timeout value. Timeout must be a single integer in seconds, or an integer followed by a corresponding time unit (e.g. 1s | 2m | 3h)
has:Invalid timeout value
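The rejected request above exercises kubectl's `--timeout` validation: a value must be a plain integer (seconds) or an integer with a time unit such as `1s`, `2m`, or `3h`. A hedged sketch of that grammar as a validator (the exact grammar kubectl accepts is an assumption here; this mirrors only the error message shown):

```python
import re

# Plain integer seconds ("30") or integer plus one of s/m/h ("1s", "2m", "3h").
TIMEOUT_RE = re.compile(r"^\d+([smh])?$")

def valid_timeout(value):
    """Return True when value matches the integer-or-integer-with-unit form."""
    return bool(TIMEOUT_RE.match(value))

for v in ("30", "2m", "invalid"):
    print(v, valid_timeout(v))
```

`"invalid"` fails the match, which is the case the harness expects to produce "Invalid timeout value".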
pod "valid-pod" deleted
+++ exit code: 0
Recording: run_crd_tests
Running command: run_crd_tests

... skipping 240 lines ...
foo.company.com/test patched
crd.sh:236: Successful get foos/test {{.patched}}: value1
foo.company.com/test patched
crd.sh:238: Successful get foos/test {{.patched}}: value2
foo.company.com/test patched
crd.sh:240: Successful get foos/test {{.patched}}: <no value>
+++ [0114 17:36:04] "kubectl patch --local" returns error as expected for CustomResource: error: cannot apply strategic merge patch for company.com/v1, Kind=Foo locally, try --type merge
{
    "apiVersion": "company.com/v1",
    "kind": "Foo",
    "metadata": {
        "annotations": {
            "kubernetes.io/change-cause": "kubectl patch foos/test --server=http://127.0.0.1:8080 --match-server-version=true --patch={\"patched\":null} --type=merge --record=true"
... skipping 300 lines ...
crd.sh:450: Successful get bars {{range.items}}{{.metadata.name}}:{{end}}: 
namespace/non-native-resources created
bar.company.com/test created
crd.sh:455: Successful get bars {{len .items}}: 1
namespace "non-native-resources" deleted
crd.sh:458: Successful get bars {{len .items}}: 0
Error from server (NotFound): namespaces "non-native-resources" not found
customresourcedefinition.apiextensions.k8s.io "foos.company.com" deleted
customresourcedefinition.apiextensions.k8s.io "bars.company.com" deleted
customresourcedefinition.apiextensions.k8s.io "resources.mygroup.example.com" deleted
customresourcedefinition.apiextensions.k8s.io "validfoos.company.com" deleted
+++ exit code: 0
Recording: run_cmd_with_img_tests
... skipping 11 lines ...
I0114 17:36:27.825956   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023387-19410", Name:"test1-6cdffdb5b8", UID:"7d2217c6-f2e9-4354-89b3-89d5d04b120c", APIVersion:"apps/v1", ResourceVersion:"998", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test1-6cdffdb5b8-5cz2m
Successful
message:deployment.apps/test1 created
has:deployment.apps/test1 created
deployment.apps "test1" deleted
Successful
message:error: Invalid image name "InvalidImageName": invalid reference format
has:error: Invalid image name "InvalidImageName": invalid reference format
+++ exit code: 0
W0114 17:36:28.072300   51184 cacher.go:162] Terminating all watchers from cacher *unstructured.Unstructured
E0114 17:36:28.073605   54630 reflector.go:320] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to watch *v1.PartialObjectMetadata: the server could not find the requested resource
+++ [0114 17:36:28] Testing recursive resources
+++ [0114 17:36:28] Creating namespace namespace-1579023388-31650
namespace/namespace-1579023388-31650 created
W0114 17:36:28.179494   51184 cacher.go:162] Terminating all watchers from cacher *unstructured.Unstructured
E0114 17:36:28.180406   54630 reflector.go:320] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to watch *v1.PartialObjectMetadata: the server could not find the requested resource
Context "test" modified.
W0114 17:36:28.320685   51184 cacher.go:162] Terminating all watchers from cacher *unstructured.Unstructured
E0114 17:36:28.321874   54630 reflector.go:320] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to watch *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:202: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
W0114 17:36:28.428002   51184 cacher.go:162] Terminating all watchers from cacher *unstructured.Unstructured
E0114 17:36:28.429225   54630 reflector.go:320] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to watch *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:206: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
Successful
message:pod/busybox0 created
pod/busybox1 created
error: error validating "hack/testdata/recursive/pod/pod/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
has:error validating data: kind not set
generic-resources.sh:211: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
generic-resources.sh:220: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: busybox:busybox:
Successful
message:error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
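Every "Object 'Kind' is missing" failure in this run traces back to the broken fixture, which spells the field `"ind"` instead of `"kind"`. The decoder's check can be sketched minimally in Python (`require_kind` is a hypothetical helper, not kubectl's actual decoder):

```python
import json

def require_kind(doc):
    """Parse a JSON manifest and reject it when the 'kind' field is absent,
    mirroring the check the broken busybox fixture trips over."""
    obj = json.loads(doc)
    if "kind" not in obj:
        raise ValueError(f"Object 'Kind' is missing in '{doc}'")
    return obj["kind"]

# The fixture's typo: "ind" where "kind" should be.
broken = '{"apiVersion":"v1","ind":"Pod","metadata":{"name":"busybox2"}}'
try:
    require_kind(broken)
except ValueError as err:
    print(err)
```

A well-formed manifest with `"kind":"Pod"` passes; the typo'd one raises, which is exactly the string the harness greps for.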
E0114 17:36:29.074862   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:227: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
E0114 17:36:29.181769   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:36:29.323164   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:231: Successful get pods {{range.items}}{{.metadata.labels.status}}:{{end}}: replaced:replaced:
Successful
message:pod/busybox0 replaced
pod/busybox1 replaced
error: error validating "hack/testdata/recursive/pod-modify/pod/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
has:error validating data: kind not set
E0114 17:36:29.430468   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:236: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
Successful
message:Name:         busybox0
Namespace:    namespace-1579023388-31650
Priority:     0
Node:         <none>
... skipping 159 lines ...
has:Object 'Kind' is missing
generic-resources.sh:246: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
(Bgeneric-resources.sh:250: Successful get pods {{range.items}}{{.metadata.annotations.annotatekey}}:{{end}}: annotatevalue:annotatevalue:
Successful
message:pod/busybox0 annotated
pod/busybox1 annotated
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
E0114 17:36:30.076105   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:255: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
E0114 17:36:30.183106   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:36:30.324413   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:36:30.431641   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:259: Successful get pods {{range.items}}{{.metadata.labels.status}}:{{end}}: replaced:replaced:
Successful
message:error: error validating "hack/testdata/recursive/pod-modify/pod/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
has:error validating data: kind not set
generic-resources.sh:265: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
deployment.apps/nginx created
I0114 17:36:30.788208   54630 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579023388-31650", Name:"nginx", UID:"e9f026d8-9997-4869-bde9-f6fd750477d8", APIVersion:"apps/v1", ResourceVersion:"1020", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-f87d999f7 to 3
I0114 17:36:30.792491   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023388-31650", Name:"nginx-f87d999f7", UID:"aed30809-f36f-4683-aba8-1b41c3103cea", APIVersion:"apps/v1", ResourceVersion:"1021", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-f87d999f7-zgqdc
I0114 17:36:30.795419   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023388-31650", Name:"nginx-f87d999f7", UID:"aed30809-f36f-4683-aba8-1b41c3103cea", APIVersion:"apps/v1", ResourceVersion:"1021", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-f87d999f7-t7lk5
I0114 17:36:30.797839   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023388-31650", Name:"nginx-f87d999f7", UID:"aed30809-f36f-4683-aba8-1b41c3103cea", APIVersion:"apps/v1", ResourceVersion:"1021", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-f87d999f7-ktpwc
generic-resources.sh:269: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx:
generic-resources.sh:270: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
E0114 17:36:31.077451   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
kubectl convert is DEPRECATED and will be removed in a future version.
In order to convert, kubectl apply the object to the cluster, then kubectl get at the desired version.
E0114 17:36:31.184870   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:274: Successful get deployment nginx {{ .apiVersion }}: apps/v1
Successful
message:apiVersion: extensions/v1beta1
kind: Deployment
metadata:
  creationTimestamp: null
... skipping 33 lines ...
      schedulerName: default-scheduler
      securityContext: {}
      terminationGracePeriodSeconds: 30
status: {}
has:extensions/v1beta1
deployment.apps "nginx" deleted
E0114 17:36:31.325674   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:281: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
E0114 17:36:31.432893   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:285: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
Successful
message:kubectl convert is DEPRECATED and will be removed in a future version.
In order to convert, kubectl apply the object to the cluster, then kubectl get at the desired version.
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
generic-resources.sh:290: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0114 17:36:31.774753   54630 namespace_controller.go:185] Namespace has been deleted non-native-resources
Successful
message:busybox0:busybox1:error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:busybox0:busybox1:
Successful
message:busybox0:busybox1:error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
generic-resources.sh:299: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
pod/busybox0 labeled
pod/busybox1 labeled
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
E0114 17:36:32.078685   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:304: Successful get pods {{range.items}}{{.metadata.labels.mylabel}}:{{end}}: myvalue:myvalue:
Successful
message:pod/busybox0 labeled
pod/busybox1 labeled
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
E0114 17:36:32.186157   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:309: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
pod/busybox0 patched
pod/busybox1 patched
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
E0114 17:36:32.327177   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:314: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: prom/busybox:prom/busybox:
Successful
message:pod/busybox0 patched
pod/busybox1 patched
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
E0114 17:36:32.434037   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:319: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
generic-resources.sh:323: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
pod "busybox0" force deleted
pod "busybox1" force deleted
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
generic-resources.sh:328: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
replicationcontroller/busybox0 created
I0114 17:36:33.027437   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579023388-31650", Name:"busybox0", UID:"de84b96b-32c5-44a3-adde-16edd40798d1", APIVersion:"v1", ResourceVersion:"1052", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox0-vhzbf
replicationcontroller/busybox1 created
error: error validating "hack/testdata/recursive/rc/rc/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
I0114 17:36:33.034114   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579023388-31650", Name:"busybox1", UID:"bc0b6f3b-79b7-4a05-a154-df7135344313", APIVersion:"v1", ResourceVersion:"1054", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox1-bdwz5
E0114 17:36:33.080068   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:332: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
E0114 17:36:33.187575   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:337: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
E0114 17:36:33.328377   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:338: Successful get rc busybox0 {{.spec.replicas}}: 1
E0114 17:36:33.435292   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:339: Successful get rc busybox1 {{.spec.replicas}}: 1
generic-resources.sh:344: Successful get hpa busybox0 {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 1 2 80
(Bgeneric-resources.sh:345: Successful get hpa busybox1 {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 1 2 80
Successful
message:horizontalpodautoscaler.autoscaling/busybox0 autoscaled
horizontalpodautoscaler.autoscaling/busybox1 autoscaled
error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
has:Object 'Kind' is missing
horizontalpodautoscaler.autoscaling "busybox0" deleted
horizontalpodautoscaler.autoscaling "busybox1" deleted
generic-resources.sh:353: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
E0114 17:36:34.081225   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:354: Successful get rc busybox0 {{.spec.replicas}}: 1
E0114 17:36:34.188787   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:355: Successful get rc busybox1 {{.spec.replicas}}: 1
E0114 17:36:34.329581   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:36:34.436541   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:359: Successful get service busybox0 {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: <no value> 80
generic-resources.sh:360: Successful get service busybox1 {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: <no value> 80
Successful
message:service/busybox0 exposed
service/busybox1 exposed
error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
has:Object 'Kind' is missing
generic-resources.sh:366: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
generic-resources.sh:367: Successful get rc busybox0 {{.spec.replicas}}: 1
generic-resources.sh:368: Successful get rc busybox1 {{.spec.replicas}}: 1
I0114 17:36:34.975337   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579023388-31650", Name:"busybox0", UID:"de84b96b-32c5-44a3-adde-16edd40798d1", APIVersion:"v1", ResourceVersion:"1075", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox0-g565h
I0114 17:36:34.986401   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579023388-31650", Name:"busybox1", UID:"bc0b6f3b-79b7-4a05-a154-df7135344313", APIVersion:"v1", ResourceVersion:"1079", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox1-hkvr4
E0114 17:36:35.082483   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:372: Successful get rc busybox0 {{.spec.replicas}}: 2
E0114 17:36:35.189982   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:373: Successful get rc busybox1 {{.spec.replicas}}: 2
Successful
message:replicationcontroller/busybox0 scaled
replicationcontroller/busybox1 scaled
error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
has:Object 'Kind' is missing
generic-resources.sh:378: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
E0114 17:36:35.331093   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:36:35.437688   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:382: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
replicationcontroller "busybox0" force deleted
replicationcontroller "busybox1" force deleted
error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
has:Object 'Kind' is missing
generic-resources.sh:387: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
deployment.apps/nginx1-deployment created
I0114 17:36:35.780799   54630 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579023388-31650", Name:"nginx1-deployment", UID:"593afc0f-857c-46b9-b1c3-1115aab86f25", APIVersion:"apps/v1", ResourceVersion:"1098", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx1-deployment-7bdbbfb5cf to 2
deployment.apps/nginx0-deployment created
error: error validating "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
I0114 17:36:35.783801   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023388-31650", Name:"nginx1-deployment-7bdbbfb5cf", UID:"3d090b22-1f86-4b03-9686-b005bfbd880b", APIVersion:"apps/v1", ResourceVersion:"1099", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx1-deployment-7bdbbfb5cf-7b6kw
I0114 17:36:35.788806   54630 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579023388-31650", Name:"nginx0-deployment", UID:"4bb326c5-57c0-44c1-abab-c0a2c21da508", APIVersion:"apps/v1", ResourceVersion:"1100", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx0-deployment-57c6bff7f6 to 2
I0114 17:36:35.788848   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023388-31650", Name:"nginx1-deployment-7bdbbfb5cf", UID:"3d090b22-1f86-4b03-9686-b005bfbd880b", APIVersion:"apps/v1", ResourceVersion:"1099", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx1-deployment-7bdbbfb5cf-rr958
I0114 17:36:35.793135   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023388-31650", Name:"nginx0-deployment-57c6bff7f6", UID:"78596658-06d6-435f-bb95-c387ed8ec779", APIVersion:"apps/v1", ResourceVersion:"1105", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx0-deployment-57c6bff7f6-tt622
I0114 17:36:35.799956   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023388-31650", Name:"nginx0-deployment-57c6bff7f6", UID:"78596658-06d6-435f-bb95-c387ed8ec779", APIVersion:"apps/v1", ResourceVersion:"1105", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx0-deployment-57c6bff7f6-xt5rw
generic-resources.sh:391: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx0-deployment:nginx1-deployment:
generic-resources.sh:392: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:k8s.gcr.io/nginx:1.7.9:
E0114 17:36:36.083727   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:36:36.191212   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:396: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:k8s.gcr.io/nginx:1.7.9:
Successful
message:deployment.apps/nginx1-deployment skipped rollback (current template already matches revision 1)
deployment.apps/nginx0-deployment skipped rollback (current template already matches revision 1)
error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
has:Object 'Kind' is missing
deployment.apps/nginx1-deployment paused
deployment.apps/nginx0-deployment paused
E0114 17:36:36.332270   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:36:36.438847   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:404: Successful get deployment {{range.items}}{{.spec.paused}}:{{end}}: true:true:
Successful
message:unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
has:Object 'Kind' is missing
deployment.apps/nginx1-deployment resumed
deployment.apps/nginx0-deployment resumed
... skipping 7 lines ...
1         <none>

deployment.apps/nginx0-deployment 
REVISION  CHANGE-CAUSE
1         <none>

error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
has:nginx0-deployment
Successful
message:deployment.apps/nginx1-deployment 
REVISION  CHANGE-CAUSE
1         <none>

deployment.apps/nginx0-deployment 
REVISION  CHANGE-CAUSE
1         <none>

error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
has:nginx1-deployment
Successful
message:deployment.apps/nginx1-deployment 
REVISION  CHANGE-CAUSE
1         <none>

deployment.apps/nginx0-deployment 
REVISION  CHANGE-CAUSE
1         <none>

error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
has:Object 'Kind' is missing
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
deployment.apps "nginx1-deployment" force deleted
deployment.apps "nginx0-deployment" force deleted
error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
E0114 17:36:37.084995   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:36:37.192545   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:36:37.333753   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:36:37.439818   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:426: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 17:36:38.086453   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:36:38.194050   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/busybox0 created
I0114 17:36:38.219167   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579023388-31650", Name:"busybox0", UID:"9e520145-e0f7-4fc5-ae38-c2f9fc8738e3", APIVersion:"v1", ResourceVersion:"1148", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox0-fwsg7
replicationcontroller/busybox1 created
error: error validating "hack/testdata/recursive/rc/rc/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
I0114 17:36:38.223701   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579023388-31650", Name:"busybox1", UID:"0413cd28-06c0-4036-b652-2765e899757f", APIVersion:"v1", ResourceVersion:"1150", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox1-wgmhp
E0114 17:36:38.334973   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:430: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
E0114 17:36:38.441245   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:no rollbacker has been implemented for "ReplicationController"
no rollbacker has been implemented for "ReplicationController"
unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
has:no rollbacker has been implemented for "ReplicationController"
Successful
message:no rollbacker has been implemented for "ReplicationController"
no rollbacker has been implemented for "ReplicationController"
unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
has:Object 'Kind' is missing
Successful
message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
error: replicationcontrollers "busybox0" pausing is not supported
error: replicationcontrollers "busybox1" pausing is not supported
has:Object 'Kind' is missing
Successful
message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
error: replicationcontrollers "busybox0" pausing is not supported
error: replicationcontrollers "busybox1" pausing is not supported
has:replicationcontrollers "busybox0" pausing is not supported
Successful
message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
error: replicationcontrollers "busybox0" pausing is not supported
error: replicationcontrollers "busybox1" pausing is not supported
has:replicationcontrollers "busybox1" pausing is not supported
Successful
message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
error: replicationcontrollers "busybox0" resuming is not supported
error: replicationcontrollers "busybox1" resuming is not supported
has:Object 'Kind' is missing
Successful
message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
error: replicationcontrollers "busybox0" resuming is not supported
error: replicationcontrollers "busybox1" resuming is not supported
has:replicationcontrollers "busybox0" resuming is not supported
Successful
message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
error: replicationcontrollers "busybox0" resuming is not supported
error: replicationcontrollers "busybox1" resuming is not supported
has:replicationcontrollers "busybox1" resuming is not supported
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
replicationcontroller "busybox0" force deleted
replicationcontroller "busybox1" force deleted
error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
E0114 17:36:39.087854   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:36:39.195428   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:36:39.336376   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:36:39.442320   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Recording: run_namespace_tests
Running command: run_namespace_tests

+++ Running case: test-cmd.run_namespace_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_namespace_tests
+++ [0114 17:36:39] Testing kubectl(v1:namespaces)
namespace/my-namespace created
E0114 17:36:40.089625   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1314: Successful get namespaces/my-namespace {{.metadata.name}}: my-namespace
namespace "my-namespace" deleted
E0114 17:36:40.197004   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:36:40.337940   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:36:40.443920   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
... skipping 14 lines ...
I0114 17:36:44.219251   54630 shared_informer.go:206] Waiting for caches to sync for resource quota
I0114 17:36:44.219298   54630 shared_informer.go:213] Caches are synced for resource quota 
E0114 17:36:44.343910   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:36:44.449920   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0114 17:36:44.748076   54630 shared_informer.go:206] Waiting for caches to sync for garbage collector
I0114 17:36:44.748132   54630 shared_informer.go:213] Caches are synced for garbage collector 
E0114 17:36:45.096879   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:36:45.203887   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/my-namespace condition met
E0114 17:36:45.345461   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:Error from server (NotFound): namespaces "my-namespace" not found
has: not found
E0114 17:36:45.451380   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/my-namespace created
core.sh:1323: Successful get namespaces/my-namespace {{.metadata.name}}: my-namespace
Successful
message:warning: deleting cluster-scoped resources, not scoped to the provided namespace
namespace "kube-node-lease" deleted
namespace "my-namespace" deleted
... skipping 28 lines ...
namespace "namespace-1579023354-3032" deleted
namespace "namespace-1579023355-31742" deleted
namespace "namespace-1579023357-27493" deleted
namespace "namespace-1579023359-20022" deleted
namespace "namespace-1579023387-19410" deleted
namespace "namespace-1579023388-31650" deleted
Error from server (Forbidden): namespaces "default" is forbidden: this namespace may not be deleted
Error from server (Forbidden): namespaces "kube-public" is forbidden: this namespace may not be deleted
Error from server (Forbidden): namespaces "kube-system" is forbidden: this namespace may not be deleted
has:warning: deleting cluster-scoped resources
Successful
message:warning: deleting cluster-scoped resources, not scoped to the provided namespace
namespace "kube-node-lease" deleted
namespace "my-namespace" deleted
namespace "namespace-1579023252-29208" deleted
... skipping 27 lines ...
namespace "namespace-1579023354-3032" deleted
namespace "namespace-1579023355-31742" deleted
namespace "namespace-1579023357-27493" deleted
namespace "namespace-1579023359-20022" deleted
namespace "namespace-1579023387-19410" deleted
namespace "namespace-1579023388-31650" deleted
Error from server (Forbidden): namespaces "default" is forbidden: this namespace may not be deleted
Error from server (Forbidden): namespaces "kube-public" is forbidden: this namespace may not be deleted
Error from server (Forbidden): namespaces "kube-system" is forbidden: this namespace may not be deleted
has:namespace "my-namespace" deleted
core.sh:1335: Successful get namespaces {{range.items}}{{ if eq .metadata.name \"other\" }}found{{end}}{{end}}:: :
E0114 17:36:46.098402   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/other created
E0114 17:36:46.205379   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1339: Successful get namespaces/other {{.metadata.name}}: other
core.sh:1343: Successful get pods --namespace=other {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 17:36:46.346906   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:36:46.452732   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
pod/valid-pod created
core.sh:1347: Successful get pods --namespace=other {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
core.sh:1349: Successful get pods -n other {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
Successful
message:error: a resource cannot be retrieved by name across all namespaces
has:a resource cannot be retrieved by name across all namespaces
core.sh:1356: Successful get pods --namespace=other {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
E0114 17:36:47.099650   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
pod "valid-pod" force deleted
E0114 17:36:47.206663   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1360: Successful get pods --namespace=other {{range.items}}{{.metadata.name}}:{{end}}: 
namespace "other" deleted
E0114 17:36:47.348316   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:36:47.454022   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:36:48.100918   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:36:48.208031   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:36:48.349641   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:36:48.455418   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0114 17:36:48.540371   54630 horizontal.go:353] Horizontal Pod Autoscaler busybox0 has been deleted in namespace-1579023388-31650
I0114 17:36:48.543901   54630 horizontal.go:353] Horizontal Pod Autoscaler busybox1 has been deleted in namespace-1579023388-31650
E0114 17:36:49.102111   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:36:49.209398   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:36:49.351066   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:36:49.456784   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:36:50.103363   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:36:50.210765   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:36:50.351902   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:36:50.458512   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:36:51.104714   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:36:51.213329   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:36:51.353190   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:36:51.468475   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:36:52.106487   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:36:52.214680   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:36:52.354027   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ exit code: 0
Recording: run_secrets_test
Running command: run_secrets_test
E0114 17:36:52.469753   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource

+++ Running case: test-cmd.run_secrets_test 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_secrets_test
+++ [0114 17:36:52] Creating namespace namespace-1579023412-27695
namespace/namespace-1579023412-27695 created
... skipping 36 lines ...
  creationTimestamp: null
  name: test
has not:example.com
core.sh:725: Successful get namespaces {{range.items}}{{ if eq .metadata.name \"test-secrets\" }}found{{end}}{{end}}:: :
namespace/test-secrets created
core.sh:729: Successful get namespaces/test-secrets {{.metadata.name}}: test-secrets
E0114 17:36:53.107559   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:733: Successful get secrets --namespace=test-secrets {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 17:36:53.215717   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
secret/test-secret created
core.sh:737: Successful get secret/test-secret --namespace=test-secrets {{.metadata.name}}: test-secret
E0114 17:36:53.355271   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:738: Successful get secret/test-secret --namespace=test-secrets {{.type}}: test-type
E0114 17:36:53.471070   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
secret "test-secret" deleted
core.sh:748: Successful get secrets --namespace=test-secrets {{range.items}}{{.metadata.name}}:{{end}}: 
secret/test-secret created
core.sh:752: Successful get secret/test-secret --namespace=test-secrets {{.metadata.name}}: test-secret
core.sh:753: Successful get secret/test-secret --namespace=test-secrets {{.type}}: kubernetes.io/dockerconfigjson
E0114 17:36:54.109530   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
secret "test-secret" deleted
E0114 17:36:54.216908   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:763: Successful get secrets --namespace=test-secrets {{range.items}}{{.metadata.name}}:{{end}}: 
(Bsecret/test-secret created
E0114 17:36:54.356534   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:766: Successful get secret/test-secret --namespace=test-secrets {{.metadata.name}}: test-secret
E0114 17:36:54.472346   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:767: Successful get secret/test-secret --namespace=test-secrets {{.type}}: kubernetes.io/tls
secret "test-secret" deleted
secret/test-secret created
core.sh:773: Successful get secret/test-secret --namespace=test-secrets {{.metadata.name}}: test-secret
core.sh:774: Successful get secret/test-secret --namespace=test-secrets {{.type}}: kubernetes.io/tls
secret "test-secret" deleted
E0114 17:36:55.110918   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
secret/secret-string-data created
E0114 17:36:55.218040   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:796: Successful get secret/secret-string-data --namespace=test-secrets  {{.data}}: map[k1:djE= k2:djI=]
E0114 17:36:55.357908   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0114 17:36:55.400471   54630 namespace_controller.go:185] Namespace has been deleted my-namespace
core.sh:797: Successful get secret/secret-string-data --namespace=test-secrets  {{.data}}: map[k1:djE= k2:djI=]
E0114 17:36:55.473625   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:798: Successful get secret/secret-string-data --namespace=test-secrets  {{.stringData}}: <no value>
secret "secret-string-data" deleted
core.sh:807: Successful get secrets --namespace=test-secrets {{range.items}}{{.metadata.name}}:{{end}}: 
secret "test-secret" deleted
I0114 17:36:55.911291   54630 namespace_controller.go:185] Namespace has been deleted kube-node-lease
I0114 17:36:55.940993   54630 namespace_controller.go:185] Namespace has been deleted namespace-1579023255-25274
... skipping 3 lines ...
I0114 17:36:55.957243   54630 namespace_controller.go:185] Namespace has been deleted namespace-1579023274-9024
I0114 17:36:55.959176   54630 namespace_controller.go:185] Namespace has been deleted namespace-1579023275-23669
I0114 17:36:55.971249   54630 namespace_controller.go:185] Namespace has been deleted namespace-1579023260-9688
I0114 17:36:55.977178   54630 namespace_controller.go:185] Namespace has been deleted namespace-1579023271-28882
I0114 17:36:55.982299   54630 namespace_controller.go:185] Namespace has been deleted namespace-1579023276-5192
I0114 17:36:56.011147   54630 namespace_controller.go:185] Namespace has been deleted namespace-1579023267-29337
E0114 17:36:56.112374   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0114 17:36:56.140426   54630 namespace_controller.go:185] Namespace has been deleted namespace-1579023286-14538
I0114 17:36:56.157439   54630 namespace_controller.go:185] Namespace has been deleted namespace-1579023299-21624
I0114 17:36:56.175393   54630 namespace_controller.go:185] Namespace has been deleted namespace-1579023303-15394
I0114 17:36:56.179813   54630 namespace_controller.go:185] Namespace has been deleted namespace-1579023303-28133
I0114 17:36:56.181802   54630 namespace_controller.go:185] Namespace has been deleted namespace-1579023300-12225
I0114 17:36:56.188217   54630 namespace_controller.go:185] Namespace has been deleted namespace-1579023287-31535
I0114 17:36:56.218180   54630 namespace_controller.go:185] Namespace has been deleted namespace-1579023308-18020
I0114 17:36:56.218305   54630 namespace_controller.go:185] Namespace has been deleted namespace-1579023307-21172
E0114 17:36:56.219391   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0114 17:36:56.226125   54630 namespace_controller.go:185] Namespace has been deleted namespace-1579023304-30560
I0114 17:36:56.289591   54630 namespace_controller.go:185] Namespace has been deleted namespace-1579023311-22721
I0114 17:36:56.357985   54630 namespace_controller.go:185] Namespace has been deleted namespace-1579023314-26608
E0114 17:36:56.359351   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0114 17:36:56.391865   54630 namespace_controller.go:185] Namespace has been deleted namespace-1579023334-22893
I0114 17:36:56.393898   54630 namespace_controller.go:185] Namespace has been deleted namespace-1579023333-335
I0114 17:36:56.413445   54630 namespace_controller.go:185] Namespace has been deleted namespace-1579023335-879
I0114 17:36:56.431960   54630 namespace_controller.go:185] Namespace has been deleted namespace-1579023315-1245
I0114 17:36:56.436218   54630 namespace_controller.go:185] Namespace has been deleted namespace-1579023343-10495
I0114 17:36:56.436281   54630 namespace_controller.go:185] Namespace has been deleted namespace-1579023354-13275
I0114 17:36:56.442400   54630 namespace_controller.go:185] Namespace has been deleted namespace-1579023350-24616
I0114 17:36:56.446945   54630 namespace_controller.go:185] Namespace has been deleted namespace-1579023350-6318
E0114 17:36:56.474979   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0114 17:36:56.484536   54630 namespace_controller.go:185] Namespace has been deleted namespace-1579023354-3032
I0114 17:36:56.524935   54630 namespace_controller.go:185] Namespace has been deleted namespace-1579023355-31742
I0114 17:36:56.551705   54630 namespace_controller.go:185] Namespace has been deleted namespace-1579023357-27493
I0114 17:36:56.555271   54630 namespace_controller.go:185] Namespace has been deleted namespace-1579023359-20022
I0114 17:36:56.572669   54630 namespace_controller.go:185] Namespace has been deleted namespace-1579023387-19410
I0114 17:36:56.632104   54630 namespace_controller.go:185] Namespace has been deleted namespace-1579023388-31650
E0114 17:36:57.113649   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:36:57.220526   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:36:57.360662   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0114 17:36:57.390088   54630 namespace_controller.go:185] Namespace has been deleted other
E0114 17:36:57.476192   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:36:58.114932   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:36:58.221638   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:36:58.362074   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:36:58.477447   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:36:59.116287   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:36:59.223068   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:36:59.363388   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:36:59.478813   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:37:00.117517   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:37:00.224425   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:37:00.364647   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:37:00.480146   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ exit code: 0
E0114 17:37:01.118642   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Recording: run_configmap_tests
Running command: run_configmap_tests

+++ Running case: test-cmd.run_configmap_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_configmap_tests
+++ [0114 17:37:01] Creating namespace namespace-1579023421-9644
E0114 17:37:01.226013   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/namespace-1579023421-9644 created
Context "test" modified.
+++ [0114 17:37:01] Testing configmaps
E0114 17:37:01.365847   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:37:01.481515   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
configmap/test-configmap created
core.sh:28: Successful get configmap/test-configmap {{.metadata.name}}: test-configmap
configmap "test-configmap" deleted
core.sh:33: Successful get namespaces {{range.items}}{{ if eq .metadata.name \"test-configmaps\" }}found{{end}}{{end}}:: :
namespace/test-configmaps created
core.sh:37: Successful get namespaces/test-configmaps {{.metadata.name}}: test-configmaps
E0114 17:37:02.119818   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:41: Successful get configmaps {{range.items}}{{ if eq .metadata.name \"test-configmap\" }}found{{end}}{{end}}:: :
E0114 17:37:02.227537   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:42: Successful get configmaps {{range.items}}{{ if eq .metadata.name \"test-binary-configmap\" }}found{{end}}{{end}}:: :
configmap/test-configmap created
E0114 17:37:02.367177   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
configmap/test-binary-configmap created
E0114 17:37:02.482854   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:48: Successful get configmap/test-configmap --namespace=test-configmaps {{.metadata.name}}: test-configmap
core.sh:49: Successful get configmap/test-binary-configmap --namespace=test-configmaps {{.metadata.name}}: test-binary-configmap
configmap "test-configmap" deleted
configmap "test-binary-configmap" deleted
namespace "test-configmaps" deleted
E0114 17:37:03.121017   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:37:03.228799   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:37:03.368402   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:37:03.484067   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:37:04.122310   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:37:04.230051   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:37:04.369784   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:37:04.485348   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:37:05.123588   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:37:05.231374   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:37:05.371017   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:37:05.486725   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0114 17:37:06.040352   54630 namespace_controller.go:185] Namespace has been deleted test-secrets
E0114 17:37:06.124908   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:37:06.232590   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:37:06.372189   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:37:06.487878   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:37:07.126337   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:37:07.233835   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:37:07.373453   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:37:07.489183   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:37:08.127169   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ exit code: 0
E0114 17:37:08.235143   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Recording: run_client_config_tests
Running command: run_client_config_tests

+++ Running case: test-cmd.run_client_config_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_client_config_tests
+++ [0114 17:37:08] Creating namespace namespace-1579023428-26318
namespace/namespace-1579023428-26318 created
E0114 17:37:08.374639   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Context "test" modified.
+++ [0114 17:37:08] Testing client config
E0114 17:37:08.490498   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:error: stat missing: no such file or directory
has:missing: no such file or directory
Successful
message:error: stat missing: no such file or directory
has:missing: no such file or directory
Successful
message:error: stat missing: no such file or directory
has:missing: no such file or directory
Successful
message:Error in configuration: context was not found for specified context: missing-context
has:context was not found for specified context: missing-context
Successful
message:error: no server found for cluster "missing-cluster"
has:no server found for cluster "missing-cluster"
Successful
message:error: auth info "missing-user" does not exist
has:auth info "missing-user" does not exist
Successful
message:error: error loading config file "/tmp/newconfig.yaml": no kind "Config" is registered for version "v-1" in scheme "k8s.io/client-go/tools/clientcmd/api/latest/latest.go:50"
has:error loading config file
Successful
message:error: stat missing-config: no such file or directory
has:no such file or directory
E0114 17:37:09.128713   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ exit code: 0
Recording: run_service_accounts_tests
Running command: run_service_accounts_tests

+++ Running case: test-cmd.run_service_accounts_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_service_accounts_tests
+++ [0114 17:37:09] Creating namespace namespace-1579023429-19044
E0114 17:37:09.236483   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/namespace-1579023429-19044 created
E0114 17:37:09.375983   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Context "test" modified.
+++ [0114 17:37:09] Testing service accounts
E0114 17:37:09.491755   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:828: Successful get namespaces {{range.items}}{{ if eq .metadata.name \"test-service-accounts\" }}found{{end}}{{end}}:: :
namespace/test-service-accounts created
core.sh:832: Successful get namespaces/test-service-accounts {{.metadata.name}}: test-service-accounts
serviceaccount/test-service-account created
core.sh:838: Successful get serviceaccount/test-service-account --namespace=test-service-accounts {{.metadata.name}}: test-service-account
serviceaccount "test-service-account" deleted
namespace "test-service-accounts" deleted
E0114 17:37:10.129881   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:37:10.237928   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:37:10.377191   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:37:10.493003   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:37:11.131218   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:37:11.239276   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:37:11.378497   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:37:11.494435   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:37:12.132469   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:37:12.240547   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:37:12.379784   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:37:12.496032   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:37:13.133889   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0114 17:37:13.168681   54630 namespace_controller.go:185] Namespace has been deleted test-configmaps
E0114 17:37:13.241907   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:37:13.380975   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:37:13.497360   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:37:14.135228   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:37:14.243346   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:37:14.388834   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:37:14.498636   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:37:15.136240   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ exit code: 0
Recording: run_job_tests
Running command: run_job_tests
E0114 17:37:15.244660   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource

+++ Running case: test-cmd.run_job_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_job_tests
+++ [0114 17:37:15] Creating namespace namespace-1579023435-27986
namespace/namespace-1579023435-27986 created
E0114 17:37:15.390233   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Context "test" modified.
+++ [0114 17:37:15] Testing job
E0114 17:37:15.499892   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
batch.sh:30: Successful get namespaces {{range.items}}{{ if eq .metadata.name \"test-jobs\" }}found{{end}}{{end}}:: :
namespace/test-jobs created
batch.sh:34: Successful get namespaces/test-jobs {{.metadata.name}}: test-jobs
kubectl run --generator=cronjob/v1beta1 is DEPRECATED and will be removed in a future version. Use kubectl run --generator=run-pod/v1 or kubectl create instead.
cronjob.batch/pi created
batch.sh:39: Successful get cronjob/pi --namespace=test-jobs {{.metadata.name}}: pi
... skipping 4 lines ...
Labels:                        run=pi
Annotations:                   <none>
Schedule:                      59 23 31 2 *
Concurrency Policy:            Allow
Suspend:                       False
Successful Job History Limit:  3
Failed Job History Limit:      1
Starting Deadline Seconds:     <unset>
Selector:                      <unset>
Parallelism:                   <unset>
Completions:                   <unset>
Pod Template:
  Labels:  run=pi
... skipping 13 lines ...
    Environment:     <none>
    Mounts:          <none>
  Volumes:           <none>
Last Schedule Time:  <unset>
Active Jobs:         <none>
Events:              <none>
E0114 17:37:16.137505   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:job.batch/test-job
has:job.batch/test-job
E0114 17:37:16.245931   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
batch.sh:48: Successful get jobs {{range.items}}{{.metadata.name}}{{end}}: 
I0114 17:37:16.375176   54630 event.go:278] Event(v1.ObjectReference{Kind:"Job", Namespace:"test-jobs", Name:"test-job", UID:"3543bc35-4909-4109-a2c1-3f8d83c67e57", APIVersion:"batch/v1", ResourceVersion:"1490", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test-job-jmfnt
job.batch/test-job created
E0114 17:37:16.391173   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
batch.sh:53: Successful get job/test-job --namespace=test-jobs {{.metadata.name}}: test-job
E0114 17:37:16.501084   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
NAME       COMPLETIONS   DURATION   AGE
test-job   0/1           0s         0s
Name:           test-job
Namespace:      test-jobs
Selector:       controller-uid=3543bc35-4909-4109-a2c1-3f8d83c67e57
Labels:         controller-uid=3543bc35-4909-4109-a2c1-3f8d83c67e57
                job-name=test-job
                run=pi
Annotations:    cronjob.kubernetes.io/instantiate: manual
Controlled By:  CronJob/pi
Parallelism:    1
Completions:    1
Start Time:     Tue, 14 Jan 2020 17:37:16 +0000
Pods Statuses:  1 Running / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  controller-uid=3543bc35-4909-4109-a2c1-3f8d83c67e57
           job-name=test-job
           run=pi
  Containers:
   pi:
... skipping 15 lines ...
  Type    Reason            Age   From            Message
  ----    ------            ----  ----            -------
  Normal  SuccessfulCreate  0s    job-controller  Created pod: test-job-jmfnt
job.batch "test-job" deleted
cronjob.batch "pi" deleted
namespace "test-jobs" deleted
E0114 17:37:17.138775   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
... skipping 12 lines ...
I0114 17:37:20.171490   54630 namespace_controller.go:185] Namespace has been deleted test-service-accounts
... skipping 7 lines ...
+++ exit code: 0
Recording: run_create_job_tests
Running command: run_create_job_tests

+++ Running case: test-cmd.run_create_job_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_create_job_tests
+++ [0114 17:37:22] Creating namespace namespace-1579023442-24176
E0114 17:37:22.145495   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/namespace-1579023442-24176 created
E0114 17:37:22.253615   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Context "test" modified.
I0114 17:37:22.364224   54630 event.go:278] Event(v1.ObjectReference{Kind:"Job", Namespace:"namespace-1579023442-24176", Name:"test-job", UID:"929b0ba4-d812-4fc2-a1be-5d26ec9623d8", APIVersion:"batch/v1", ResourceVersion:"1513", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test-job-ccjs6
job.batch/test-job created
E0114 17:37:22.399652   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
create.sh:86: Successful get job test-job {{(index .spec.template.spec.containers 0).image}}: k8s.gcr.io/nginx:test-cmd
E0114 17:37:22.508793   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
job.batch "test-job" deleted
I0114 17:37:22.668625   54630 event.go:278] Event(v1.ObjectReference{Kind:"Job", Namespace:"namespace-1579023442-24176", Name:"test-job-pi", UID:"b84997e0-4d5c-44ba-b509-456b6d0b2226", APIVersion:"batch/v1", ResourceVersion:"1520", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test-job-pi-hnbvp
job.batch/test-job-pi created
create.sh:92: Successful get job test-job-pi {{(index .spec.template.spec.containers 0).image}}: k8s.gcr.io/perl
job.batch "test-job-pi" deleted
kubectl run --generator=cronjob/v1beta1 is DEPRECATED and will be removed in a future version. Use kubectl run --generator=run-pod/v1 or kubectl create instead.
cronjob.batch/test-pi created
I0114 17:37:23.067601   54630 event.go:278] Event(v1.ObjectReference{Kind:"Job", Namespace:"namespace-1579023442-24176", Name:"my-pi", UID:"bd912064-2079-49d6-a2b6-6d858c4a2618", APIVersion:"batch/v1", ResourceVersion:"1528", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: my-pi-kl7rn
job.batch/my-pi created
E0114 17:37:23.146734   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:[perl -Mbignum=bpi -wle print bpi(10)]
has:perl -Mbignum=bpi -wle print bpi(10)
job.batch "my-pi" deleted
E0114 17:37:23.254444   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
cronjob.batch "test-pi" deleted
+++ exit code: 0
E0114 17:37:23.400927   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Recording: run_pod_templates_tests
Running command: run_pod_templates_tests

+++ Running case: test-cmd.run_pod_templates_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_pod_templates_tests
+++ [0114 17:37:23] Creating namespace namespace-1579023443-31138
E0114 17:37:23.510110   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/namespace-1579023443-31138 created
Context "test" modified.
+++ [0114 17:37:23] Testing pod templates
core.sh:1421: Successful get podtemplates {{range.items}}{{.metadata.name}}:{{end}}: 
I0114 17:37:23.889930   51184 controller.go:606] quota admission added evaluator for: podtemplates
podtemplate/nginx created
core.sh:1425: Successful get podtemplates {{range.items}}{{.metadata.name}}:{{end}}: nginx:
NAME    CONTAINERS   IMAGES   POD LABELS
nginx   nginx        nginx    name=nginx
E0114 17:37:24.147922   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:37:24.259192   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1433: Successful get podtemplates {{range.items}}{{.metadata.name}}:{{end}}: nginx:
podtemplate "nginx" deleted
E0114 17:37:24.402302   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1437: Successful get podtemplate {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 17:37:24.511300   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ exit code: 0
Recording: run_service_tests
Running command: run_service_tests

+++ Running case: test-cmd.run_service_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_service_tests
Context "test" modified.
+++ [0114 17:37:24] Testing kubectl(v1:services)
core.sh:858: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
service/redis-master created
core.sh:862: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:
E0114 17:37:25.149132   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
matched Name:
matched Labels:
matched Selector:
matched IP:
matched Port:
matched Endpoints:
... skipping 10 lines ...
IP:                10.0.0.107
Port:              <unset>  6379/TCP
TargetPort:        6379/TCP
Endpoints:         <none>
Session Affinity:  None
Events:            <none>
E0114 17:37:25.261023   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:866: Successful describe
Name:              redis-master
Namespace:         default
Labels:            app=redis
                   role=master
                   tier=backend
... skipping 19 lines ...
IP:                10.0.0.107
Port:              <unset>  6379/TCP
TargetPort:        6379/TCP
Endpoints:         <none>
Session Affinity:  None
E0114 17:37:25.403683   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:870: Successful describe
Name:              redis-master
Namespace:         default
Labels:            app=redis
                   role=master
                   tier=backend
... skipping 4 lines ...
Port:              <unset>  6379/TCP
TargetPort:        6379/TCP
Endpoints:         <none>
Session Affinity:  None
Events:            <none>
E0114 17:37:25.512425   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
matched Name:
matched Labels:
matched Selector:
matched IP:
matched Port:
matched Endpoints:
... skipping 131 lines ...
  - port: 6379
    targetPort: 6379
  selector:
    role: padawan
status:
  loadBalancer: {}
E0114 17:37:26.150555   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apiVersion: v1
kind: Service
metadata:
  creationTimestamp: "2020-01-14T17:37:24Z"
  labels:
    app: redis
... skipping 13 lines ...
  selector:
    role: padawan
  sessionAffinity: None
  type: ClusterIP
status:
  loadBalancer: {}
E0114 17:37:26.262322   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/redis-master selector updated
core.sh:890: Successful get services redis-master {{range.spec.selector}}{{.}}:{{end}}: padawan:
E0114 17:37:26.405027   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/redis-master selector updated
E0114 17:37:26.513640   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:894: Successful get services redis-master {{range.spec.selector}}{{.}}:{{end}}: redis:master:backend:
apiVersion: v1
kind: Service
metadata:
  creationTimestamp: "2020-01-14T17:37:24Z"
  labels:
... skipping 14 lines ...
  selector:
    role: padawan
  sessionAffinity: None
  type: ClusterIP
status:
  loadBalancer: {}
error: you must specify resources by --filename when --local is set.
Example resource specifications include:
   '-f rsrc.yaml'
   '--filename=rsrc.json'
core.sh:898: Successful get services redis-master {{range.spec.selector}}{{.}}:{{end}}: redis:master:backend:
I0114 17:37:27.005444   54630 namespace_controller.go:185] Namespace has been deleted test-jobs
service/redis-master selector updated
Successful
message:Error from server (Conflict): Operation cannot be fulfilled on services "redis-master": the object has been modified; please apply your changes to the latest version and try again
has:Conflict
E0114 17:37:27.151786   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:37:27.263469   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:911: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:
E0114 17:37:27.406311   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service "redis-master" deleted
E0114 17:37:27.514703   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:918: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
core.sh:922: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
(Bservice/redis-master created
core.sh:926: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:
core.sh:930: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:
E0114 17:37:28.153204   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/service-v1-test created
E0114 17:37:28.264737   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:951: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:service-v1-test:
E0114 17:37:28.407367   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:37:28.515705   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/service-v1-test replaced
core.sh:958: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:service-v1-test:
service "redis-master" deleted
service "service-v1-test" deleted
core.sh:966: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
core.sh:970: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
E0114 17:37:29.154482   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/redis-master created
E0114 17:37:29.265890   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/redis-slave created
E0114 17:37:29.408189   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:37:29.517346   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:975: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:redis-slave:
Successful
message:NAME           RSRC
kubernetes     144
redis-master   1567
redis-slave    1570
has:redis-master
core.sh:985: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:redis-slave:
service "redis-master" deleted
service "redis-slave" deleted
core.sh:992: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
core.sh:996: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
E0114 17:37:30.155736   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/beep-boop created
E0114 17:37:30.267133   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1000: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: beep-boop:kubernetes:
core.sh:1004: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: beep-boop:kubernetes:
E0114 17:37:30.409483   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service "beep-boop" deleted
E0114 17:37:30.518457   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1011: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
core.sh:1015: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
kubectl run --generator=deployment/apps.v1 is DEPRECATED and will be removed in a future version. Use kubectl run --generator=run-pod/v1 or kubectl create instead.
I0114 17:37:30.792070   54630 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"default", Name:"testmetadata", UID:"b787e6b5-62a4-4c16-a60a-8c598a30f72f", APIVersion:"apps/v1", ResourceVersion:"1584", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set testmetadata-bd968f46 to 2
I0114 17:37:30.797672   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"default", Name:"testmetadata-bd968f46", UID:"64d4fc5c-e696-468a-a656-373e214101d6", APIVersion:"apps/v1", ResourceVersion:"1585", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: testmetadata-bd968f46-drsmv
I0114 17:37:30.803156   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"default", Name:"testmetadata-bd968f46", UID:"64d4fc5c-e696-468a-a656-373e214101d6", APIVersion:"apps/v1", ResourceVersion:"1585", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: testmetadata-bd968f46-5t9k5
service/testmetadata created
deployment.apps/testmetadata created
core.sh:1019: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: testmetadata:
core.sh:1020: Successful get service testmetadata {{.metadata.annotations}}: map[zone-context:home]
service/exposemetadata exposed
E0114 17:37:31.157020   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1026: Successful get service exposemetadata {{.metadata.annotations}}: map[zone-context:work]
E0114 17:37:31.268749   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service "exposemetadata" deleted
service "testmetadata" deleted
E0114 17:37:31.410879   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps "testmetadata" deleted
+++ exit code: 0
Recording: run_daemonset_tests
Running command: run_daemonset_tests
E0114 17:37:31.519865   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource

+++ Running case: test-cmd.run_daemonset_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_daemonset_tests
+++ [0114 17:37:31] Creating namespace namespace-1579023451-28524
namespace/namespace-1579023451-28524 created
Context "test" modified.
+++ [0114 17:37:31] Testing kubectl(v1:daemonsets)
apps.sh:30: Successful get daemonsets {{range.items}}{{.metadata.name}}:{{end}}: 
I0114 17:37:32.012585   51184 controller.go:606] quota admission added evaluator for: daemonsets.apps
daemonset.apps/bind created
I0114 17:37:32.022307   51184 controller.go:606] quota admission added evaluator for: controllerrevisions.apps
apps.sh:34: Successful get daemonsets bind {{.metadata.generation}}: 1
E0114 17:37:32.158503   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:37:32.269935   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
daemonset.apps/bind configured
E0114 17:37:32.412158   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:37: Successful get daemonsets bind {{.metadata.generation}}: 1
E0114 17:37:32.521247   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
daemonset.apps/bind image updated
apps.sh:40: Successful get daemonsets bind {{.metadata.generation}}: 2
daemonset.apps/bind env updated
apps.sh:42: Successful get daemonsets bind {{.metadata.generation}}: 3
daemonset.apps/bind resource requirements updated
apps.sh:44: Successful get daemonsets bind {{.metadata.generation}}: 4
daemonset.apps/bind restarted
E0114 17:37:33.159927   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:48: Successful get daemonsets bind {{.metadata.generation}}: 5
E0114 17:37:33.271199   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
daemonset.apps "bind" deleted
+++ exit code: 0
E0114 17:37:33.413360   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Recording: run_daemonset_history_tests
Running command: run_daemonset_history_tests

+++ Running case: test-cmd.run_daemonset_history_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_daemonset_history_tests
+++ [0114 17:37:33] Creating namespace namespace-1579023453-23554
E0114 17:37:33.522449   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/namespace-1579023453-23554 created
Context "test" modified.
+++ [0114 17:37:33] Testing kubectl(v1:daemonsets, v1:controllerrevisions)
apps.sh:66: Successful get daemonsets {{range.items}}{{.metadata.name}}:{{end}}: 
daemonset.apps/bind created
apps.sh:70: Successful get controllerrevisions {{range.items}}{{.metadata.annotations}}:{{end}}: map[deprecated.daemonset.template.generation:1 kubectl.kubernetes.io/last-applied-configuration:{"apiVersion":"apps/v1","kind":"DaemonSet","metadata":{"annotations":{"kubernetes.io/change-cause":"kubectl apply --filename=hack/testdata/rollingupdate-daemonset.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true"},"labels":{"service":"bind"},"name":"bind","namespace":"namespace-1579023453-23554"},"spec":{"selector":{"matchLabels":{"service":"bind"}},"template":{"metadata":{"labels":{"service":"bind"}},"spec":{"affinity":{"podAntiAffinity":{"requiredDuringSchedulingIgnoredDuringExecution":[{"labelSelector":{"matchExpressions":[{"key":"service","operator":"In","values":["bind"]}]},"namespaces":[],"topologyKey":"kubernetes.io/hostname"}]}},"containers":[{"image":"k8s.gcr.io/pause:2.0","name":"kubernetes-pause"}]}},"updateStrategy":{"rollingUpdate":{"maxUnavailable":"10%"},"type":"RollingUpdate"}}}
 kubernetes.io/change-cause:kubectl apply --filename=hack/testdata/rollingupdate-daemonset.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true]:
daemonset.apps/bind skipped rollback (current template already matches revision 1)
E0114 17:37:34.161235   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:73: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:2.0:
E0114 17:37:34.272466   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:74: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
E0114 17:37:34.414693   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:37:34.523597   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
daemonset.apps/bind configured
apps.sh:77: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:latest:
apps.sh:78: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
apps.sh:79: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 2
apps.sh:80: Successful get controllerrevisions {{range.items}}{{.metadata.annotations}}:{{end}}: map[deprecated.daemonset.template.generation:1 kubectl.kubernetes.io/last-applied-configuration:{"apiVersion":"apps/v1","kind":"DaemonSet","metadata":{"annotations":{"kubernetes.io/change-cause":"kubectl apply --filename=hack/testdata/rollingupdate-daemonset.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true"},"labels":{"service":"bind"},"name":"bind","namespace":"namespace-1579023453-23554"},"spec":{"selector":{"matchLabels":{"service":"bind"}},"template":{"metadata":{"labels":{"service":"bind"}},"spec":{"affinity":{"podAntiAffinity":{"requiredDuringSchedulingIgnoredDuringExecution":[{"labelSelector":{"matchExpressions":[{"key":"service","operator":"In","values":["bind"]}]},"namespaces":[],"topologyKey":"kubernetes.io/hostname"}]}},"containers":[{"image":"k8s.gcr.io/pause:2.0","name":"kubernetes-pause"}]}},"updateStrategy":{"rollingUpdate":{"maxUnavailable":"10%"},"type":"RollingUpdate"}}}
 kubernetes.io/change-cause:kubectl apply --filename=hack/testdata/rollingupdate-daemonset.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true]:map[deprecated.daemonset.template.generation:2 kubectl.kubernetes.io/last-applied-configuration:{"apiVersion":"apps/v1","kind":"DaemonSet","metadata":{"annotations":{"kubernetes.io/change-cause":"kubectl apply --filename=hack/testdata/rollingupdate-daemonset-rv2.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true"},"labels":{"service":"bind"},"name":"bind","namespace":"namespace-1579023453-23554"},"spec":{"selector":{"matchLabels":{"service":"bind"}},"template":{"metadata":{"labels":{"service":"bind"}},"spec":{"affinity":{"podAntiAffinity":{"requiredDuringSchedulingIgnoredDuringExecution":[{"labelSelector":{"matchExpressions":[{"key":"service","operator":"In","values":["bind"]}]},"namespaces":[],"topologyKey":"kubernetes.io/hostname"}]}},"containers":[{"image":"k8s.gcr.io/pause:latest","name":"kubernetes-pause"},{"image":"k8s.gcr.io/nginx:test-cmd","name":"app"}]}},"updateStrategy":{"rollingUpdate":{"maxUnavailable":"10%"},"type":"RollingUpdate"}}}
... skipping 6 lines ...
    Port:	<none>
    Host Port:	<none>
    Environment:	<none>
    Mounts:	<none>
  Volumes:	<none>
 (dry run)
E0114 17:37:35.162462   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:83: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:latest:
E0114 17:37:35.273634   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:84: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
apps.sh:85: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 2
E0114 17:37:35.415824   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
daemonset.apps/bind rolled back
E0114 17:37:35.504418   54630 daemon_controller.go:291] namespace-1579023453-23554/bind failed with : error storing status for daemon set &v1.DaemonSet{TypeMeta:v1.TypeMeta{Kind:"", APIVersion:""}, ObjectMeta:v1.ObjectMeta{Name:"bind", GenerateName:"", Namespace:"namespace-1579023453-23554", SelfLink:"/apis/apps/v1/namespaces/namespace-1579023453-23554/daemonsets/bind", UID:"144582e6-1280-4544-8617-9e73e2f32ca5", ResourceVersion:"1655", Generation:3, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63714620253, loc:(*time.Location)(0x6b23a80)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"service":"bind"}, Annotations:map[string]string{"deprecated.daemonset.template.generation":"3", "kubectl.kubernetes.io/last-applied-configuration":"{\"apiVersion\":\"apps/v1\",\"kind\":\"DaemonSet\",\"metadata\":{\"annotations\":{\"kubernetes.io/change-cause\":\"kubectl apply --filename=hack/testdata/rollingupdate-daemonset-rv2.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true\"},\"labels\":{\"service\":\"bind\"},\"name\":\"bind\",\"namespace\":\"namespace-1579023453-23554\"},\"spec\":{\"selector\":{\"matchLabels\":{\"service\":\"bind\"}},\"template\":{\"metadata\":{\"labels\":{\"service\":\"bind\"}},\"spec\":{\"affinity\":{\"podAntiAffinity\":{\"requiredDuringSchedulingIgnoredDuringExecution\":[{\"labelSelector\":{\"matchExpressions\":[{\"key\":\"service\",\"operator\":\"In\",\"values\":[\"bind\"]}]},\"namespaces\":[],\"topologyKey\":\"kubernetes.io/hostname\"}]}},\"containers\":[{\"image\":\"k8s.gcr.io/pause:latest\",\"name\":\"kubernetes-pause\"},{\"image\":\"k8s.gcr.io/nginx:test-cmd\",\"name\":\"app\"}]}},\"updateStrategy\":{\"rollingUpdate\":{\"maxUnavailable\":\"10%\"},\"type\":\"RollingUpdate\"}}}\n", "kubernetes.io/change-cause":"kubectl apply --filename=hack/testdata/rollingupdate-daemonset-rv2.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true"}, 
OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry{v1.ManagedFieldsEntry{Manager:"kube-controller-manager", Operation:"Update", APIVersion:"apps/v1", Time:(*v1.Time)(0xc001542b60), FieldsType:"FieldsV1", FieldsV1:(*v1.FieldsV1)(0xc001542bc0)}, v1.ManagedFieldsEntry{Manager:"kubectl", Operation:"Update", APIVersion:"apps/v1", Time:(*v1.Time)(0xc001542be0), FieldsType:"FieldsV1", FieldsV1:(*v1.FieldsV1)(0xc001542c80)}}}, Spec:v1.DaemonSetSpec{Selector:(*v1.LabelSelector)(0xc001542cc0), Template:v1.PodTemplateSpec{ObjectMeta:v1.ObjectMeta{Name:"", GenerateName:"", Namespace:"", SelfLink:"", UID:"", ResourceVersion:"", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:0, loc:(*time.Location)(nil)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"service":"bind"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v1.PodSpec{Volumes:[]v1.Volume(nil), InitContainers:[]v1.Container(nil), Containers:[]v1.Container{v1.Container{Name:"kubernetes-pause", Image:"k8s.gcr.io/pause:2.0", Command:[]string(nil), Args:[]string(nil), WorkingDir:"", Ports:[]v1.ContainerPort(nil), EnvFrom:[]v1.EnvFromSource(nil), Env:[]v1.EnvVar(nil), Resources:v1.ResourceRequirements{Limits:v1.ResourceList(nil), Requests:v1.ResourceList(nil)}, VolumeMounts:[]v1.VolumeMount(nil), VolumeDevices:[]v1.VolumeDevice(nil), LivenessProbe:(*v1.Probe)(nil), ReadinessProbe:(*v1.Probe)(nil), StartupProbe:(*v1.Probe)(nil), Lifecycle:(*v1.Lifecycle)(nil), TerminationMessagePath:"/dev/termination-log", TerminationMessagePolicy:"File", ImagePullPolicy:"IfNotPresent", SecurityContext:(*v1.SecurityContext)(nil), Stdin:false, StdinOnce:false, TTY:false}}, EphemeralContainers:[]v1.EphemeralContainer(nil), RestartPolicy:"Always", 
TerminationGracePeriodSeconds:(*int64)(0xc00221c058), ActiveDeadlineSeconds:(*int64)(nil), DNSPolicy:"ClusterFirst", NodeSelector:map[string]string(nil), ServiceAccountName:"", DeprecatedServiceAccount:"", AutomountServiceAccountToken:(*bool)(nil), NodeName:"", HostNetwork:false, HostPID:false, HostIPC:false, ShareProcessNamespace:(*bool)(nil), SecurityContext:(*v1.PodSecurityContext)(0xc001771b00), ImagePullSecrets:[]v1.LocalObjectReference(nil), Hostname:"", Subdomain:"", Affinity:(*v1.Affinity)(0xc001542ce0), SchedulerName:"default-scheduler", Tolerations:[]v1.Toleration(nil), HostAliases:[]v1.HostAlias(nil), PriorityClassName:"", Priority:(*int32)(nil), DNSConfig:(*v1.PodDNSConfig)(nil), ReadinessGates:[]v1.PodReadinessGate(nil), RuntimeClassName:(*string)(nil), EnableServiceLinks:(*bool)(nil), PreemptionPolicy:(*v1.PreemptionPolicy)(nil), Overhead:v1.ResourceList(nil), TopologySpreadConstraints:[]v1.TopologySpreadConstraint(nil)}}, UpdateStrategy:v1.DaemonSetUpdateStrategy{Type:"RollingUpdate", RollingUpdate:(*v1.RollingUpdateDaemonSet)(0xc0000e2e48)}, MinReadySeconds:0, RevisionHistoryLimit:(*int32)(0xc00221c0ac)}, Status:v1.DaemonSetStatus{CurrentNumberScheduled:0, NumberMisscheduled:0, DesiredNumberScheduled:0, NumberReady:0, ObservedGeneration:2, UpdatedNumberScheduled:0, NumberAvailable:0, NumberUnavailable:0, CollisionCount:(*int32)(nil), Conditions:[]v1.DaemonSetCondition(nil)}}: Operation cannot be fulfilled on daemonsets.apps "bind": the object has been modified; please apply your changes to the latest version and try again
E0114 17:37:35.524861   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:88: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:2.0:
apps.sh:89: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
Successful
message:error: unable to find specified revision 1000000 in history
has:unable to find specified revision
apps.sh:93: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:2.0:
apps.sh:94: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
daemonset.apps/bind rolled back
E0114 17:37:36.158590   54630 daemon_controller.go:291] namespace-1579023453-23554/bind failed with : error storing status for daemon set &v1.DaemonSet{TypeMeta:v1.TypeMeta{Kind:"", APIVersion:""}, ObjectMeta:v1.ObjectMeta{Name:"bind", GenerateName:"", Namespace:"namespace-1579023453-23554", SelfLink:"/apis/apps/v1/namespaces/namespace-1579023453-23554/daemonsets/bind", UID:"144582e6-1280-4544-8617-9e73e2f32ca5", ResourceVersion:"1660", Generation:4, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63714620253, loc:(*time.Location)(0x6b23a80)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"service":"bind"}, Annotations:map[string]string{"deprecated.daemonset.template.generation":"4", "kubectl.kubernetes.io/last-applied-configuration":"{\"apiVersion\":\"apps/v1\",\"kind\":\"DaemonSet\",\"metadata\":{\"annotations\":{\"kubernetes.io/change-cause\":\"kubectl apply --filename=hack/testdata/rollingupdate-daemonset-rv2.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true\"},\"labels\":{\"service\":\"bind\"},\"name\":\"bind\",\"namespace\":\"namespace-1579023453-23554\"},\"spec\":{\"selector\":{\"matchLabels\":{\"service\":\"bind\"}},\"template\":{\"metadata\":{\"labels\":{\"service\":\"bind\"}},\"spec\":{\"affinity\":{\"podAntiAffinity\":{\"requiredDuringSchedulingIgnoredDuringExecution\":[{\"labelSelector\":{\"matchExpressions\":[{\"key\":\"service\",\"operator\":\"In\",\"values\":[\"bind\"]}]},\"namespaces\":[],\"topologyKey\":\"kubernetes.io/hostname\"}]}},\"containers\":[{\"image\":\"k8s.gcr.io/pause:latest\",\"name\":\"kubernetes-pause\"},{\"image\":\"k8s.gcr.io/nginx:test-cmd\",\"name\":\"app\"}]}},\"updateStrategy\":{\"rollingUpdate\":{\"maxUnavailable\":\"10%\"},\"type\":\"RollingUpdate\"}}}\n", "kubernetes.io/change-cause":"kubectl apply --filename=hack/testdata/rollingupdate-daemonset-rv2.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true"}, 
OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry{v1.ManagedFieldsEntry{Manager:"kube-controller-manager", Operation:"Update", APIVersion:"apps/v1", Time:(*v1.Time)(0xc00182e3e0), FieldsType:"FieldsV1", FieldsV1:(*v1.FieldsV1)(0xc00182e400)}, v1.ManagedFieldsEntry{Manager:"kubectl", Operation:"Update", APIVersion:"apps/v1", Time:(*v1.Time)(0xc00182e420), FieldsType:"FieldsV1", FieldsV1:(*v1.FieldsV1)(0xc00182e500)}}}, Spec:v1.DaemonSetSpec{Selector:(*v1.LabelSelector)(0xc00182e560), Template:v1.PodTemplateSpec{ObjectMeta:v1.ObjectMeta{Name:"", GenerateName:"", Namespace:"", SelfLink:"", UID:"", ResourceVersion:"", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:0, loc:(*time.Location)(nil)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"service":"bind"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v1.PodSpec{Volumes:[]v1.Volume(nil), InitContainers:[]v1.Container(nil), Containers:[]v1.Container{v1.Container{Name:"kubernetes-pause", Image:"k8s.gcr.io/pause:latest", Command:[]string(nil), Args:[]string(nil), WorkingDir:"", Ports:[]v1.ContainerPort(nil), EnvFrom:[]v1.EnvFromSource(nil), Env:[]v1.EnvVar(nil), Resources:v1.ResourceRequirements{Limits:v1.ResourceList(nil), Requests:v1.ResourceList(nil)}, VolumeMounts:[]v1.VolumeMount(nil), VolumeDevices:[]v1.VolumeDevice(nil), LivenessProbe:(*v1.Probe)(nil), ReadinessProbe:(*v1.Probe)(nil), StartupProbe:(*v1.Probe)(nil), Lifecycle:(*v1.Lifecycle)(nil), TerminationMessagePath:"/dev/termination-log", TerminationMessagePolicy:"File", ImagePullPolicy:"IfNotPresent", SecurityContext:(*v1.SecurityContext)(nil), Stdin:false, StdinOnce:false, TTY:false}, v1.Container{Name:"app", Image:"k8s.gcr.io/nginx:test-cmd", Command:[]string(nil), Args:[]string(nil), 
WorkingDir:"", Ports:[]v1.ContainerPort(nil), EnvFrom:[]v1.EnvFromSource(nil), Env:[]v1.EnvVar(nil), Resources:v1.ResourceRequirements{Limits:v1.ResourceList(nil), Requests:v1.ResourceList(nil)}, VolumeMounts:[]v1.VolumeMount(nil), VolumeDevices:[]v1.VolumeDevice(nil), LivenessProbe:(*v1.Probe)(nil), ReadinessProbe:(*v1.Probe)(nil), StartupProbe:(*v1.Probe)(nil), Lifecycle:(*v1.Lifecycle)(nil), TerminationMessagePath:"/dev/termination-log", TerminationMessagePolicy:"File", ImagePullPolicy:"IfNotPresent", SecurityContext:(*v1.SecurityContext)(nil), Stdin:false, StdinOnce:false, TTY:false}}, EphemeralContainers:[]v1.EphemeralContainer(nil), RestartPolicy:"Always", TerminationGracePeriodSeconds:(*int64)(0xc0031ba868), ActiveDeadlineSeconds:(*int64)(nil), DNSPolicy:"ClusterFirst", NodeSelector:map[string]string(nil), ServiceAccountName:"", DeprecatedServiceAccount:"", AutomountServiceAccountToken:(*bool)(nil), NodeName:"", HostNetwork:false, HostPID:false, HostIPC:false, ShareProcessNamespace:(*bool)(nil), SecurityContext:(*v1.PodSecurityContext)(0xc0023cc000), ImagePullSecrets:[]v1.LocalObjectReference(nil), Hostname:"", Subdomain:"", Affinity:(*v1.Affinity)(0xc00182e5e0), SchedulerName:"default-scheduler", Tolerations:[]v1.Toleration(nil), HostAliases:[]v1.HostAlias(nil), PriorityClassName:"", Priority:(*int32)(nil), DNSConfig:(*v1.PodDNSConfig)(nil), ReadinessGates:[]v1.PodReadinessGate(nil), RuntimeClassName:(*string)(nil), EnableServiceLinks:(*bool)(nil), PreemptionPolicy:(*v1.PreemptionPolicy)(nil), Overhead:v1.ResourceList(nil), TopologySpreadConstraints:[]v1.TopologySpreadConstraint(nil)}}, UpdateStrategy:v1.DaemonSetUpdateStrategy{Type:"RollingUpdate", RollingUpdate:(*v1.RollingUpdateDaemonSet)(0xc002036270)}, MinReadySeconds:0, RevisionHistoryLimit:(*int32)(0xc0031ba8bc)}, Status:v1.DaemonSetStatus{CurrentNumberScheduled:0, NumberMisscheduled:0, DesiredNumberScheduled:0, NumberReady:0, ObservedGeneration:3, UpdatedNumberScheduled:0, NumberAvailable:0, 
NumberUnavailable:0, CollisionCount:(*int32)(nil), Conditions:[]v1.DaemonSetCondition(nil)}}: Operation cannot be fulfilled on daemonsets.apps "bind": the object has been modified; please apply your changes to the latest version and try again
E0114 17:37:36.167966   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:97: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:latest:
E0114 17:37:36.274978   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:98: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
E0114 17:37:36.417095   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:99: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 2
E0114 17:37:36.526255   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
daemonset.apps "bind" deleted
+++ exit code: 0
Recording: run_rc_tests
Running command: run_rc_tests

+++ Running case: test-cmd.run_rc_tests
... skipping 5 lines ...
+++ [0114 17:37:36] Testing kubectl(v1:replicationcontrollers)
core.sh:1052: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
replicationcontroller/frontend created
I0114 17:37:37.107190   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579023456-17806", Name:"frontend", UID:"062bfd40-3a71-4541-9e7f-ed1709aad987", APIVersion:"v1", ResourceVersion:"1668", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-75lvt
I0114 17:37:37.117916   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579023456-17806", Name:"frontend", UID:"062bfd40-3a71-4541-9e7f-ed1709aad987", APIVersion:"v1", ResourceVersion:"1668", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-rpvnj
I0114 17:37:37.117958   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579023456-17806", Name:"frontend", UID:"062bfd40-3a71-4541-9e7f-ed1709aad987", APIVersion:"v1", ResourceVersion:"1668", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-8jqx8
E0114 17:37:37.169166   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller "frontend" deleted
E0114 17:37:37.276294   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1057: Successful get pods -l "name=frontend" {{range.items}}{{.metadata.name}}:{{end}}: 
core.sh:1061: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 17:37:37.418386   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:37:37.527635   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/frontend created
I0114 17:37:37.590000   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579023456-17806", Name:"frontend", UID:"905794ae-79e8-4015-8539-ce03384d0b32", APIVersion:"v1", ResourceVersion:"1684", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-8t97n
I0114 17:37:37.593025   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579023456-17806", Name:"frontend", UID:"905794ae-79e8-4015-8539-ce03384d0b32", APIVersion:"v1", ResourceVersion:"1684", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-z4dk9
I0114 17:37:37.602415   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579023456-17806", Name:"frontend", UID:"905794ae-79e8-4015-8539-ce03384d0b32", APIVersion:"v1", ResourceVersion:"1684", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-zj2dc
core.sh:1065: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: frontend:
matched Name:
... skipping 9 lines ...
Namespace:    namespace-1579023456-17806
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 17 lines ...
Namespace:    namespace-1579023456-17806
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 18 lines ...
Namespace:    namespace-1579023456-17806
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 4 lines ...
      memory:  100Mi
    Environment:
      GET_HOSTS_FROM:  dns
    Mounts:            <none>
  Volumes:             <none>
E0114 17:37:38.170379   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1073: Successful describe
Name:         frontend
Namespace:    namespace-1579023456-17806
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 10 lines ...
  Type    Reason            Age   From                    Message
  ----    ------            ----  ----                    -------
  Normal  SuccessfulCreate  1s    replication-controller  Created pod: frontend-8t97n
  Normal  SuccessfulCreate  1s    replication-controller  Created pod: frontend-z4dk9
  Normal  SuccessfulCreate  1s    replication-controller  Created pod: frontend-zj2dc
E0114 17:37:38.277624   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
matched Name:
matched Name:
matched Pod Template:
matched Labels:
matched Selector:
matched Replicas:
... skipping 5 lines ...
Namespace:    namespace-1579023456-17806
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 9 lines ...
Events:
  Type    Reason            Age   From                    Message
  ----    ------            ----  ----                    -------
  Normal  SuccessfulCreate  1s    replication-controller  Created pod: frontend-8t97n
  Normal  SuccessfulCreate  1s    replication-controller  Created pod: frontend-z4dk9
  Normal  SuccessfulCreate  1s    replication-controller  Created pod: frontend-zj2dc
E0114 17:37:38.419615   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful describe
Name:         frontend
Namespace:    namespace-1579023456-17806
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 9 lines ...
Events:
  Type    Reason            Age   From                    Message
  ----    ------            ----  ----                    -------
  Normal  SuccessfulCreate  1s    replication-controller  Created pod: frontend-8t97n
  Normal  SuccessfulCreate  1s    replication-controller  Created pod: frontend-z4dk9
  Normal  SuccessfulCreate  1s    replication-controller  Created pod: frontend-zj2dc
E0114 17:37:38.528691   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful describe
Name:         frontend
Namespace:    namespace-1579023456-17806
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 11 lines ...
Namespace:    namespace-1579023456-17806
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 15 lines ...
core.sh:1085: Successful get rc frontend {{.spec.replicas}}: 3
replicationcontroller/frontend scaled
E0114 17:37:38.886781   54630 replica_set.go:199] ReplicaSet has no controller: &ReplicaSet{ObjectMeta:{frontend  namespace-1579023456-17806 /api/v1/namespaces/namespace-1579023456-17806/replicationcontrollers/frontend 905794ae-79e8-4015-8539-ce03384d0b32 1695 2 2020-01-14 17:37:37 +0000 UTC <nil> <nil> map[app:guestbook tier:frontend] map[] [] []  [{kubectl Update v1 2020-01-14 17:37:37 +0000 UTC FieldsV1 FieldsV1{Raw:*[123 34 102 58 109 101 116 97 100 97 116 97 34 58 123 34 102 58 108 97 98 101 108 115 34 58 123 34 102 58 97 112 112 34 58 123 125 44 34 102 58 116 105 101 114 34 58 123 125 44 34 46 34 58 123 125 125 125 44 34 102 58 115 112 101 99 34 58 123 34 102 58 114 101 112 108 105 99 97 115 34 58 123 125 44 34 102 58 115 101 108 101 99 116 111 114 34 58 123 34 102 58 97 112 112 34 58 123 125 44 34 102 58 116 105 101 114 34 58 123 125 44 34 46 34 58 123 125 125 44 34 102 58 116 101 109 112 108 97 116 101 34 58 123 34 102 58 109 101 116 97 100 97 116 97 34 58 123 34 102 58 99 114 101 97 116 105 111 110 84 105 109 101 115 116 97 109 112 34 58 123 125 44 34 102 58 108 97 98 101 108 115 34 58 123 34 102 58 97 112 112 34 58 123 125 44 34 102 58 116 105 101 114 34 58 123 125 44 34 46 34 58 123 125 125 44 34 46 34 58 123 125 125 44 34 102 58 115 112 101 99 34 58 123 34 102 58 99 111 110 116 97 105 110 101 114 115 34 58 123 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 112 104 112 45 114 101 100 105 115 92 34 125 34 58 123 34 102 58 101 110 118 34 58 123 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 71 69 84 95 72 79 83 84 83 95 70 82 79 77 92 34 125 34 58 123 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 34 58 123 125 44 34 46 34 58 123 125 125 44 34 46 34 58 123 125 125 44 34 102 58 105 109 97 103 101 34 58 123 125 44 34 102 58 105 109 97 103 101 80 117 108 108 80 111 108 105 99 121 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 112 111 114 116 115 34 58 123 34 107 58 123 92 34 99 111 110 116 97 105 110 101 114 
80 111 114 116 92 34 58 56 48 44 92 34 112 114 111 116 111 99 111 108 92 34 58 92 34 84 67 80 92 34 125 34 58 123 34 102 58 99 111 110 116 97 105 110 101 114 80 111 114 116 34 58 123 125 44 34 102 58 112 114 111 116 111 99 111 108 34 58 123 125 44 34 46 34 58 123 125 125 44 34 46 34 58 123 125 125 44 34 102 58 114 101 115 111 117 114 99 101 115 34 58 123 34 102 58 114 101 113 117 101 115 116 115 34 58 123 34 102 58 99 112 117 34 58 123 125 44 34 102 58 109 101 109 111 114 121 34 58 123 125 44 34 46 34 58 123 125 125 44 34 46 34 58 123 125 125 44 34 102 58 116 101 114 109 105 110 97 116 105 111 110 77 101 115 115 97 103 101 80 97 116 104 34 58 123 125 44 34 102 58 116 101 114 109 105 110 97 116 105 111 110 77 101 115 115 97 103 101 80 111 108 105 99 121 34 58 123 125 44 34 46 34 58 123 125 125 44 34 46 34 58 123 125 125 44 34 102 58 100 110 115 80 111 108 105 99 121 34 58 123 125 44 34 102 58 114 101 115 116 97 114 116 80 111 108 105 99 121 34 58 123 125 44 34 102 58 115 99 104 101 100 117 108 101 114 78 97 109 101 34 58 123 125 44 34 102 58 115 101 99 117 114 105 116 121 67 111 110 116 101 120 116 34 58 123 125 44 34 102 58 116 101 114 109 105 110 97 116 105 111 110 71 114 97 99 101 80 101 114 105 111 100 83 101 99 111 110 100 115 34 58 123 125 44 34 46 34 58 123 125 125 44 34 46 34 58 123 125 125 125 125],}} {kube-controller-manager Update v1 2020-01-14 17:37:37 +0000 UTC FieldsV1 &FieldsV1{Raw:*[123 34 102 58 115 116 97 116 117 115 34 58 123 34 102 58 102 117 108 108 121 76 97 98 101 108 101 100 82 101 112 108 105 99 97 115 34 58 123 125 44 34 102 58 111 98 115 101 114 118 101 100 71 101 110 101 114 97 116 105 111 110 34 58 123 125 44 34 102 58 114 101 112 108 105 99 97 115 34 58 123 125 125 125],}}]},Spec:ReplicaSetSpec{Replicas:*2,Selector:&v1.LabelSelector{MatchLabels:map[string]string{app: guestbook,tier: frontend,},MatchExpressions:[]LabelSelectorRequirement{},},Template:{{      0 0001-01-01 00:00:00 +0000 UTC <nil> <nil> map[app:guestbook tier:frontend] 
map[] [] []  []} {[] [] [{php-redis gcr.io/google_samples/gb-frontend:v4 [] []  [{ 0 80 TCP }] [] [{GET_HOSTS_FROM dns nil}] {map[] map[cpu:{{100 -3} {<nil>} 100m DecimalSI} memory:{{104857600 0} {<nil>} 100Mi BinarySI}]} [] [] nil nil nil nil /dev/termination-log File IfNotPresent nil false false false}] [] Always 0xc002ac0df8 <nil> ClusterFirst map[]   <nil>  false false false <nil> PodSecurityContext{SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,SupplementalGroups:[],FSGroup:nil,RunAsGroup:nil,Sysctls:[]Sysctl{},WindowsOptions:nil,} []   nil default-scheduler [] []  <nil> nil [] <nil> <nil> <nil> map[] []}},MinReadySeconds:0,},Status:ReplicaSetStatus{Replicas:3,FullyLabeledReplicas:3,ObservedGeneration:1,ReadyReplicas:0,AvailableReplicas:0,Conditions:[]ReplicaSetCondition{},},}
I0114 17:37:38.892349   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579023456-17806", Name:"frontend", UID:"905794ae-79e8-4015-8539-ce03384d0b32", APIVersion:"v1", ResourceVersion:"1695", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: frontend-8t97n
core.sh:1089: Successful get rc frontend {{.spec.replicas}}: 2
core.sh:1093: Successful get rc frontend {{.spec.replicas}}: 2
E0114 17:37:39.171641   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
error: Expected replicas to be 3, was 2
E0114 17:37:39.279480   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1097: Successful get rc frontend {{.spec.replicas}}: 2
E0114 17:37:39.420886   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1101: Successful get rc frontend {{.spec.replicas}}: 2
replicationcontroller/frontend scaled
I0114 17:37:39.525095   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579023456-17806", Name:"frontend", UID:"905794ae-79e8-4015-8539-ce03384d0b32", APIVersion:"v1", ResourceVersion:"1701", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-9lzc2
E0114 17:37:39.529961   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1105: Successful get rc frontend {{.spec.replicas}}: 3
core.sh:1109: Successful get rc frontend {{.spec.replicas}}: 3
replicationcontroller/frontend scaled
E0114 17:37:39.825874   54630 replica_set.go:199] ReplicaSet has no controller: &ReplicaSet{ObjectMeta:{frontend  namespace-1579023456-17806 /api/v1/namespaces/namespace-1579023456-17806/replicationcontrollers/frontend 905794ae-79e8-4015-8539-ce03384d0b32 1706 4 2020-01-14 17:37:37 +0000 UTC <nil> <nil> map[app:guestbook tier:frontend] map[] [] []  [{kubectl Update v1 2020-01-14 17:37:37 +0000 UTC FieldsV1 FieldsV1{Raw:*[123 34 102 58 109 101 116 97 100 97 116 97 34 58 123 34 102 58 108 97 98 101 108 115 34 58 123 34 102 58 97 112 112 34 58 123 125 44 34 102 58 116 105 101 114 34 58 123 125 44 34 46 34 58 123 125 125 125 44 34 102 58 115 112 101 99 34 58 123 34 102 58 114 101 112 108 105 99 97 115 34 58 123 125 44 34 102 58 115 101 108 101 99 116 111 114 34 58 123 34 102 58 97 112 112 34 58 123 125 44 34 102 58 116 105 101 114 34 58 123 125 44 34 46 34 58 123 125 125 44 34 102 58 116 101 109 112 108 97 116 101 34 58 123 34 102 58 109 101 116 97 100 97 116 97 34 58 123 34 102 58 99 114 101 97 116 105 111 110 84 105 109 101 115 116 97 109 112 34 58 123 125 44 34 102 58 108 97 98 101 108 115 34 58 123 34 102 58 97 112 112 34 58 123 125 44 34 102 58 116 105 101 114 34 58 123 125 44 34 46 34 58 123 125 125 44 34 46 34 58 123 125 125 44 34 102 58 115 112 101 99 34 58 123 34 102 58 99 111 110 116 97 105 110 101 114 115 34 58 123 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 112 104 112 45 114 101 100 105 115 92 34 125 34 58 123 34 102 58 101 110 118 34 58 123 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 71 69 84 95 72 79 83 84 83 95 70 82 79 77 92 34 125 34 58 123 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 34 58 123 125 44 34 46 34 58 123 125 125 44 34 46 34 58 123 125 125 44 34 102 58 105 109 97 103 101 34 58 123 125 44 34 102 58 105 109 97 103 101 80 117 108 108 80 111 108 105 99 121 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 112 111 114 116 115 34 58 123 34 107 58 123 92 34 99 111 110 116 97 105 110 101 114 
80 111 114 116 92 34 58 56 48 44 92 34 112 114 111 116 111 99 111 108 92 34 58 92 34 84 67 80 92 34 125 34 58 123 34 102 58 99 111 110 116 97 105 110 101 114 80 111 114 116 34 58 123 125 44 34 102 58 112 114 111 116 111 99 111 108 34 58 123 125 44 34 46 34 58 123 125 125 44 34 46 34 58 123 125 125 44 34 102 58 114 101 115 111 117 114 99 101 115 34 58 123 34 102 58 114 101 113 117 101 115 116 115 34 58 123 34 102 58 99 112 117 34 58 123 125 44 34 102 58 109 101 109 111 114 121 34 58 123 125 44 34 46 34 58 123 125 125 44 34 46 34 58 123 125 125 44 34 102 58 116 101 114 109 105 110 97 116 105 111 110 77 101 115 115 97 103 101 80 97 116 104 34 58 123 125 44 34 102 58 116 101 114 109 105 110 97 116 105 111 110 77 101 115 115 97 103 101 80 111 108 105 99 121 34 58 123 125 44 34 46 34 58 123 125 125 44 34 46 34 58 123 125 125 44 34 102 58 100 110 115 80 111 108 105 99 121 34 58 123 125 44 34 102 58 114 101 115 116 97 114 116 80 111 108 105 99 121 34 58 123 125 44 34 102 58 115 99 104 101 100 117 108 101 114 78 97 109 101 34 58 123 125 44 34 102 58 115 101 99 117 114 105 116 121 67 111 110 116 101 120 116 34 58 123 125 44 34 102 58 116 101 114 109 105 110 97 116 105 111 110 71 114 97 99 101 80 101 114 105 111 100 83 101 99 111 110 100 115 34 58 123 125 44 34 46 34 58 123 125 125 44 34 46 34 58 123 125 125 125 125],}} {kube-controller-manager Update v1 2020-01-14 17:37:39 +0000 UTC FieldsV1 &FieldsV1{Raw:*[123 34 102 58 115 116 97 116 117 115 34 58 123 34 102 58 102 117 108 108 121 76 97 98 101 108 101 100 82 101 112 108 105 99 97 115 34 58 123 125 44 34 102 58 111 98 115 101 114 118 101 100 71 101 110 101 114 97 116 105 111 110 34 58 123 125 44 34 102 58 114 101 112 108 105 99 97 115 34 58 123 125 125 125],}}]},Spec:ReplicaSetSpec{Replicas:*2,Selector:&v1.LabelSelector{MatchLabels:map[string]string{app: guestbook,tier: frontend,},MatchExpressions:[]LabelSelectorRequirement{},},Template:{{      0 0001-01-01 00:00:00 +0000 UTC <nil> <nil> map[app:guestbook tier:frontend] 
map[] [] []  []} {[] [] [{php-redis gcr.io/google_samples/gb-frontend:v4 [] []  [{ 0 80 TCP }] [] [{GET_HOSTS_FROM dns nil}] {map[] map[cpu:{{100 -3} {<nil>} 100m DecimalSI} memory:{{104857600 0} {<nil>} 100Mi BinarySI}]} [] [] nil nil nil nil /dev/termination-log File IfNotPresent nil false false false}] [] Always 0xc002aeccf8 <nil> ClusterFirst map[]   <nil>  false false false <nil> PodSecurityContext{SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,SupplementalGroups:[],FSGroup:nil,RunAsGroup:nil,Sysctls:[]Sysctl{},WindowsOptions:nil,} []   nil default-scheduler [] []  <nil> nil [] <nil> <nil> <nil> map[] []}},MinReadySeconds:0,},Status:ReplicaSetStatus{Replicas:3,FullyLabeledReplicas:3,ObservedGeneration:3,ReadyReplicas:0,AvailableReplicas:0,Conditions:[]ReplicaSetCondition{},},}
I0114 17:37:39.832309   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579023456-17806", Name:"frontend", UID:"905794ae-79e8-4015-8539-ce03384d0b32", APIVersion:"v1", ResourceVersion:"1706", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: frontend-9lzc2
core.sh:1113: Successful get rc frontend {{.spec.replicas}}: 2
replicationcontroller "frontend" deleted
E0114 17:37:40.172874   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/redis-master created
I0114 17:37:40.249912   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579023456-17806", Name:"redis-master", UID:"cad1a742-d3fa-44a4-9cb2-76c4d5a5334a", APIVersion:"v1", ResourceVersion:"1719", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-master-pvnts
E0114 17:37:40.280773   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:37:40.422224   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/redis-slave created
I0114 17:37:40.451989   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579023456-17806", Name:"redis-slave", UID:"fe54f811-2183-4983-a8aa-c1944627447c", APIVersion:"v1", ResourceVersion:"1724", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-slave-m5845
I0114 17:37:40.455516   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579023456-17806", Name:"redis-slave", UID:"fe54f811-2183-4983-a8aa-c1944627447c", APIVersion:"v1", ResourceVersion:"1724", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-slave-ldlb5
E0114 17:37:40.531137   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/redis-master scaled
I0114 17:37:40.559938   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579023456-17806", Name:"redis-master", UID:"cad1a742-d3fa-44a4-9cb2-76c4d5a5334a", APIVersion:"v1", ResourceVersion:"1731", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-master-v88bk
replicationcontroller/redis-slave scaled
I0114 17:37:40.563117   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579023456-17806", Name:"redis-master", UID:"cad1a742-d3fa-44a4-9cb2-76c4d5a5334a", APIVersion:"v1", ResourceVersion:"1731", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-master-rpkt8
I0114 17:37:40.563157   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579023456-17806", Name:"redis-slave", UID:"fe54f811-2183-4983-a8aa-c1944627447c", APIVersion:"v1", ResourceVersion:"1733", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-slave-7lgdr
I0114 17:37:40.565453   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579023456-17806", Name:"redis-master", UID:"cad1a742-d3fa-44a4-9cb2-76c4d5a5334a", APIVersion:"v1", ResourceVersion:"1731", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-master-wr2ht
... skipping 4 lines ...
replicationcontroller "redis-slave" deleted
deployment.apps/nginx-deployment created
I0114 17:37:41.079637   54630 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579023456-17806", Name:"nginx-deployment", UID:"c1bdd15a-ab64-4fcd-9a0f-1909d06ba555", APIVersion:"apps/v1", ResourceVersion:"1749", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-6986c7bc94 to 3
I0114 17:37:41.083768   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023456-17806", Name:"nginx-deployment-6986c7bc94", UID:"8ee620a0-ac34-46e6-9e66-4d99a4a33b8a", APIVersion:"apps/v1", ResourceVersion:"1750", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-skrbt
I0114 17:37:41.086534   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023456-17806", Name:"nginx-deployment-6986c7bc94", UID:"8ee620a0-ac34-46e6-9e66-4d99a4a33b8a", APIVersion:"apps/v1", ResourceVersion:"1750", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-wvtkz
I0114 17:37:41.088259   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023456-17806", Name:"nginx-deployment-6986c7bc94", UID:"8ee620a0-ac34-46e6-9e66-4d99a4a33b8a", APIVersion:"apps/v1", ResourceVersion:"1750", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-pr2ww
E0114 17:37:41.176533   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-deployment scaled
I0114 17:37:41.204552   54630 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579023456-17806", Name:"nginx-deployment", UID:"c1bdd15a-ab64-4fcd-9a0f-1909d06ba555", APIVersion:"apps/v1", ResourceVersion:"1779", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-6986c7bc94 to 1
I0114 17:37:41.216720   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023456-17806", Name:"nginx-deployment-6986c7bc94", UID:"8ee620a0-ac34-46e6-9e66-4d99a4a33b8a", APIVersion:"apps/v1", ResourceVersion:"1780", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-6986c7bc94-pr2ww
I0114 17:37:41.217162   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023456-17806", Name:"nginx-deployment-6986c7bc94", UID:"8ee620a0-ac34-46e6-9e66-4d99a4a33b8a", APIVersion:"apps/v1", ResourceVersion:"1780", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-6986c7bc94-wvtkz
E0114 17:37:41.282091   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1133: Successful get deployment nginx-deployment {{.spec.replicas}}: 1
deployment.apps "nginx-deployment" deleted
E0114 17:37:41.423182   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:service/expose-test-deployment exposed
has:service/expose-test-deployment exposed
E0114 17:37:41.532380   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service "expose-test-deployment" deleted
Successful
message:error: couldn't retrieve selectors via --selector flag or introspection: invalid deployment: no selectors, therefore cannot be exposed
See 'kubectl expose -h' for help and examples
has:invalid deployment: no selectors
deployment.apps/nginx-deployment created
I0114 17:37:41.900868   54630 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579023456-17806", Name:"nginx-deployment", UID:"c0b42aeb-6423-4ad6-9df2-30f6c7244c53", APIVersion:"apps/v1", ResourceVersion:"1805", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-6986c7bc94 to 3
I0114 17:37:41.904014   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023456-17806", Name:"nginx-deployment-6986c7bc94", UID:"2961520e-0791-4232-b9f5-a1b7c67a653c", APIVersion:"apps/v1", ResourceVersion:"1806", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-txjl2
I0114 17:37:41.907609   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023456-17806", Name:"nginx-deployment-6986c7bc94", UID:"2961520e-0791-4232-b9f5-a1b7c67a653c", APIVersion:"apps/v1", ResourceVersion:"1806", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-4wssc
I0114 17:37:41.936944   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023456-17806", Name:"nginx-deployment-6986c7bc94", UID:"2961520e-0791-4232-b9f5-a1b7c67a653c", APIVersion:"apps/v1", ResourceVersion:"1806", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-x7r9z
core.sh:1152: Successful get deployment nginx-deployment {{.spec.replicas}}: 3
service/nginx-deployment exposed
E0114 17:37:42.177687   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1156: Successful get service nginx-deployment {{(index .spec.ports 0).port}}: 80
E0114 17:37:42.283329   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps "nginx-deployment" deleted
service "nginx-deployment" deleted
E0114 17:37:42.424412   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/frontend created
I0114 17:37:42.527735   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579023456-17806", Name:"frontend", UID:"0e424a68-d11e-4e8f-bd67-ceba432f7add", APIVersion:"v1", ResourceVersion:"1833", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-45dqm
I0114 17:37:42.532052   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579023456-17806", Name:"frontend", UID:"0e424a68-d11e-4e8f-bd67-ceba432f7add", APIVersion:"v1", ResourceVersion:"1833", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-czl9h
I0114 17:37:42.532104   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579023456-17806", Name:"frontend", UID:"0e424a68-d11e-4e8f-bd67-ceba432f7add", APIVersion:"v1", ResourceVersion:"1833", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-tqvbk
E0114 17:37:42.533517   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1163: Successful get rc frontend {{.spec.replicas}}: 3
service/frontend exposed
core.sh:1167: Successful get service frontend {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: <no value> 80
service/frontend-2 exposed
core.sh:1171: Successful get service frontend-2 {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: <no value> 443
E0114 17:37:43.179004   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
pod/valid-pod created
E0114 17:37:43.284592   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/frontend-3 exposed
E0114 17:37:43.425553   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1176: Successful get service frontend-3 {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: <no value> 444
service/frontend-4 exposed
E0114 17:37:43.534309   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1180: Successful get service frontend-4 {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: default 80
service/frontend-5 exposed
core.sh:1184: Successful get service frontend-5 {{(index .spec.ports 0).port}}: 80
pod "valid-pod" deleted
service "frontend" deleted
service "frontend-2" deleted
service "frontend-3" deleted
service "frontend-4" deleted
service "frontend-5" deleted
Successful
message:error: cannot expose a Node
has:cannot expose
E0114 17:37:44.180425   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:The Service "invalid-large-service-name-that-has-more-than-sixty-three-characters" is invalid: metadata.name: Invalid value: "invalid-large-service-name-that-has-more-than-sixty-three-characters": must be no more than 63 characters
has:metadata.name: Invalid value
E0114 17:37:44.285832   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:service/kubernetes-serve-hostname-testing-sixty-three-characters-in-len exposed
has:kubernetes-serve-hostname-testing-sixty-three-characters-in-len exposed
E0114 17:37:44.426873   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service "kubernetes-serve-hostname-testing-sixty-three-characters-in-len" deleted
E0114 17:37:44.536001   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:service/etcd-server exposed
has:etcd-server exposed
core.sh:1214: Successful get service etcd-server {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: port-1 2380
core.sh:1215: Successful get service etcd-server {{(index .spec.ports 1).name}} {{(index .spec.ports 1).port}}: port-2 2379
service "etcd-server" deleted
core.sh:1221: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: frontend:
replicationcontroller "frontend" deleted
core.sh:1225: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 17:37:45.181994   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1229: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 17:37:45.286899   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:37:45.428191   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/frontend created
I0114 17:37:45.441169   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579023456-17806", Name:"frontend", UID:"b603d180-5a09-4994-b4bf-2965cbc4b42e", APIVersion:"v1", ResourceVersion:"1897", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-bwwqz
I0114 17:37:45.443966   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579023456-17806", Name:"frontend", UID:"b603d180-5a09-4994-b4bf-2965cbc4b42e", APIVersion:"v1", ResourceVersion:"1897", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-r44xs
I0114 17:37:45.445975   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579023456-17806", Name:"frontend", UID:"b603d180-5a09-4994-b4bf-2965cbc4b42e", APIVersion:"v1", ResourceVersion:"1897", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-jtvzv
E0114 17:37:45.537083   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/redis-slave created
I0114 17:37:45.633688   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579023456-17806", Name:"redis-slave", UID:"dc786916-0489-4aa0-bc7b-0ad4ca1333b6", APIVersion:"v1", ResourceVersion:"1906", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-slave-2frgk
I0114 17:37:45.639174   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579023456-17806", Name:"redis-slave", UID:"dc786916-0489-4aa0-bc7b-0ad4ca1333b6", APIVersion:"v1", ResourceVersion:"1906", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-slave-25crb
core.sh:1234: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: frontend:redis-slave:
core.sh:1238: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: frontend:redis-slave:
replicationcontroller "frontend" deleted
replicationcontroller "redis-slave" deleted
core.sh:1242: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
core.sh:1246: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 17:37:46.183308   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:37:46.288350   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/frontend created
I0114 17:37:46.338960   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579023456-17806", Name:"frontend", UID:"b15e5c33-dfde-403a-ae90-7e7e25c5a98b", APIVersion:"v1", ResourceVersion:"1927", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-nqdds
I0114 17:37:46.342415   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579023456-17806", Name:"frontend", UID:"b15e5c33-dfde-403a-ae90-7e7e25c5a98b", APIVersion:"v1", ResourceVersion:"1927", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-jqx9c
I0114 17:37:46.343550   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579023456-17806", Name:"frontend", UID:"b15e5c33-dfde-403a-ae90-7e7e25c5a98b", APIVersion:"v1", ResourceVersion:"1927", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-ks5hz
E0114 17:37:46.429528   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1249: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: frontend:
horizontalpodautoscaler.autoscaling/frontend autoscaled
E0114 17:37:46.538273   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1252: Successful get hpa frontend {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 1 2 70
horizontalpodautoscaler.autoscaling "frontend" deleted
horizontalpodautoscaler.autoscaling/frontend autoscaled
core.sh:1256: Successful get hpa frontend {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 2 3 80
horizontalpodautoscaler.autoscaling "frontend" deleted
Error: required flag(s) "max" not set
replicationcontroller "frontend" deleted
E0114 17:37:47.184495   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1265: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 17:37:47.289675   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apiVersion: apps/v1
kind: Deployment
metadata:
  creationTimestamp: null
  labels:
    name: nginx-deployment-resources
... skipping 22 lines ...
          limits:
            cpu: 300m
          requests:
            cpu: 300m
      terminationGracePeriodSeconds: 0
status: {}
Error from server (NotFound): deployments.apps "nginx-deployment-resources" not found
E0114 17:37:47.430753   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:37:47.539610   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-deployment-resources created
I0114 17:37:47.604549   54630 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579023456-17806", Name:"nginx-deployment-resources", UID:"bffae1ea-eee8-4823-bfbe-a9c07023f1d3", APIVersion:"apps/v1", ResourceVersion:"1947", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-resources-67f8cfff5 to 3
I0114 17:37:47.608593   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023456-17806", Name:"nginx-deployment-resources-67f8cfff5", UID:"ccf6d0a7-670c-4dca-9168-f6e4d07ac02e", APIVersion:"apps/v1", ResourceVersion:"1948", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-67f8cfff5-qwxgp
I0114 17:37:47.610725   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023456-17806", Name:"nginx-deployment-resources-67f8cfff5", UID:"ccf6d0a7-670c-4dca-9168-f6e4d07ac02e", APIVersion:"apps/v1", ResourceVersion:"1948", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-67f8cfff5-lsx87
I0114 17:37:47.615483   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023456-17806", Name:"nginx-deployment-resources-67f8cfff5", UID:"ccf6d0a7-670c-4dca-9168-f6e4d07ac02e", APIVersion:"apps/v1", ResourceVersion:"1948", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-67f8cfff5-n8699
core.sh:1271: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx-deployment-resources:
core.sh:1272: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
core.sh:1273: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
deployment.apps/nginx-deployment-resources resource requirements updated
I0114 17:37:48.017656   54630 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579023456-17806", Name:"nginx-deployment-resources", UID:"bffae1ea-eee8-4823-bfbe-a9c07023f1d3", APIVersion:"apps/v1", ResourceVersion:"1963", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-resources-55c547f795 to 1
I0114 17:37:48.021747   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023456-17806", Name:"nginx-deployment-resources-55c547f795", UID:"edea2840-9c52-490b-8583-6c1e1d1535dd", APIVersion:"apps/v1", ResourceVersion:"1964", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-55c547f795-k97m9
core.sh:1276: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).resources.limits.cpu}}:{{end}}: 100m:
E0114 17:37:48.185804   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1277: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.limits.cpu}}:{{end}}: 100m:
E0114 17:37:48.291038   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
error: unable to find container named redis
deployment.apps/nginx-deployment-resources resource requirements updated
I0114 17:37:48.408402   54630 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579023456-17806", Name:"nginx-deployment-resources", UID:"bffae1ea-eee8-4823-bfbe-a9c07023f1d3", APIVersion:"apps/v1", ResourceVersion:"1973", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-resources-67f8cfff5 to 2
I0114 17:37:48.414908   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023456-17806", Name:"nginx-deployment-resources-67f8cfff5", UID:"ccf6d0a7-670c-4dca-9168-f6e4d07ac02e", APIVersion:"apps/v1", ResourceVersion:"1977", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-resources-67f8cfff5-qwxgp
I0114 17:37:48.416119   54630 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579023456-17806", Name:"nginx-deployment-resources", UID:"bffae1ea-eee8-4823-bfbe-a9c07023f1d3", APIVersion:"apps/v1", ResourceVersion:"1975", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-resources-6d86564b45 to 1
I0114 17:37:48.421370   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023456-17806", Name:"nginx-deployment-resources-6d86564b45", UID:"9aa0121e-92eb-4661-90b2-a4c1088533b3", APIVersion:"apps/v1", ResourceVersion:"1981", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-6d86564b45-zz7mp
E0114 17:37:48.432951   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1282: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).resources.limits.cpu}}:{{end}}: 200m:
E0114 17:37:48.540771   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1283: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.limits.cpu}}:{{end}}: 100m:
deployment.apps/nginx-deployment-resources resource requirements updated
I0114 17:37:48.712313   54630 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579023456-17806", Name:"nginx-deployment-resources", UID:"bffae1ea-eee8-4823-bfbe-a9c07023f1d3", APIVersion:"apps/v1", ResourceVersion:"1994", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-resources-67f8cfff5 to 1
I0114 17:37:48.718032   54630 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579023456-17806", Name:"nginx-deployment-resources", UID:"bffae1ea-eee8-4823-bfbe-a9c07023f1d3", APIVersion:"apps/v1", ResourceVersion:"1996", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-resources-6c478d4fdb to 1
I0114 17:37:48.719615   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023456-17806", Name:"nginx-deployment-resources-67f8cfff5", UID:"ccf6d0a7-670c-4dca-9168-f6e4d07ac02e", APIVersion:"apps/v1", ResourceVersion:"1998", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-resources-67f8cfff5-lsx87
I0114 17:37:48.722862   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023456-17806", Name:"nginx-deployment-resources-6c478d4fdb", UID:"7e03787e-c2d3-4524-8a1a-a92e4adaafe3", APIVersion:"apps/v1", ResourceVersion:"2001", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-6c478d4fdb-8kt45
... skipping 74 lines ...
    status: "True"
    type: Progressing
  observedGeneration: 4
  replicas: 4
  unavailableReplicas: 4
  updatedReplicas: 1
error: you must specify resources by --filename when --local is set.
Example resource specifications include:
   '-f rsrc.yaml'
   '--filename=rsrc.json'
E0114 17:37:49.187215   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1292: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).resources.limits.cpu}}:{{end}}: 200m:
E0114 17:37:49.292304   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1293: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.limits.cpu}}:{{end}}: 300m:
E0114 17:37:49.434446   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1294: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.requests.cpu}}:{{end}}: 300m:
E0114 17:37:49.542041   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps "nginx-deployment-resources" deleted
+++ exit code: 0
Recording: run_deployment_tests
Running command: run_deployment_tests

+++ Running case: test-cmd.run_deployment_tests 
... skipping 4 lines ...
Context "test" modified.
+++ [0114 17:37:49] Testing deployments
deployment.apps/test-nginx-extensions created
I0114 17:37:50.015089   54630 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579023469-30169", Name:"test-nginx-extensions", UID:"92622058-9aa6-4a62-a11a-1d0299bc34e4", APIVersion:"apps/v1", ResourceVersion:"2031", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set test-nginx-extensions-5559c76db7 to 1
I0114 17:37:50.021784   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023469-30169", Name:"test-nginx-extensions-5559c76db7", UID:"0b4a334d-3454-4fe4-96aa-c8229934b0b0", APIVersion:"apps/v1", ResourceVersion:"2032", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test-nginx-extensions-5559c76db7-rznxq
apps.sh:185: Successful get deploy test-nginx-extensions {{(index .spec.template.spec.containers 0).name}}: nginx
E0114 17:37:50.188481   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:10
has not:2
E0114 17:37:50.293633   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:apps/v1
has:apps/v1
deployment.apps "test-nginx-extensions" deleted
E0114 17:37:50.435613   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/test-nginx-apps created
I0114 17:37:50.511774   54630 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579023469-30169", Name:"test-nginx-apps", UID:"2ce870bb-680b-4e2e-876c-b3d3116e5ce9", APIVersion:"apps/v1", ResourceVersion:"2045", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set test-nginx-apps-79b9bd9585 to 1
I0114 17:37:50.517132   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023469-30169", Name:"test-nginx-apps-79b9bd9585", UID:"c977b4dd-0dc2-41a9-b884-f2d2d0563723", APIVersion:"apps/v1", ResourceVersion:"2046", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test-nginx-apps-79b9bd9585-rbjck
E0114 17:37:50.543047   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:198: Successful get deploy test-nginx-apps {{(index .spec.template.spec.containers 0).name}}: nginx
Successful
message:10
has:10
Successful
message:apps/v1
... skipping 14 lines ...
                pod-template-hash=79b9bd9585
Annotations:    deployment.kubernetes.io/desired-replicas: 1
                deployment.kubernetes.io/max-replicas: 2
                deployment.kubernetes.io/revision: 1
Controlled By:  Deployment/test-nginx-apps
Replicas:       1 current / 1 desired
Pods Status:    0 Running / 1 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=test-nginx-apps
           pod-template-hash=79b9bd9585
  Containers:
   nginx:
    Image:        k8s.gcr.io/nginx:test-cmd
... skipping 34 lines ...
Volumes:          <none>
QoS Class:        BestEffort
Node-Selectors:   <none>
Tolerations:      <none>
Events:           <none>
deployment.apps "test-nginx-apps" deleted
E0114 17:37:51.189561   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:214: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 17:37:51.294816   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-with-command created
I0114 17:37:51.374566   54630 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579023469-30169", Name:"nginx-with-command", UID:"f50efcf5-9fc3-4b91-9803-d371853608cd", APIVersion:"apps/v1", ResourceVersion:"2059", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-with-command-757c6f58dd to 1
I0114 17:37:51.377448   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023469-30169", Name:"nginx-with-command-757c6f58dd", UID:"9c481f98-113f-4331-9140-cb03eec2af3f", APIVersion:"apps/v1", ResourceVersion:"2060", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-with-command-757c6f58dd-zvdgl
E0114 17:37:51.436829   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:218: Successful get deploy nginx-with-command {{(index .spec.template.spec.containers 0).name}}: nginx
E0114 17:37:51.544268   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps "nginx-with-command" deleted
apps.sh:224: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
deployment.apps/deployment-with-unixuserid created
I0114 17:37:51.867429   54630 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579023469-30169", Name:"deployment-with-unixuserid", UID:"70891969-37ac-4109-9cac-dc9c89fdf401", APIVersion:"apps/v1", ResourceVersion:"2073", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set deployment-with-unixuserid-8fcdfc94f to 1
I0114 17:37:51.870367   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023469-30169", Name:"deployment-with-unixuserid-8fcdfc94f", UID:"a168636e-6aa2-418d-b7d6-275699a8178d", APIVersion:"apps/v1", ResourceVersion:"2074", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: deployment-with-unixuserid-8fcdfc94f-tpwfz
apps.sh:228: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: deployment-with-unixuserid:
deployment.apps "deployment-with-unixuserid" deleted
apps.sh:235: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 17:37:52.190905   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:37:52.296022   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-deployment created
I0114 17:37:52.370938   54630 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579023469-30169", Name:"nginx-deployment", UID:"822b761a-a498-4c7d-a8e7-27d51989816f", APIVersion:"apps/v1", ResourceVersion:"2089", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-6986c7bc94 to 3
I0114 17:37:52.376306   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023469-30169", Name:"nginx-deployment-6986c7bc94", UID:"86b5db03-d5d1-47ce-a511-0af628c4a9a8", APIVersion:"apps/v1", ResourceVersion:"2090", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-vzf7c
I0114 17:37:52.381438   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023469-30169", Name:"nginx-deployment-6986c7bc94", UID:"86b5db03-d5d1-47ce-a511-0af628c4a9a8", APIVersion:"apps/v1", ResourceVersion:"2090", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-q24v7
I0114 17:37:52.381489   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023469-30169", Name:"nginx-deployment-6986c7bc94", UID:"86b5db03-d5d1-47ce-a511-0af628c4a9a8", APIVersion:"apps/v1", ResourceVersion:"2090", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-vrbzj
E0114 17:37:52.438263   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:239: Successful get rs {{range.items}}{{.spec.replicas}}{{end}}: 3
E0114 17:37:52.545457   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps "nginx-deployment" deleted
apps.sh:242: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
apps.sh:246: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
apps.sh:247: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
deployment.apps/nginx-deployment created
I0114 17:37:52.966071   54630 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579023469-30169", Name:"nginx-deployment", UID:"c2b1a053-9c40-409d-8109-30e9ac7229d7", APIVersion:"apps/v1", ResourceVersion:"2111", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-7f6fc565b9 to 1
I0114 17:37:52.971985   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023469-30169", Name:"nginx-deployment-7f6fc565b9", UID:"40f75a65-a163-43c1-911a-f943e395b9ce", APIVersion:"apps/v1", ResourceVersion:"2112", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-7f6fc565b9-xsrsk
apps.sh:251: Successful get rs {{range.items}}{{.spec.replicas}}{{end}}: 1
deployment.apps "nginx-deployment" deleted
E0114 17:37:53.191934   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:256: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 17:37:53.297395   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:257: Successful get rs {{range.items}}{{.spec.replicas}}{{end}}: 1
E0114 17:37:53.439592   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:37:53.546891   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicaset.apps "nginx-deployment-7f6fc565b9" deleted
apps.sh:265: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
deployment.apps/nginx-deployment created
I0114 17:37:53.864935   54630 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579023469-30169", Name:"nginx-deployment", UID:"944800b2-1e7e-4875-a9b2-c4b8e8b6be2f", APIVersion:"apps/v1", ResourceVersion:"2130", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-6986c7bc94 to 3
I0114 17:37:53.869168   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023469-30169", Name:"nginx-deployment-6986c7bc94", UID:"8c720c59-8744-45a4-8a82-c8f3ab7bddfe", APIVersion:"apps/v1", ResourceVersion:"2131", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-gpn2z
I0114 17:37:53.872763   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023469-30169", Name:"nginx-deployment-6986c7bc94", UID:"8c720c59-8744-45a4-8a82-c8f3ab7bddfe", APIVersion:"apps/v1", ResourceVersion:"2131", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-2wxlp
I0114 17:37:53.873402   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023469-30169", Name:"nginx-deployment-6986c7bc94", UID:"8c720c59-8744-45a4-8a82-c8f3ab7bddfe", APIVersion:"apps/v1", ResourceVersion:"2131", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-pcgzv
apps.sh:268: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx-deployment:
horizontalpodautoscaler.autoscaling/nginx-deployment autoscaled
E0114 17:37:54.193157   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:271: Successful get hpa nginx-deployment {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 2 3 80
horizontalpodautoscaler.autoscaling "nginx-deployment" deleted
E0114 17:37:54.298765   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps "nginx-deployment" deleted
E0114 17:37:54.440771   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:279: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 17:37:54.548066   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx created
I0114 17:37:54.677148   54630 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579023469-30169", Name:"nginx", UID:"e4367db5-77b4-40ed-a133-e3d6676355c6", APIVersion:"apps/v1", ResourceVersion:"2156", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-f87d999f7 to 3
I0114 17:37:54.682719   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023469-30169", Name:"nginx-f87d999f7", UID:"37846eff-2f98-4c6d-9bd9-a11e79bea30a", APIVersion:"apps/v1", ResourceVersion:"2157", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-f87d999f7-96hnr
I0114 17:37:54.684694   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023469-30169", Name:"nginx-f87d999f7", UID:"37846eff-2f98-4c6d-9bd9-a11e79bea30a", APIVersion:"apps/v1", ResourceVersion:"2157", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-f87d999f7-ltjzc
I0114 17:37:54.687358   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023469-30169", Name:"nginx-f87d999f7", UID:"37846eff-2f98-4c6d-9bd9-a11e79bea30a", APIVersion:"apps/v1", ResourceVersion:"2157", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-f87d999f7-7s7wv
apps.sh:283: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx:
apps.sh:284: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
deployment.apps/nginx skipped rollback (current template already matches revision 1)
apps.sh:287: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
E0114 17:37:55.194453   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Warning: kubectl apply should be used on resource created by either kubectl create --save-config or kubectl apply
deployment.apps/nginx configured
I0114 17:37:55.295680   54630 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579023469-30169", Name:"nginx", UID:"e4367db5-77b4-40ed-a133-e3d6676355c6", APIVersion:"apps/v1", ResourceVersion:"2170", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-78487f9fd7 to 1
I0114 17:37:55.300311   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023469-30169", Name:"nginx-78487f9fd7", UID:"6000293b-dea4-437c-b3ab-5b6812c50ea2", APIVersion:"apps/v1", ResourceVersion:"2171", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-78487f9fd7-w59pw
E0114 17:37:55.300726   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:290: Successful get deployment.apps {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
E0114 17:37:55.442056   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
    Image:	k8s.gcr.io/nginx:test-cmd
E0114 17:37:55.549297   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:293: Successful get deployment.apps {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
deployment.apps/nginx rolled back
E0114 17:37:56.195701   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:37:56.302126   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:37:56.443401   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:37:56.550669   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:297: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
error: unable to find specified revision 1000000 in history
apps.sh:300: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
deployment.apps/nginx rolled back
E0114 17:37:57.196931   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:37:57.303408   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:37:57.444822   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:37:57.551920   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:37:58.198305   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:304: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
E0114 17:37:58.304615   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx paused
E0114 17:37:58.446012   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
error: you cannot rollback a paused deployment; resume it first with 'kubectl rollout resume deployment/nginx' and try again
E0114 17:37:58.553034   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
error: deployments.apps "nginx" can't restart paused deployment (run rollout resume first)
deployment.apps/nginx resumed
deployment.apps/nginx rolled back
    deployment.kubernetes.io/revision-history: 1,3
error: desired revision (3) is different from the running revision (5)
E0114 17:37:59.199613   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx restarted
E0114 17:37:59.305623   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0114 17:37:59.310541   54630 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579023469-30169", Name:"nginx", UID:"e4367db5-77b4-40ed-a133-e3d6676355c6", APIVersion:"apps/v1", ResourceVersion:"2202", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-f87d999f7 to 2
I0114 17:37:59.318057   54630 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579023469-30169", Name:"nginx", UID:"e4367db5-77b4-40ed-a133-e3d6676355c6", APIVersion:"apps/v1", ResourceVersion:"2204", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-5c4bc6cc8b to 1
I0114 17:37:59.318962   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023469-30169", Name:"nginx-f87d999f7", UID:"37846eff-2f98-4c6d-9bd9-a11e79bea30a", APIVersion:"apps/v1", ResourceVersion:"2206", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-f87d999f7-96hnr
I0114 17:37:59.322752   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023469-30169", Name:"nginx-5c4bc6cc8b", UID:"08b8b46d-4516-4ea8-9ed4-7223aef770e3", APIVersion:"apps/v1", ResourceVersion:"2209", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-5c4bc6cc8b-p65x2
E0114 17:37:59.447243   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:37:59.554409   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:38:00.200927   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:38:00.307122   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:38:00.448503   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:apiVersion: apps/v1
kind: ReplicaSet
metadata:
  annotations:
    deployment.kubernetes.io/desired-replicas: "3"
... skipping 116 lines ...
      terminationGracePeriodSeconds: 30
status:
  fullyLabeledReplicas: 1
  observedGeneration: 2
  replicas: 1
has:deployment.kubernetes.io/revision: "6"
E0114 17:38:00.555679   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0114 17:38:00.721622   54630 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579023469-30169", Name:"nginx2", UID:"891513ca-e5c3-4b3e-ad80-a947c29feba4", APIVersion:"apps/v1", ResourceVersion:"2225", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx2-57b7865cd9 to 3
deployment.apps/nginx2 created
I0114 17:38:00.725400   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023469-30169", Name:"nginx2-57b7865cd9", UID:"503b92d5-8b12-4fb0-b1b7-fb55d9c658c9", APIVersion:"apps/v1", ResourceVersion:"2226", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx2-57b7865cd9-f999b
I0114 17:38:00.740139   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023469-30169", Name:"nginx2-57b7865cd9", UID:"503b92d5-8b12-4fb0-b1b7-fb55d9c658c9", APIVersion:"apps/v1", ResourceVersion:"2226", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx2-57b7865cd9-bjxs2
I0114 17:38:00.740594   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023469-30169", Name:"nginx2-57b7865cd9", UID:"503b92d5-8b12-4fb0-b1b7-fb55d9c658c9", APIVersion:"apps/v1", ResourceVersion:"2226", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx2-57b7865cd9-75tlb
deployment.apps "nginx2" deleted
deployment.apps "nginx" deleted
apps.sh:334: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 17:38:01.202387   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-deployment created
I0114 17:38:01.232154   54630 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579023469-30169", Name:"nginx-deployment", UID:"02995e42-ec80-46d1-b4fc-22124a7acab7", APIVersion:"apps/v1", ResourceVersion:"2259", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-598d4d68b4 to 3
I0114 17:38:01.240441   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023469-30169", Name:"nginx-deployment-598d4d68b4", UID:"3af4b050-0bf9-4fa7-bf06-70460424e386", APIVersion:"apps/v1", ResourceVersion:"2260", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-598d4d68b4-6b74f
I0114 17:38:01.243607   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023469-30169", Name:"nginx-deployment-598d4d68b4", UID:"3af4b050-0bf9-4fa7-bf06-70460424e386", APIVersion:"apps/v1", ResourceVersion:"2260", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-598d4d68b4-rcl5w
I0114 17:38:01.245200   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023469-30169", Name:"nginx-deployment-598d4d68b4", UID:"3af4b050-0bf9-4fa7-bf06-70460424e386", APIVersion:"apps/v1", ResourceVersion:"2260", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-598d4d68b4-dk9k7
E0114 17:38:01.308509   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:337: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx-deployment:
E0114 17:38:01.449817   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:338: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
I0114 17:38:01.529631   54630 horizontal.go:353] Horizontal Pod Autoscaler frontend has been deleted in namespace-1579023456-17806
E0114 17:38:01.556864   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:339: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
deployment.apps/nginx-deployment image updated
I0114 17:38:01.687864   54630 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579023469-30169", Name:"nginx-deployment", UID:"02995e42-ec80-46d1-b4fc-22124a7acab7", APIVersion:"apps/v1", ResourceVersion:"2273", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-59df9b5f5b to 1
I0114 17:38:01.690780   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023469-30169", Name:"nginx-deployment-59df9b5f5b", UID:"456f5d56-1c9c-46cc-843d-fde902cb1d45", APIVersion:"apps/v1", ResourceVersion:"2274", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-59df9b5f5b-m7tsz
apps.sh:342: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
apps.sh:343: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
error: unable to find container named "redis"
deployment.apps/nginx-deployment image updated
E0114 17:38:02.203464   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:348: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
E0114 17:38:02.309845   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:349: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
deployment.apps/nginx-deployment image updated
E0114 17:38:02.450884   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:352: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
E0114 17:38:02.558250   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:353: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
apps.sh:356: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
apps.sh:357: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
deployment.apps/nginx-deployment image updated
I0114 17:38:03.064068   54630 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579023469-30169", Name:"nginx-deployment", UID:"02995e42-ec80-46d1-b4fc-22124a7acab7", APIVersion:"apps/v1", ResourceVersion:"2293", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-598d4d68b4 to 2
I0114 17:38:03.071848   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023469-30169", Name:"nginx-deployment-598d4d68b4", UID:"3af4b050-0bf9-4fa7-bf06-70460424e386", APIVersion:"apps/v1", ResourceVersion:"2297", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-598d4d68b4-6b74f
I0114 17:38:03.072695   54630 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579023469-30169", Name:"nginx-deployment", UID:"02995e42-ec80-46d1-b4fc-22124a7acab7", APIVersion:"apps/v1", ResourceVersion:"2296", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-7d758dbc54 to 1
I0114 17:38:03.084226   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023469-30169", Name:"nginx-deployment-7d758dbc54", UID:"0b1deee5-7d60-40b2-b5a9-fb7cbae86040", APIVersion:"apps/v1", ResourceVersion:"2301", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-7d758dbc54-z6sfv
apps.sh:360: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
E0114 17:38:03.204805   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:361: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
E0114 17:38:03.311371   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:38:03.452195   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:364: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
E0114 17:38:03.559552   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:365: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
deployment.apps "nginx-deployment" deleted
apps.sh:371: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
deployment.apps/nginx-deployment created
I0114 17:38:03.997982   54630 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579023469-30169", Name:"nginx-deployment", UID:"4acb6964-0794-4047-bb77-86a40e8df2dc", APIVersion:"apps/v1", ResourceVersion:"2325", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-598d4d68b4 to 3
I0114 17:38:04.004752   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023469-30169", Name:"nginx-deployment-598d4d68b4", UID:"89c4167b-1508-48c2-a56b-723fa81a9ac9", APIVersion:"apps/v1", ResourceVersion:"2326", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-598d4d68b4-2zff7
I0114 17:38:04.009077   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023469-30169", Name:"nginx-deployment-598d4d68b4", UID:"89c4167b-1508-48c2-a56b-723fa81a9ac9", APIVersion:"apps/v1", ResourceVersion:"2326", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-598d4d68b4-f2bq5
I0114 17:38:04.009435   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023469-30169", Name:"nginx-deployment-598d4d68b4", UID:"89c4167b-1508-48c2-a56b-723fa81a9ac9", APIVersion:"apps/v1", ResourceVersion:"2326", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-598d4d68b4-bc2qz
configmap/test-set-env-config created
E0114 17:38:04.205944   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:38:04.312531   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
secret/test-set-env-secret created
E0114 17:38:04.453388   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:376: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx-deployment:
E0114 17:38:04.560729   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:378: Successful get configmaps/test-set-env-config {{.metadata.name}}: test-set-env-config
apps.sh:379: Successful get secret {{range.items}}{{.metadata.name}}:{{end}}: test-set-env-secret:
deployment.apps/nginx-deployment env updated
I0114 17:38:04.811564   54630 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579023469-30169", Name:"nginx-deployment", UID:"4acb6964-0794-4047-bb77-86a40e8df2dc", APIVersion:"apps/v1", ResourceVersion:"2343", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-6b9f7756b4 to 1
I0114 17:38:04.816077   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023469-30169", Name:"nginx-deployment-6b9f7756b4", UID:"ba1439eb-7496-466c-86b1-2d9a1d38d227", APIVersion:"apps/v1", ResourceVersion:"2344", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6b9f7756b4-bsrd5
apps.sh:383: Successful get deploy nginx-deployment {{ (index (index .spec.template.spec.containers 0).env 0).name}}: KEY_2
apps.sh:385: Successful get deploy nginx-deployment {{ len (index .spec.template.spec.containers 0).env }}: 1
deployment.apps/nginx-deployment env updated
I0114 17:38:05.148357   54630 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579023469-30169", Name:"nginx-deployment", UID:"4acb6964-0794-4047-bb77-86a40e8df2dc", APIVersion:"apps/v1", ResourceVersion:"2353", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-598d4d68b4 to 2
I0114 17:38:05.156931   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023469-30169", Name:"nginx-deployment-598d4d68b4", UID:"89c4167b-1508-48c2-a56b-723fa81a9ac9", APIVersion:"apps/v1", ResourceVersion:"2357", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-598d4d68b4-f2bq5
I0114 17:38:05.177511   54630 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579023469-30169", Name:"nginx-deployment", UID:"4acb6964-0794-4047-bb77-86a40e8df2dc", APIVersion:"apps/v1", ResourceVersion:"2356", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-754bf964c8 to 1
I0114 17:38:05.181256   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023469-30169", Name:"nginx-deployment-754bf964c8", UID:"d54595af-8e56-4b42-af27-8195629e9cb4", APIVersion:"apps/v1", ResourceVersion:"2364", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-754bf964c8-7jxbt
E0114 17:38:05.206953   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:389: Successful get deploy nginx-deployment {{ len (index .spec.template.spec.containers 0).env }}: 2
E0114 17:38:05.313813   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-deployment env updated
I0114 17:38:05.397076   54630 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579023469-30169", Name:"nginx-deployment", UID:"4acb6964-0794-4047-bb77-86a40e8df2dc", APIVersion:"apps/v1", ResourceVersion:"2373", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-598d4d68b4 to 1
I0114 17:38:05.402877   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023469-30169", Name:"nginx-deployment-598d4d68b4", UID:"89c4167b-1508-48c2-a56b-723fa81a9ac9", APIVersion:"apps/v1", ResourceVersion:"2377", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-598d4d68b4-bc2qz
I0114 17:38:05.410270   54630 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579023469-30169", Name:"nginx-deployment", UID:"4acb6964-0794-4047-bb77-86a40e8df2dc", APIVersion:"apps/v1", ResourceVersion:"2375", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-c6d5c5c7b to 1
I0114 17:38:05.415084   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023469-30169", Name:"nginx-deployment-c6d5c5c7b", UID:"64001738-7305-4c03-bbc3-eff7f17e61e1", APIVersion:"apps/v1", ResourceVersion:"2383", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-c6d5c5c7b-z6fnk
E0114 17:38:05.454889   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-deployment env updated
I0114 17:38:05.533957   54630 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579023469-30169", Name:"nginx-deployment", UID:"4acb6964-0794-4047-bb77-86a40e8df2dc", APIVersion:"apps/v1", ResourceVersion:"2393", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-598d4d68b4 to 0
I0114 17:38:05.545846   54630 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579023469-30169", Name:"nginx-deployment", UID:"4acb6964-0794-4047-bb77-86a40e8df2dc", APIVersion:"apps/v1", ResourceVersion:"2395", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-5958f7687 to 1
E0114 17:38:05.561674   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-deployment env updated
I0114 17:38:05.663954   54630 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579023469-30169", Name:"nginx-deployment", UID:"4acb6964-0794-4047-bb77-86a40e8df2dc", APIVersion:"apps/v1", ResourceVersion:"2402", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-6b9f7756b4 to 0
I0114 17:38:05.669726   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023469-30169", Name:"nginx-deployment-598d4d68b4", UID:"89c4167b-1508-48c2-a56b-723fa81a9ac9", APIVersion:"apps/v1", ResourceVersion:"2396", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-598d4d68b4-2zff7
deployment.apps/nginx-deployment env updated
I0114 17:38:05.818034   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023469-30169", Name:"nginx-deployment-6b9f7756b4", UID:"ba1439eb-7496-466c-86b1-2d9a1d38d227", APIVersion:"apps/v1", ResourceVersion:"2405", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-6b9f7756b4-bsrd5
I0114 17:38:05.862475   54630 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579023469-30169", Name:"nginx-deployment", UID:"4acb6964-0794-4047-bb77-86a40e8df2dc", APIVersion:"apps/v1", ResourceVersion:"2410", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-d74969475 to 1
deployment.apps/nginx-deployment env updated
deployment.apps "nginx-deployment" deleted
I0114 17:38:06.064641   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023469-30169", Name:"nginx-deployment-d74969475", UID:"c676a8ce-caa9-409c-aadb-f060da7e13be", APIVersion:"apps/v1", ResourceVersion:"2413", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-d74969475-wgp9p
configmap "test-set-env-config" deleted
secret "test-set-env-secret" deleted
E0114 17:38:06.208217   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ exit code: 0
I0114 17:38:06.218118   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023469-30169", Name:"nginx-deployment-5958f7687", UID:"ea1eb73a-c834-442f-a414-158b22ffa86a", APIVersion:"apps/v1", ResourceVersion:"2398", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-5958f7687-6qzxs
Recording: run_rs_tests
Running command: run_rs_tests

+++ Running case: test-cmd.run_rs_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_rs_tests
+++ [0114 17:38:06] Creating namespace namespace-1579023486-1150
E0114 17:38:06.313799   54630 replica_set.go:534] sync "namespace-1579023469-30169/nginx-deployment-6b9f7756b4" failed with replicasets.apps "nginx-deployment-6b9f7756b4" not found
E0114 17:38:06.314707   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/namespace-1579023486-1150 created
E0114 17:38:06.413691   54630 replica_set.go:534] sync "namespace-1579023469-30169/nginx-deployment-598d4d68b4" failed with replicasets.apps "nginx-deployment-598d4d68b4" not found
E0114 17:38:06.456140   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Context "test" modified.
+++ [0114 17:38:06] Testing kubectl(v1:replicasets)
apps.sh:511: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 17:38:06.563003   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:38:06.563825   54630 replica_set.go:534] sync "namespace-1579023469-30169/nginx-deployment-868b664cb5" failed with replicasets.apps "nginx-deployment-868b664cb5" not found
E0114 17:38:06.613733   54630 replica_set.go:534] sync "namespace-1579023469-30169/nginx-deployment-d74969475" failed with replicasets.apps "nginx-deployment-d74969475" not found
E0114 17:38:06.713534   54630 replica_set.go:534] sync "namespace-1579023469-30169/nginx-deployment-5958f7687" failed with replicasets.apps "nginx-deployment-5958f7687" not found
replicaset.apps/frontend created
+++ [0114 17:38:06] Deleting rs
I0114 17:38:06.767177   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023486-1150", Name:"frontend", UID:"d34b8632-e846-4230-b6d9-b127f6e271cc", APIVersion:"apps/v1", ResourceVersion:"2448", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-qr94j
replicaset.apps "frontend" deleted
I0114 17:38:06.865583   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023486-1150", Name:"frontend", UID:"d34b8632-e846-4230-b6d9-b127f6e271cc", APIVersion:"apps/v1", ResourceVersion:"2448", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-gppjb
I0114 17:38:06.915806   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023486-1150", Name:"frontend", UID:"d34b8632-e846-4230-b6d9-b127f6e271cc", APIVersion:"apps/v1", ResourceVersion:"2448", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-fzw8q
apps.sh:517: Successful get pods -l "tier=frontend" {{range.items}}{{.metadata.name}}:{{end}}: 
apps.sh:521: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 17:38:07.114106   54630 replica_set.go:534] sync "namespace-1579023486-1150/frontend" failed with replicasets.apps "frontend" not found
E0114 17:38:07.209431   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicaset.apps/frontend created
I0114 17:38:07.258109   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023486-1150", Name:"frontend", UID:"8f3af839-a767-4fd9-893e-4480a87df863", APIVersion:"apps/v1", ResourceVersion:"2462", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-bsprq
I0114 17:38:07.261758   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023486-1150", Name:"frontend", UID:"8f3af839-a767-4fd9-893e-4480a87df863", APIVersion:"apps/v1", ResourceVersion:"2462", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-58vmj
I0114 17:38:07.264486   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023486-1150", Name:"frontend", UID:"8f3af839-a767-4fd9-893e-4480a87df863", APIVersion:"apps/v1", ResourceVersion:"2462", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-hfpd8
E0114 17:38:07.316076   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:525: Successful get pods -l "tier=frontend" {{range.items}}{{(index .spec.containers 0).name}}:{{end}}: php-redis:php-redis:php-redis:
+++ [0114 17:38:07] Deleting rs
E0114 17:38:07.457318   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicaset.apps "frontend" deleted
E0114 17:38:07.563365   54630 replica_set.go:534] sync "namespace-1579023486-1150/frontend" failed with replicasets.apps "frontend" not found
E0114 17:38:07.563991   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:529: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
apps.sh:531: Successful get pods -l "tier=frontend" {{range.items}}{{(index .spec.containers 0).name}}:{{end}}: php-redis:php-redis:php-redis:
pod "frontend-58vmj" deleted
pod "frontend-bsprq" deleted
pod "frontend-hfpd8" deleted
apps.sh:534: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
apps.sh:538: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
replicaset.apps/frontend created
I0114 17:38:08.186744   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023486-1150", Name:"frontend", UID:"5d13147e-d68c-4722-ad85-c3702242582f", APIVersion:"apps/v1", ResourceVersion:"2483", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-jnd7k
I0114 17:38:08.190467   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023486-1150", Name:"frontend", UID:"5d13147e-d68c-4722-ad85-c3702242582f", APIVersion:"apps/v1", ResourceVersion:"2483", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-c89k5
I0114 17:38:08.190529   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023486-1150", Name:"frontend", UID:"5d13147e-d68c-4722-ad85-c3702242582f", APIVersion:"apps/v1", ResourceVersion:"2483", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-pqxcv
E0114 17:38:08.210650   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:542: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: frontend:
E0114 17:38:08.317402   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
matched Name:
matched Pod Template:
matched Labels:
matched Selector:
matched Replicas:
matched Pods Status:
... skipping 3 lines ...
Namespace:    namespace-1579023486-1150
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 9 lines ...
Events:
  Type    Reason            Age   From                   Message
  ----    ------            ----  ----                   -------
  Normal  SuccessfulCreate  0s    replicaset-controller  Created pod: frontend-jnd7k
  Normal  SuccessfulCreate  0s    replicaset-controller  Created pod: frontend-c89k5
  Normal  SuccessfulCreate  0s    replicaset-controller  Created pod: frontend-pqxcv
E0114 17:38:08.458510   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:546: Successful describe
Name:         frontend
Namespace:    namespace-1579023486-1150
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 10 lines ...
  Type    Reason            Age   From                   Message
  ----    ------            ----  ----                   -------
  Normal  SuccessfulCreate  0s    replicaset-controller  Created pod: frontend-jnd7k
  Normal  SuccessfulCreate  0s    replicaset-controller  Created pod: frontend-c89k5
  Normal  SuccessfulCreate  0s    replicaset-controller  Created pod: frontend-pqxcv
E0114 17:38:08.565737   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:548: Successful describe
Name:         frontend
Namespace:    namespace-1579023486-1150
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 12 lines ...
Namespace:    namespace-1579023486-1150
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 25 lines ...
Namespace:    namespace-1579023486-1150
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 17 lines ...
Namespace:    namespace-1579023486-1150
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 18 lines ...
Namespace:    namespace-1579023486-1150
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 3 lines ...
      cpu:     100m
      memory:  100Mi
    Environment:
      GET_HOSTS_FROM:  dns
    Mounts:            <none>
  Volumes:             <none>
E0114 17:38:09.211857   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful describe
Name:         frontend
Namespace:    namespace-1579023486-1150
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 9 lines ...
Events:
  Type    Reason            Age   From                   Message
  ----    ------            ----  ----                   -------
  Normal  SuccessfulCreate  1s    replicaset-controller  Created pod: frontend-jnd7k
  Normal  SuccessfulCreate  1s    replicaset-controller  Created pod: frontend-c89k5
  Normal  SuccessfulCreate  1s    replicaset-controller  Created pod: frontend-pqxcv
E0114 17:38:09.318642   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
matched Name:
matched Image:
matched Node:
matched Labels:
matched Status:
matched Controlled By
... skipping 80 lines ...
    Mounts:            <none>
Volumes:               <none>
QoS Class:             Burstable
Node-Selectors:        <none>
Tolerations:           <none>
Events:                <none>
E0114 17:38:09.459878   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:564: Successful get rs frontend {{.spec.replicas}}: 3
E0114 17:38:09.567141   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicaset.apps/frontend scaled
E0114 17:38:09.631118   54630 replica_set.go:199] ReplicaSet has no controller: &ReplicaSet{ObjectMeta:{frontend  namespace-1579023486-1150 /apis/apps/v1/namespaces/namespace-1579023486-1150/replicasets/frontend 5d13147e-d68c-4722-ad85-c3702242582f 2492 2 2020-01-14 17:38:08 +0000 UTC <nil> <nil> map[app:guestbook tier:frontend] map[] [] []  []},Spec:ReplicaSetSpec{Replicas:*2,Selector:&v1.LabelSelector{MatchLabels:map[string]string{app: guestbook,tier: frontend,},MatchExpressions:[]LabelSelectorRequirement{},},Template:{{      0 0001-01-01 00:00:00 +0000 UTC <nil> <nil> map[app:guestbook tier:frontend] map[] [] []  []} {[] [] [{php-redis gcr.io/google_samples/gb-frontend:v3 [] []  [{ 0 80 TCP }] [] [{GET_HOSTS_FROM dns nil}] {map[] map[cpu:{{100 -3} {<nil>} 100m DecimalSI} memory:{{104857600 0} {<nil>} 100Mi BinarySI}]} [] [] nil nil nil nil /dev/termination-log File IfNotPresent nil false false false}] [] Always 0xc002862598 <nil> ClusterFirst map[]   <nil>  false false false <nil> PodSecurityContext{SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,SupplementalGroups:[],FSGroup:nil,RunAsGroup:nil,Sysctls:[]Sysctl{},WindowsOptions:nil,} []   nil default-scheduler [] []  <nil> nil [] <nil> <nil> <nil> map[] []}},MinReadySeconds:0,},Status:ReplicaSetStatus{Replicas:3,FullyLabeledReplicas:3,ObservedGeneration:1,ReadyReplicas:0,AvailableReplicas:0,Conditions:[]ReplicaSetCondition{},},}
I0114 17:38:09.637296   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023486-1150", Name:"frontend", UID:"5d13147e-d68c-4722-ad85-c3702242582f", APIVersion:"apps/v1", ResourceVersion:"2492", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: frontend-c89k5
apps.sh:568: Successful get rs frontend {{.spec.replicas}}: 2
deployment.apps/scale-1 created
I0114 17:38:09.967290   54630 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579023486-1150", Name:"scale-1", UID:"8384b47c-3630-48fb-8b63-b1e568021855", APIVersion:"apps/v1", ResourceVersion:"2498", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set scale-1-5c5565bcd9 to 1
I0114 17:38:09.969610   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023486-1150", Name:"scale-1-5c5565bcd9", UID:"b2e0dfeb-8d18-4bf3-8843-8a2869b8088e", APIVersion:"apps/v1", ResourceVersion:"2499", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-1-5c5565bcd9-jvpc4
deployment.apps/scale-2 created
I0114 17:38:10.198285   54630 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579023486-1150", Name:"scale-2", UID:"79e226ee-8772-4f9c-9a5f-3f2175efef73", APIVersion:"apps/v1", ResourceVersion:"2510", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set scale-2-5c5565bcd9 to 1
I0114 17:38:10.202663   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023486-1150", Name:"scale-2-5c5565bcd9", UID:"93a1b6c0-7b0b-433a-9964-01e3a2aacca1", APIVersion:"apps/v1", ResourceVersion:"2511", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-2-5c5565bcd9-jvh42
E0114 17:38:10.212754   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:38:10.319895   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/scale-3 created
I0114 17:38:10.423447   54630 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579023486-1150", Name:"scale-3", UID:"d74daae5-6a2d-44d4-b34a-8b2be56d3bd9", APIVersion:"apps/v1", ResourceVersion:"2520", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set scale-3-5c5565bcd9 to 1
I0114 17:38:10.428037   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023486-1150", Name:"scale-3-5c5565bcd9", UID:"9070c095-265e-4252-9402-598c933a363a", APIVersion:"apps/v1", ResourceVersion:"2521", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-3-5c5565bcd9-6rfx2
E0114 17:38:10.460988   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:574: Successful get deploy scale-1 {{.spec.replicas}}: 1
E0114 17:38:10.568430   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:575: Successful get deploy scale-2 {{.spec.replicas}}: 1
apps.sh:576: Successful get deploy scale-3 {{.spec.replicas}}: 1
deployment.apps/scale-1 scaled
I0114 17:38:10.836782   54630 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579023486-1150", Name:"scale-1", UID:"8384b47c-3630-48fb-8b63-b1e568021855", APIVersion:"apps/v1", ResourceVersion:"2530", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set scale-1-5c5565bcd9 to 2
deployment.apps/scale-2 scaled
I0114 17:38:10.840545   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023486-1150", Name:"scale-1-5c5565bcd9", UID:"b2e0dfeb-8d18-4bf3-8843-8a2869b8088e", APIVersion:"apps/v1", ResourceVersion:"2531", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-1-5c5565bcd9-bmh9r
I0114 17:38:10.846274   54630 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579023486-1150", Name:"scale-2", UID:"79e226ee-8772-4f9c-9a5f-3f2175efef73", APIVersion:"apps/v1", ResourceVersion:"2532", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set scale-2-5c5565bcd9 to 2
I0114 17:38:10.851200   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023486-1150", Name:"scale-2-5c5565bcd9", UID:"93a1b6c0-7b0b-433a-9964-01e3a2aacca1", APIVersion:"apps/v1", ResourceVersion:"2538", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-2-5c5565bcd9-dmddg
apps.sh:579: Successful get deploy scale-1 {{.spec.replicas}}: 2
apps.sh:580: Successful get deploy scale-2 {{.spec.replicas}}: 2
apps.sh:581: Successful get deploy scale-3 {{.spec.replicas}}: 1
E0114 17:38:11.214098   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/scale-1 scaled
I0114 17:38:11.242423   54630 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579023486-1150", Name:"scale-1", UID:"8384b47c-3630-48fb-8b63-b1e568021855", APIVersion:"apps/v1", ResourceVersion:"2550", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set scale-1-5c5565bcd9 to 3
deployment.apps/scale-2 scaled
I0114 17:38:11.247095   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023486-1150", Name:"scale-1-5c5565bcd9", UID:"b2e0dfeb-8d18-4bf3-8843-8a2869b8088e", APIVersion:"apps/v1", ResourceVersion:"2551", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-1-5c5565bcd9-4jf9g
I0114 17:38:11.251199   54630 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579023486-1150", Name:"scale-2", UID:"79e226ee-8772-4f9c-9a5f-3f2175efef73", APIVersion:"apps/v1", ResourceVersion:"2552", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set scale-2-5c5565bcd9 to 3
deployment.apps/scale-3 scaled
I0114 17:38:11.256286   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023486-1150", Name:"scale-2-5c5565bcd9", UID:"93a1b6c0-7b0b-433a-9964-01e3a2aacca1", APIVersion:"apps/v1", ResourceVersion:"2558", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-2-5c5565bcd9-9nzmh
I0114 17:38:11.278851   54630 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579023486-1150", Name:"scale-3", UID:"d74daae5-6a2d-44d4-b34a-8b2be56d3bd9", APIVersion:"apps/v1", ResourceVersion:"2559", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set scale-3-5c5565bcd9 to 3
I0114 17:38:11.285032   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023486-1150", Name:"scale-3-5c5565bcd9", UID:"9070c095-265e-4252-9402-598c933a363a", APIVersion:"apps/v1", ResourceVersion:"2566", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-3-5c5565bcd9-5v5bv
I0114 17:38:11.289182   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023486-1150", Name:"scale-3-5c5565bcd9", UID:"9070c095-265e-4252-9402-598c933a363a", APIVersion:"apps/v1", ResourceVersion:"2566", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-3-5c5565bcd9-6hkwz
E0114 17:38:11.320878   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:584: Successful get deploy scale-1 {{.spec.replicas}}: 3
E0114 17:38:11.462807   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:585: Successful get deploy scale-2 {{.spec.replicas}}: 3
E0114 17:38:11.569744   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:586: Successful get deploy scale-3 {{.spec.replicas}}: 3
replicaset.apps "frontend" deleted
deployment.apps "scale-1" deleted
deployment.apps "scale-2" deleted
deployment.apps "scale-3" deleted
replicaset.apps/frontend created
I0114 17:38:12.029493   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023486-1150", Name:"frontend", UID:"596bb6ee-bda5-4167-80cd-8f64bf127dfe", APIVersion:"apps/v1", ResourceVersion:"2611", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-nbwf4
I0114 17:38:12.033628   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023486-1150", Name:"frontend", UID:"596bb6ee-bda5-4167-80cd-8f64bf127dfe", APIVersion:"apps/v1", ResourceVersion:"2611", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-g45vc
I0114 17:38:12.034975   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023486-1150", Name:"frontend", UID:"596bb6ee-bda5-4167-80cd-8f64bf127dfe", APIVersion:"apps/v1", ResourceVersion:"2611", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-fkq92
apps.sh:594: Successful get rs frontend {{.spec.replicas}}: 3
E0114 17:38:12.215239   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/frontend exposed
E0114 17:38:12.322062   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:598: Successful get service frontend {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: <no value> 80
service/frontend-2 exposed
E0114 17:38:12.463955   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:602: Successful get service frontend-2 {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: default 80
E0114 17:38:12.570922   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service "frontend" deleted
service "frontend-2" deleted
apps.sh:608: Successful get rs frontend {{.metadata.generation}}: 1
replicaset.apps/frontend image updated
apps.sh:610: Successful get rs frontend {{.metadata.generation}}: 2
replicaset.apps/frontend env updated
apps.sh:612: Successful get rs frontend {{.metadata.generation}}: 3
E0114 17:38:13.216507   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicaset.apps/frontend resource requirements updated
E0114 17:38:13.323289   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:614: Successful get rs frontend {{.metadata.generation}}: 4
E0114 17:38:13.465179   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:618: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: frontend:
E0114 17:38:13.572049   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicaset.apps "frontend" deleted
apps.sh:622: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
apps.sh:626: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
(Breplicaset.apps/frontend created
I0114 17:38:13.954508   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023486-1150", Name:"frontend", UID:"f472338e-6151-41b0-b069-7c49b9d897aa", APIVersion:"apps/v1", ResourceVersion:"2648", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-lkdjx
I0114 17:38:13.957109   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023486-1150", Name:"frontend", UID:"f472338e-6151-41b0-b069-7c49b9d897aa", APIVersion:"apps/v1", ResourceVersion:"2648", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-29hhg
I0114 17:38:13.961547   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023486-1150", Name:"frontend", UID:"f472338e-6151-41b0-b069-7c49b9d897aa", APIVersion:"apps/v1", ResourceVersion:"2648", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-nps6p
replicaset.apps/redis-slave created
I0114 17:38:14.154633   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023486-1150", Name:"redis-slave", UID:"e734cd1e-8c45-478e-b537-efb87124f960", APIVersion:"apps/v1", ResourceVersion:"2659", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-slave-j8dvj
I0114 17:38:14.167924   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023486-1150", Name:"redis-slave", UID:"e734cd1e-8c45-478e-b537-efb87124f960", APIVersion:"apps/v1", ResourceVersion:"2659", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-slave-ctntr
E0114 17:38:14.217784   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:631: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: frontend:redis-slave:
E0114 17:38:14.324438   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:635: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: frontend:redis-slave:
replicaset.apps "frontend" deleted
replicaset.apps "redis-slave" deleted
E0114 17:38:14.466059   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:38:14.573181   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:639: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
apps.sh:644: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
replicaset.apps/frontend created
I0114 17:38:14.865258   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023486-1150", Name:"frontend", UID:"6d137bc4-ec76-4561-8042-3ed3b604959d", APIVersion:"apps/v1", ResourceVersion:"2678", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-bvlzp
I0114 17:38:14.869431   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023486-1150", Name:"frontend", UID:"6d137bc4-ec76-4561-8042-3ed3b604959d", APIVersion:"apps/v1", ResourceVersion:"2678", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-k8bnm
I0114 17:38:14.871527   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023486-1150", Name:"frontend", UID:"6d137bc4-ec76-4561-8042-3ed3b604959d", APIVersion:"apps/v1", ResourceVersion:"2678", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-qknh2
apps.sh:647: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: frontend:
horizontalpodautoscaler.autoscaling/frontend autoscaled
apps.sh:650: Successful get hpa frontend {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 1 2 70
E0114 17:38:15.219092   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
horizontalpodautoscaler.autoscaling "frontend" deleted
E0114 17:38:15.325616   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
horizontalpodautoscaler.autoscaling/frontend autoscaled
E0114 17:38:15.467296   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:654: Successful get hpa frontend {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 2 3 80
horizontalpodautoscaler.autoscaling "frontend" deleted
E0114 17:38:15.574608   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Error: required flag(s) "max" not set
replicaset.apps "frontend" deleted
+++ exit code: 0
Recording: run_stateful_set_tests
Running command: run_stateful_set_tests

+++ Running case: test-cmd.run_stateful_set_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_stateful_set_tests
+++ [0114 17:38:15] Creating namespace namespace-1579023495-29034
namespace/namespace-1579023495-29034 created
Context "test" modified.
+++ [0114 17:38:16] Testing kubectl(v1:statefulsets)
apps.sh:470: Successful get statefulset {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 17:38:16.220214   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0114 17:38:16.317425   51184 controller.go:606] quota admission added evaluator for: statefulsets.apps
statefulset.apps/nginx created
E0114 17:38:16.326526   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:476: Successful get statefulset nginx {{.spec.replicas}}: 0
E0114 17:38:16.468468   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:477: Successful get statefulset nginx {{.status.observedGeneration}}: 1
E0114 17:38:16.575909   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
statefulset.apps/nginx scaled
I0114 17:38:16.643126   54630 event.go:278] Event(v1.ObjectReference{Kind:"StatefulSet", Namespace:"namespace-1579023495-29034", Name:"nginx", UID:"af1f20cb-8fd9-486d-9a95-f26f5c9ed4dd", APIVersion:"apps/v1", ResourceVersion:"2705", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' create Pod nginx-0 in StatefulSet nginx successful
apps.sh:481: Successful get statefulset nginx {{.spec.replicas}}: 1
apps.sh:482: Successful get statefulset nginx {{.status.observedGeneration}}: 2
statefulset.apps/nginx restarted
apps.sh:490: Successful get statefulset nginx {{.status.observedGeneration}}: 3
E0114 17:38:17.221316   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
statefulset.apps "nginx" deleted
I0114 17:38:17.222114   54630 stateful_set.go:420] StatefulSet has been deleted namespace-1579023495-29034/nginx
E0114 17:38:17.327900   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ exit code: 0
Recording: run_statefulset_history_tests
Running command: run_statefulset_history_tests

+++ Running case: test-cmd.run_statefulset_history_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_statefulset_history_tests
+++ [0114 17:38:17] Creating namespace namespace-1579023497-15332
E0114 17:38:17.469729   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/namespace-1579023497-15332 created
E0114 17:38:17.577085   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Context "test" modified.
+++ [0114 17:38:17] Testing kubectl(v1:statefulsets, v1:controllerrevisions)
apps.sh:418: Successful get statefulset {{range.items}}{{.metadata.name}}:{{end}}: 
statefulset.apps/nginx created
apps.sh:422: Successful get controllerrevisions {{range.items}}{{.metadata.annotations}}:{{end}}: map[kubectl.kubernetes.io/last-applied-configuration:{"apiVersion":"apps/v1","kind":"StatefulSet","metadata":{"annotations":{"kubernetes.io/change-cause":"kubectl apply --filename=hack/testdata/rollingupdate-statefulset.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true"},"labels":{"app":"nginx-statefulset"},"name":"nginx","namespace":"namespace-1579023497-15332"},"spec":{"replicas":0,"selector":{"matchLabels":{"app":"nginx-statefulset"}},"serviceName":"nginx","template":{"metadata":{"labels":{"app":"nginx-statefulset"}},"spec":{"containers":[{"command":["sh","-c","while true; do sleep 1; done"],"image":"k8s.gcr.io/nginx-slim:0.7","name":"nginx","ports":[{"containerPort":80,"name":"web"}]}],"terminationGracePeriodSeconds":5}},"updateStrategy":{"type":"RollingUpdate"}}}
 kubernetes.io/change-cause:kubectl apply --filename=hack/testdata/rollingupdate-statefulset.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true]:
statefulset.apps/nginx skipped rollback (current template already matches revision 1)
apps.sh:425: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.7:
E0114 17:38:18.222511   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:426: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
E0114 17:38:18.329092   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:38:18.471127   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
statefulset.apps/nginx configured
E0114 17:38:18.578168   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:429: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.8:
apps.sh:430: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/pause:2.0:
apps.sh:431: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 2
apps.sh:432: Successful get controllerrevisions {{range.items}}{{.metadata.annotations}}:{{end}}: map[kubectl.kubernetes.io/last-applied-configuration:{"apiVersion":"apps/v1","kind":"StatefulSet","metadata":{"annotations":{"kubernetes.io/change-cause":"kubectl apply --filename=hack/testdata/rollingupdate-statefulset.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true"},"labels":{"app":"nginx-statefulset"},"name":"nginx","namespace":"namespace-1579023497-15332"},"spec":{"replicas":0,"selector":{"matchLabels":{"app":"nginx-statefulset"}},"serviceName":"nginx","template":{"metadata":{"labels":{"app":"nginx-statefulset"}},"spec":{"containers":[{"command":["sh","-c","while true; do sleep 1; done"],"image":"k8s.gcr.io/nginx-slim:0.7","name":"nginx","ports":[{"containerPort":80,"name":"web"}]}],"terminationGracePeriodSeconds":5}},"updateStrategy":{"type":"RollingUpdate"}}}
 kubernetes.io/change-cause:kubectl apply --filename=hack/testdata/rollingupdate-statefulset.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true]:map[kubectl.kubernetes.io/last-applied-configuration:{"apiVersion":"apps/v1","kind":"StatefulSet","metadata":{"annotations":{"kubernetes.io/change-cause":"kubectl apply --filename=hack/testdata/rollingupdate-statefulset-rv2.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true"},"labels":{"app":"nginx-statefulset"},"name":"nginx","namespace":"namespace-1579023497-15332"},"spec":{"replicas":0,"selector":{"matchLabels":{"app":"nginx-statefulset"}},"serviceName":"nginx","template":{"metadata":{"labels":{"app":"nginx-statefulset"}},"spec":{"containers":[{"command":["sh","-c","while true; do sleep 1; done"],"image":"k8s.gcr.io/nginx-slim:0.8","name":"nginx","ports":[{"containerPort":80,"name":"web"}]},{"image":"k8s.gcr.io/pause:2.0","name":"pause","ports":[{"containerPort":81,"name":"web-2"}]}],"terminationGracePeriodSeconds":5}},"updateStrategy":{"type":"RollingUpdate"}}}
 kubernetes.io/change-cause:kubectl apply --filename=hack/testdata/rollingupdate-statefulset-rv2.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true]:
... skipping 11 lines ...
    Environment:	<none>
    Mounts:	<none>
  Volumes:	<none>
 (dry run)
apps.sh:435: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.8:
apps.sh:436: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/pause:2.0:
E0114 17:38:19.223868   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:38:19.330454   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:437: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 2
statefulset.apps/nginx rolled back
E0114 17:38:19.472492   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:440: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.7:
E0114 17:38:19.579439   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:441: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
Successful
message:error: unable to find specified revision 1000000 in history
has:unable to find specified revision
apps.sh:445: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.7:
apps.sh:446: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
statefulset.apps/nginx rolled back
E0114 17:38:20.225066   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:449: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.8:
E0114 17:38:20.331610   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:450: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/pause:2.0:
E0114 17:38:20.473867   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:451: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 2
E0114 17:38:20.580583   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
statefulset.apps "nginx" deleted
I0114 17:38:20.599022   54630 stateful_set.go:420] StatefulSet has been deleted namespace-1579023497-15332/nginx
+++ exit code: 0
Recording: run_lists_tests
Running command: run_lists_tests

... skipping 5 lines ...
Context "test" modified.
+++ [0114 17:38:20] Testing kubectl(v1:lists)
service/list-service-test created
deployment.apps/list-deployment-test created
I0114 17:38:21.192046   54630 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579023500-19508", Name:"list-deployment-test", UID:"83c708b3-35a2-447a-b338-c36bfe063752", APIVersion:"apps/v1", ResourceVersion:"2743", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set list-deployment-test-7cd8c5ff6d to 1
I0114 17:38:21.203304   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023500-19508", Name:"list-deployment-test-7cd8c5ff6d", UID:"9c5a99a1-237c-407d-ab1d-40430b13ab08", APIVersion:"apps/v1", ResourceVersion:"2744", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: list-deployment-test-7cd8c5ff6d-dvsbq
E0114 17:38:21.226117   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service "list-service-test" deleted
deployment.apps "list-deployment-test" deleted
+++ exit code: 0
E0114 17:38:21.332847   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Recording: run_multi_resources_tests
Running command: run_multi_resources_tests

+++ Running case: test-cmd.run_multi_resources_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_multi_resources_tests
+++ [0114 17:38:21] Creating namespace namespace-1579023501-17552
E0114 17:38:21.475086   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/namespace-1579023501-17552 created
Context "test" modified.
E0114 17:38:21.581922   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ [0114 17:38:21] Testing kubectl(v1:multiple resources)
Testing with file hack/testdata/multi-resource-yaml.yaml and replace with file hack/testdata/multi-resource-yaml-modify.yaml
generic-resources.sh:63: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: 
generic-resources.sh:64: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
service/mock created
replicationcontroller/mock created
I0114 17:38:21.998750   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579023501-17552", Name:"mock", UID:"391d765b-0173-4b53-98f5-8cb0e8c3ce6b", APIVersion:"v1", ResourceVersion:"2765", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-9kwmj
generic-resources.sh:72: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: mock:
generic-resources.sh:80: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: mock:
E0114 17:38:22.227404   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
NAME           TYPE        CLUSTER-IP   EXTERNAL-IP   PORT(S)   AGE
service/mock   ClusterIP   10.0.0.103   <none>        99/TCP    1s

NAME                         DESIRED   CURRENT   READY   AGE
replicationcontroller/mock   1         1         0       1s
E0114 17:38:22.334036   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:38:22.476351   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Name:              mock
Namespace:         namespace-1579023501-17552
Labels:            app=mock
Annotations:       <none>
Selector:          app=mock
Type:              ClusterIP
... skipping 8 lines ...
Name:         mock
Namespace:    namespace-1579023501-17552
Selector:     app=mock
Labels:       app=mock
Annotations:  <none>
Replicas:     1 current / 1 desired
Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=mock
  Containers:
   mock-container:
    Image:        k8s.gcr.io/pause:2.0
    Port:         9949/TCP
... skipping 2 lines ...
    Mounts:       <none>
  Volumes:        <none>
Events:
  Type    Reason            Age   From                    Message
  ----    ------            ----  ----                    -------
  Normal  SuccessfulCreate  1s    replication-controller  Created pod: mock-9kwmj
E0114 17:38:22.583277   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service "mock" deleted
replicationcontroller "mock" deleted
service/mock replaced
replicationcontroller/mock replaced
I0114 17:38:22.748352   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579023501-17552", Name:"mock", UID:"b7edc087-4815-4242-a18f-f50724262dc3", APIVersion:"v1", ResourceVersion:"2781", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-qz82l
generic-resources.sh:96: Successful get services mock {{.metadata.labels.status}}: replaced
generic-resources.sh:102: Successful get rc mock {{.metadata.labels.status}}: replaced
service/mock edited
replicationcontroller/mock edited
E0114 17:38:23.228559   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:114: Successful get services mock {{.metadata.labels.status}}: edited
E0114 17:38:23.335335   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:120: Successful get rc mock {{.metadata.labels.status}}: edited
E0114 17:38:23.477557   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/mock labeled
replicationcontroller/mock labeled
E0114 17:38:23.584763   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:134: Successful get services mock {{.metadata.labels.labeled}}: true
generic-resources.sh:140: Successful get rc mock {{.metadata.labels.labeled}}: true
service/mock annotated
replicationcontroller/mock annotated
generic-resources.sh:153: Successful get services mock {{.metadata.annotations.annotated}}: true
generic-resources.sh:159: Successful get rc mock {{.metadata.annotations.annotated}}: true
service "mock" deleted
replicationcontroller "mock" deleted
Testing with file hack/testdata/multi-resource-list.json and replace with file hack/testdata/multi-resource-list-modify.json
E0114 17:38:24.229673   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:63: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 17:38:24.336600   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:64: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 17:38:24.478965   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/mock created
replicationcontroller/mock created
I0114 17:38:24.538586   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579023501-17552", Name:"mock", UID:"880f6767-76bc-4c56-ad8e-e29fb4ddecdd", APIVersion:"v1", ResourceVersion:"2807", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-b8vs9
E0114 17:38:24.586018   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:72: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: mock:
generic-resources.sh:80: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: mock:
NAME           TYPE        CLUSTER-IP   EXTERNAL-IP   PORT(S)   AGE
service/mock   ClusterIP   10.0.0.10    <none>        99/TCP    0s

NAME                         DESIRED   CURRENT   READY   AGE
... skipping 15 lines ...
Name:         mock
Namespace:    namespace-1579023501-17552
Selector:     app=mock
Labels:       app=mock
Annotations:  <none>
Replicas:     1 current / 1 desired
Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=mock
  Containers:
   mock-container:
    Image:        k8s.gcr.io/pause:2.0
    Port:         9949/TCP
... skipping 2 lines ...
    Mounts:       <none>
  Volumes:        <none>
Events:
  Type    Reason            Age   From                    Message
  ----    ------            ----  ----                    -------
  Normal  SuccessfulCreate  1s    replication-controller  Created pod: mock-b8vs9
E0114 17:38:25.230994   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service "mock" deleted
replicationcontroller "mock" deleted
service/mock replaced
replicationcontroller/mock replaced
I0114 17:38:25.307335   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579023501-17552", Name:"mock", UID:"7025c100-c48a-4b11-b405-abae53cde8e9", APIVersion:"v1", ResourceVersion:"2821", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-zvv9q
E0114 17:38:25.337864   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:96: Successful get services mock {{.metadata.labels.status}}: replaced
E0114 17:38:25.480300   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:102: Successful get rc mock {{.metadata.labels.status}}: replaced
E0114 17:38:25.587475   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/mock edited
replicationcontroller/mock edited
generic-resources.sh:114: Successful get services mock {{.metadata.labels.status}}: edited
generic-resources.sh:120: Successful get rc mock {{.metadata.labels.status}}: edited
service/mock labeled
replicationcontroller/mock labeled
generic-resources.sh:134: Successful get services mock {{.metadata.labels.labeled}}: true
E0114 17:38:26.232291   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:140: Successful get rc mock {{.metadata.labels.labeled}}: true
E0114 17:38:26.339330   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/mock annotated
replicationcontroller/mock annotated
E0114 17:38:26.481581   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:153: Successful get services mock {{.metadata.annotations.annotated}}: true
generic-resources.sh:159: Successful get rc mock {{.metadata.annotations.annotated}}: true
E0114 17:38:26.588759   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service "mock" deleted
replicationcontroller "mock" deleted
Testing with file hack/testdata/multi-resource-json.json and replace with file hack/testdata/multi-resource-json-modify.json
generic-resources.sh:63: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: 
generic-resources.sh:64: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
service/mock created
replicationcontroller/mock created
I0114 17:38:27.104573   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579023501-17552", Name:"mock", UID:"b0913c70-38ae-492c-9398-3689057566ff", APIVersion:"v1", ResourceVersion:"2845", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-qprz2
E0114 17:38:27.233595   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:72: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: mock:
E0114 17:38:27.340495   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:80: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: mock:
NAME           TYPE        CLUSTER-IP   EXTERNAL-IP   PORT(S)   AGE
service/mock   ClusterIP   10.0.0.224   <none>        99/TCP    0s

NAME                         DESIRED   CURRENT   READY   AGE
replicationcontroller/mock   1         1         0       0s
E0114 17:38:27.482929   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:38:27.589944   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Name:              mock
Namespace:         namespace-1579023501-17552
Labels:            app=mock
Annotations:       <none>
Selector:          app=mock
Type:              ClusterIP
... skipping 8 lines ...
Name:         mock
Namespace:    namespace-1579023501-17552
Selector:     app=mock
Labels:       app=mock
Annotations:  <none>
Replicas:     1 current / 1 desired
Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=mock
  Containers:
   mock-container:
    Image:        k8s.gcr.io/pause:2.0
    Port:         9949/TCP
... skipping 9 lines ...
replicationcontroller "mock" deleted
service/mock replaced
replicationcontroller/mock replaced
I0114 17:38:27.869987   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579023501-17552", Name:"mock", UID:"023f4fa2-cd09-4070-a7bc-f06cdab07a10", APIVersion:"v1", ResourceVersion:"2859", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-kq6bv
generic-resources.sh:96: Successful get services mock {{.metadata.labels.status}}: replaced
generic-resources.sh:102: Successful get rc mock {{.metadata.labels.status}}: replaced
E0114 17:38:28.234911   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/mock edited
replicationcontroller/mock edited
E0114 17:38:28.341793   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:114: Successful get services mock {{.metadata.labels.status}}: edited
E0114 17:38:28.484177   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:120: Successful get rc mock {{.metadata.labels.status}}: edited
E0114 17:38:28.591436   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/mock labeled
replicationcontroller/mock labeled
generic-resources.sh:134: Successful get services mock {{.metadata.labels.labeled}}: true
generic-resources.sh:140: Successful get rc mock {{.metadata.labels.labeled}}: true
service/mock annotated
replicationcontroller/mock annotated
generic-resources.sh:153: Successful get services mock {{.metadata.annotations.annotated}}: true
generic-resources.sh:159: Successful get rc mock {{.metadata.annotations.annotated}}: true
E0114 17:38:29.237237   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service "mock" deleted
replicationcontroller "mock" deleted
Testing with file hack/testdata/multi-resource-rclist.json and replace with file hack/testdata/multi-resource-rclist-modify.json
E0114 17:38:29.343228   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:63: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 17:38:29.485440   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:64: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 17:38:29.592738   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/mock created
replicationcontroller/mock2 created
I0114 17:38:29.679387   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579023501-17552", Name:"mock", UID:"c8967155-5a76-46ee-86e6-b4d770550fe0", APIVersion:"v1", ResourceVersion:"2881", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-ck29t
I0114 17:38:29.683666   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579023501-17552", Name:"mock2", UID:"7eed92f3-6152-4ab6-b794-de754e86ef29", APIVersion:"v1", ResourceVersion:"2883", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock2-rzwf7
generic-resources.sh:78: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: mock:mock2:
NAME    DESIRED   CURRENT   READY   AGE
... skipping 4 lines ...
Namespace:    namespace-1579023501-17552
Selector:     app=mock
Labels:       app=mock
              status=replaced
Annotations:  <none>
Replicas:     1 current / 1 desired
Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=mock
  Containers:
   mock-container:
    Image:        k8s.gcr.io/pause:2.0
    Port:         9949/TCP
... skipping 11 lines ...
Namespace:    namespace-1579023501-17552
Selector:     app=mock2
Labels:       app=mock2
              status=replaced
Annotations:  <none>
Replicas:     1 current / 1 desired
Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=mock2
  Containers:
   mock-container:
    Image:        k8s.gcr.io/pause:2.0
    Port:         9949/TCP
... skipping 2 lines ...
    Mounts:       <none>
  Volumes:        <none>
Events:
  Type    Reason            Age   From                    Message
  ----    ------            ----  ----                    -------
  Normal  SuccessfulCreate  1s    replication-controller  Created pod: mock2-rzwf7
E0114 17:38:30.238498   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:38:30.344518   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller "mock" deleted
replicationcontroller "mock2" deleted
replicationcontroller/mock replaced
replicationcontroller/mock2 replaced
I0114 17:38:30.389359   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579023501-17552", Name:"mock", UID:"80e34793-a5e8-42a3-a9d3-d5adffc18568", APIVersion:"v1", ResourceVersion:"2899", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-2xlzc
I0114 17:38:30.391518   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579023501-17552", Name:"mock2", UID:"a4b71d24-6de5-4bf9-80e1-c6aa56bab7b9", APIVersion:"v1", ResourceVersion:"2900", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock2-5k5kh
E0114 17:38:30.486550   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:102: Successful get rc mock {{.metadata.labels.status}}: replaced
(BE0114 17:38:30.594024   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:104: Successful get rc mock2 {{.metadata.labels.status}}: replaced
replicationcontroller/mock edited
replicationcontroller/mock2 edited
generic-resources.sh:120: Successful get rc mock {{.metadata.labels.status}}: edited
generic-resources.sh:122: Successful get rc mock2 {{.metadata.labels.status}}: edited
replicationcontroller/mock labeled
replicationcontroller/mock2 labeled
E0114 17:38:31.239640   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:140: Successful get rc mock {{.metadata.labels.labeled}}: true
E0114 17:38:31.345784   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:142: Successful get rc mock2 {{.metadata.labels.labeled}}: true
replicationcontroller/mock annotated
E0114 17:38:31.487857   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/mock2 annotated
E0114 17:38:31.595246   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:159: Successful get rc mock {{.metadata.annotations.annotated}}: true
generic-resources.sh:161: Successful get rc mock2 {{.metadata.annotations.annotated}}: true
replicationcontroller "mock" deleted
replicationcontroller "mock2" deleted
Testing with file hack/testdata/multi-resource-svclist.json and replace with file hack/testdata/multi-resource-svclist-modify.json
generic-resources.sh:63: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: 
generic-resources.sh:64: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
service/mock created
service/mock2 created
E0114 17:38:32.241017   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:38:32.347283   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:70: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: mock:mock2:
NAME    TYPE        CLUSTER-IP   EXTERNAL-IP   PORT(S)   AGE
mock    ClusterIP   10.0.0.78    <none>        99/TCP    0s
mock2   ClusterIP   10.0.0.242   <none>        99/TCP    0s
E0114 17:38:32.489103   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:38:32.596514   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Name:              mock
Namespace:         namespace-1579023501-17552
Labels:            app=mock
Annotations:       <none>
Selector:          app=mock
Type:              ClusterIP
... skipping 20 lines ...
service "mock" deleted
service "mock2" deleted
service/mock replaced
service/mock2 replaced
generic-resources.sh:96: Successful get services mock {{.metadata.labels.status}}: replaced
generic-resources.sh:98: Successful get services mock2 {{.metadata.labels.status}}: replaced
E0114 17:38:33.242313   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:38:33.348836   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/mock edited
service/mock2 edited
generic-resources.sh:114: Successful get services mock {{.metadata.labels.status}}: edited
E0114 17:38:33.490871   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:116: Successful get services mock2 {{.metadata.labels.status}}: edited
E0114 17:38:33.597857   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/mock labeled
service/mock2 labeled
generic-resources.sh:134: Successful get services mock {{.metadata.labels.labeled}}: true
generic-resources.sh:136: Successful get services mock2 {{.metadata.labels.labeled}}: true
service/mock annotated
service/mock2 annotated
generic-resources.sh:153: Successful get services mock {{.metadata.annotations.annotated}}: true
generic-resources.sh:155: Successful get services mock2 {{.metadata.annotations.annotated}}: true
E0114 17:38:34.243638   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service "mock" deleted
service "mock2" deleted
E0114 17:38:34.350137   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:173: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 17:38:34.492294   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:174: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 17:38:34.599229   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/mock created
replicationcontroller/mock created
I0114 17:38:34.971742   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579023501-17552", Name:"mock", UID:"4f110ccb-7ffa-4634-b933-82cb32d2366e", APIVersion:"v1", ResourceVersion:"2963", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-9w5qn
generic-resources.sh:180: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: mock:
generic-resources.sh:181: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: mock:
E0114 17:38:35.244944   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:38:35.351747   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service "mock" deleted
replicationcontroller "mock" deleted
E0114 17:38:35.493663   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:187: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 17:38:35.601404   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:188: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
+++ exit code: 0
Recording: run_persistent_volumes_tests
Running command: run_persistent_volumes_tests

+++ Running case: test-cmd.run_persistent_volumes_tests 
... skipping 2 lines ...
+++ [0114 17:38:35] Creating namespace namespace-1579023515-10948
namespace/namespace-1579023515-10948 created
Context "test" modified.
+++ [0114 17:38:35] Testing persistent volumes
storage.sh:30: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: 
persistentvolume/pv0001 created
E0114 17:38:36.187289   54630 pv_protection_controller.go:116] PV pv0001 failed with : Operation cannot be fulfilled on persistentvolumes "pv0001": the object has been modified; please apply your changes to the latest version and try again
E0114 17:38:36.246266   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
storage.sh:33: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: pv0001:
E0114 17:38:36.353084   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
persistentvolume "pv0001" deleted
E0114 17:38:36.495082   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
persistentvolume/pv0002 created
E0114 17:38:36.602808   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
storage.sh:36: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: pv0002:
persistentvolume "pv0002" deleted
persistentvolume/pv0003 created
storage.sh:39: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: pv0003:
persistentvolume "pv0003" deleted
E0114 17:38:37.247461   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
storage.sh:42: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 17:38:37.354376   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
persistentvolume/pv0001 created
E0114 17:38:37.496038   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:38:37.496785   54630 pv_protection_controller.go:116] PV pv0001 failed with : Operation cannot be fulfilled on persistentvolumes "pv0001": the object has been modified; please apply your changes to the latest version and try again
E0114 17:38:37.604047   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
storage.sh:45: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: pv0001:
Successful
message:warning: deleting cluster-scoped resources, not scoped to the provided namespace
persistentvolume "pv0001" deleted
has:warning: deleting cluster-scoped resources
Successful
... skipping 10 lines ...
+++ command: run_persistent_volume_claims_tests
+++ [0114 17:38:37] Creating namespace namespace-1579023517-2079
namespace/namespace-1579023517-2079 created
Context "test" modified.
+++ [0114 17:38:38] Testing persistent volumes claims
storage.sh:64: Successful get pvc {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 17:38:38.248517   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:38:38.355750   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
persistentvolumeclaim/myclaim-1 created
I0114 17:38:38.426155   54630 event.go:278] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1579023517-2079", Name:"myclaim-1", UID:"e9e462fa-b8a3-4f11-8646-273a28680f19", APIVersion:"v1", ResourceVersion:"3002", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
I0114 17:38:38.429073   54630 event.go:278] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1579023517-2079", Name:"myclaim-1", UID:"e9e462fa-b8a3-4f11-8646-273a28680f19", APIVersion:"v1", ResourceVersion:"3003", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
E0114 17:38:38.497370   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
storage.sh:67: Successful get pvc {{range.items}}{{.metadata.name}}:{{end}}: myclaim-1:
E0114 17:38:38.605279   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
persistentvolumeclaim "myclaim-1" deleted
I0114 17:38:38.632414   54630 event.go:278] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1579023517-2079", Name:"myclaim-1", UID:"e9e462fa-b8a3-4f11-8646-273a28680f19", APIVersion:"v1", ResourceVersion:"3006", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
persistentvolumeclaim/myclaim-2 created
I0114 17:38:38.828771   54630 event.go:278] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1579023517-2079", Name:"myclaim-2", UID:"b6d14264-f785-470b-b216-8523b8162ed3", APIVersion:"v1", ResourceVersion:"3009", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
I0114 17:38:38.832061   54630 event.go:278] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1579023517-2079", Name:"myclaim-2", UID:"b6d14264-f785-470b-b216-8523b8162ed3", APIVersion:"v1", ResourceVersion:"3010", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
storage.sh:71: Successful get pvc {{range.items}}{{.metadata.name}}:{{end}}: myclaim-2:
persistentvolumeclaim "myclaim-2" deleted
I0114 17:38:39.046686   54630 event.go:278] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1579023517-2079", Name:"myclaim-2", UID:"b6d14264-f785-470b-b216-8523b8162ed3", APIVersion:"v1", ResourceVersion:"3013", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
persistentvolumeclaim/myclaim-3 created
I0114 17:38:39.247223   54630 event.go:278] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1579023517-2079", Name:"myclaim-3", UID:"8eff7e65-95dd-46be-a690-5f443d07ef1f", APIVersion:"v1", ResourceVersion:"3016", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
E0114 17:38:39.249936   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0114 17:38:39.252219   54630 event.go:278] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1579023517-2079", Name:"myclaim-3", UID:"8eff7e65-95dd-46be-a690-5f443d07ef1f", APIVersion:"v1", ResourceVersion:"3018", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
E0114 17:38:39.356935   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
storage.sh:75: Successful get pvc {{range.items}}{{.metadata.name}}:{{end}}: myclaim-3:
persistentvolumeclaim "myclaim-3" deleted
I0114 17:38:39.487086   54630 event.go:278] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1579023517-2079", Name:"myclaim-3", UID:"8eff7e65-95dd-46be-a690-5f443d07ef1f", APIVersion:"v1", ResourceVersion:"3020", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
E0114 17:38:39.507186   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:38:39.606436   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
storage.sh:78: Successful get pvc {{range.items}}{{.metadata.name}}:{{end}}: 
+++ exit code: 0
Recording: run_storage_class_tests
Running command: run_storage_class_tests

+++ Running case: test-cmd.run_storage_class_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_storage_class_tests
+++ [0114 17:38:39] Testing storage class
storage.sh:92: Successful get storageclass {{range.items}}{{.metadata.name}}:{{end}}: 
storageclass.storage.k8s.io/storage-class-name created
storage.sh:108: Successful get storageclass {{range.items}}{{.metadata.name}}:{{end}}: storage-class-name:
E0114 17:38:40.251005   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
storage.sh:109: Successful get sc {{range.items}}{{.metadata.name}}:{{end}}: storage-class-name:
E0114 17:38:40.358503   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
storageclass.storage.k8s.io "storage-class-name" deleted
storage.sh:112: Successful get storageclass {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 17:38:40.511492   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ exit code: 0
Recording: run_nodes_tests
Running command: run_nodes_tests

+++ Running case: test-cmd.run_nodes_tests 
E0114 17:38:40.607805   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_nodes_tests
+++ [0114 17:38:40] Testing kubectl(v1:nodes)
core.sh:1375: Successful get nodes {{range.items}}{{.metadata.name}}:{{end}}: 127.0.0.1:
matched Name:
matched Labels:
... skipping 138 lines ...
  Resource           Requests  Limits
  --------           --------  ------
  cpu                0 (0%)    0 (0%)
  memory             0 (0%)    0 (0%)
  ephemeral-storage  0 (0%)    0 (0%)
E0114 17:38:41.252234   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1383: Successful describe
Name:               127.0.0.1
Roles:              <none>
Labels:             <none>
Annotations:        node.alpha.kubernetes.io/ttl: 0
CreationTimestamp:  Tue, 14 Jan 2020 17:34:10 +0000
... skipping 35 lines ...
  --------           --------  ------
  cpu                0 (0%)    0 (0%)
  memory             0 (0%)    0 (0%)
  ephemeral-storage  0 (0%)    0 (0%)
Events:              <none>
E0114 17:38:41.359821   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
matched Name:
matched Labels:
matched CreationTimestamp:
matched Conditions:
matched Addresses:
matched Capacity:
... skipping 41 lines ...
  Resource           Requests  Limits
  --------           --------  ------
  cpu                0 (0%)    0 (0%)
  memory             0 (0%)    0 (0%)
  ephemeral-storage  0 (0%)    0 (0%)
Events:              <none>
E0114 17:38:41.512903   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful describe
Name:               127.0.0.1
Roles:              <none>
Labels:             <none>
Annotations:        node.alpha.kubernetes.io/ttl: 0
CreationTimestamp:  Tue, 14 Jan 2020 17:34:10 +0000
... skipping 34 lines ...
  Resource           Requests  Limits
  --------           --------  ------
  cpu                0 (0%)    0 (0%)
  memory             0 (0%)    0 (0%)
  ephemeral-storage  0 (0%)    0 (0%)
Events:              <none>
E0114 17:38:41.609205   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful describe
Name:               127.0.0.1
Roles:              <none>
Labels:             <none>
Annotations:        node.alpha.kubernetes.io/ttl: 0
CreationTimestamp:  Tue, 14 Jan 2020 17:34:10 +0000
... skipping 81 lines ...
  cpu                0 (0%)    0 (0%)
  memory             0 (0%)    0 (0%)
  ephemeral-storage  0 (0%)    0 (0%)
Events:              <none>
core.sh:1395: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
node/127.0.0.1 patched
E0114 17:38:42.253682   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1398: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: true
E0114 17:38:42.361847   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node/127.0.0.1 patched
core.sh:1401: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
E0114 17:38:42.514572   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
tokenreview.authentication.k8s.io/<unknown> created
E0114 17:38:42.610639   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
tokenreview.authentication.k8s.io/<unknown> created
+++ exit code: 0
Recording: run_authorization_tests
Running command: run_authorization_tests

+++ Running case: test-cmd.run_authorization_tests 
... skipping 62 lines ...
  "status": {
    "allowed": true,
    "reason": "RBAC: allowed by ClusterRoleBinding \"super-group\" of ClusterRole \"admin\" to Group \"the-group\""
  }
}
+++ exit code: 0
E0114 17:38:43.255324   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:yes
has:yes
E0114 17:38:43.363502   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:yes
has:yes
E0114 17:38:43.515904   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:38:43.612117   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:Warning: the server doesn't have a resource type 'invalid_resource'
yes
has:the server doesn't have a resource type
Successful
message:yes
has:yes
Successful
message:error: --subresource can not be used with NonResourceURL
has:subresource can not be used with NonResourceURL
Successful
Successful
message:yes
0
has:0
E0114 17:38:44.256944   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:38:44.364699   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:0
has:0
Successful
message:yes
has not:Warning
E0114 17:38:44.517422   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:38:44.613498   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:Warning: the server doesn't have a resource type 'foo'
yes
has:Warning: the server doesn't have a resource type 'foo'
Successful
message:Warning: the server doesn't have a resource type 'foo'
... skipping 6 lines ...
message:Warning: resource 'nodes' is not namespace scoped
yes
has:Warning: resource 'nodes' is not namespace scoped
Successful
message:yes
has not:Warning: resource 'nodes' is not namespace scoped
E0114 17:38:45.258520   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
clusterrole.rbac.authorization.k8s.io/testing-CR reconciled
	reconciliation required create
	missing rules added:
		{Verbs:[create delete deletecollection get list patch update watch] APIGroups:[] Resources:[pods] ResourceNames:[] NonResourceURLs:[]}
clusterrolebinding.rbac.authorization.k8s.io/testing-CRB reconciled
	reconciliation required create
... skipping 4 lines ...
	missing subjects added:
		{Kind:Group APIGroup:rbac.authorization.k8s.io Name:system:masters Namespace:}
role.rbac.authorization.k8s.io/testing-R reconciled
	reconciliation required create
	missing rules added:
		{Verbs:[get list watch] APIGroups:[] Resources:[configmaps] ResourceNames:[] NonResourceURLs:[]}
E0114 17:38:45.366393   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
legacy-script.sh:821: Successful get rolebindings -n some-other-random -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}: testing-RB:
E0114 17:38:45.518686   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
legacy-script.sh:822: Successful get roles -n some-other-random -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}: testing-R:
E0114 17:38:45.615061   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
legacy-script.sh:823: Successful get clusterrolebindings -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}: testing-CRB:
legacy-script.sh:824: Successful get clusterroles -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}: testing-CR:
Successful
message:error: only rbac.authorization.k8s.io/v1 is supported: not *v1beta1.ClusterRole
has:only rbac.authorization.k8s.io/v1 is supported
rolebinding.rbac.authorization.k8s.io "testing-RB" deleted
role.rbac.authorization.k8s.io "testing-R" deleted
warning: deleting cluster-scoped resources, not scoped to the provided namespace
clusterrole.rbac.authorization.k8s.io "testing-CR" deleted
clusterrolebinding.rbac.authorization.k8s.io "testing-CRB" deleted
... skipping 2 lines ...

+++ Running case: test-cmd.run_retrieve_multiple_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_retrieve_multiple_tests
Context "test" modified.
+++ [0114 17:38:46] Testing kubectl(v1:multiget)
E0114 17:38:46.260297   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
get.sh:242: Successful get nodes/127.0.0.1 service/kubernetes {{range.items}}{{.metadata.name}}:{{end}}: 127.0.0.1:kubernetes:
+++ exit code: 0
E0114 17:38:46.367772   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Recording: run_resource_aliasing_tests
Running command: run_resource_aliasing_tests

+++ Running case: test-cmd.run_resource_aliasing_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_resource_aliasing_tests
+++ [0114 17:38:46] Creating namespace namespace-1579023526-29860
E0114 17:38:46.520219   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/namespace-1579023526-29860 created
Context "test" modified.
E0114 17:38:46.616519   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ [0114 17:38:46] Testing resource aliasing
replicationcontroller/cassandra created
I0114 17:38:46.849928   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579023526-29860", Name:"cassandra", UID:"ca94a785-6b0c-4062-b10e-78f205055309", APIVersion:"v1", ResourceVersion:"3050", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: cassandra-4zbbw
I0114 17:38:46.859600   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579023526-29860", Name:"cassandra", UID:"ca94a785-6b0c-4062-b10e-78f205055309", APIVersion:"v1", ResourceVersion:"3050", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: cassandra-nj4z4
service/cassandra created
Waiting for Get all -l'app=cassandra' {{range.items}}{{range .metadata.labels}}{{.}}:{{end}}{{end}} : expected: cassandra:cassandra:cassandra:cassandra::, got: cassandra:cassandra:cassandra:cassandra:

discovery.sh:91: FAIL!
Get all -l'app=cassandra' {{range.items}}{{range .metadata.labels}}{{.}}:{{end}}{{end}}
  Expected: cassandra:cassandra:cassandra:cassandra::
  Got:      cassandra:cassandra:cassandra:cassandra:
55 /home/prow/go/src/k8s.io/kubernetes/hack/lib/test.sh
E0114 17:38:47.261475   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
discovery.sh:92: Successful get all -l'app=cassandra' {{range.items}}{{range .metadata.labels}}{{.}}:{{end}}{{end}}: cassandra:cassandra:cassandra:cassandra:
E0114 17:38:47.369211   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
pod "cassandra-4zbbw" deleted
I0114 17:38:47.459194   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579023526-29860", Name:"cassandra", UID:"ca94a785-6b0c-4062-b10e-78f205055309", APIVersion:"v1", ResourceVersion:"3056", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: cassandra-lfkh5
pod "cassandra-nj4z4" deleted
I0114 17:38:47.469062   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579023526-29860", Name:"cassandra", UID:"ca94a785-6b0c-4062-b10e-78f205055309", APIVersion:"v1", ResourceVersion:"3065", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: cassandra-lvs8b
replicationcontroller "cassandra" deleted
service "cassandra" deleted
E0114 17:38:47.521476   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ exit code: 0
Recording: run_kubectl_explain_tests
Running command: run_kubectl_explain_tests

+++ Running case: test-cmd.run_kubectl_explain_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_kubectl_explain_tests
+++ [0114 17:38:47] Testing kubectl(v1:explain)
E0114 17:38:47.617827   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
KIND:     Pod
VERSION:  v1

DESCRIPTION:
     Pod is a collection of containers that can run on a host. This resource is
     created by clients and scheduled onto hosts.
... skipping 62 lines ...

FIELD:    message <string>

DESCRIPTION:
     A human readable message indicating details about why the pod is in this
     condition.
E0114 17:38:48.262914   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
KIND:     CronJob
VERSION:  batch/v1beta1

DESCRIPTION:
     CronJob represents the configuration of a single cron job.

... skipping 20 lines ...
     https://git.k8s.io/community/contributors/devel/sig-architecture/api-conventions.md#spec-and-status

   status	<Object>
     Current status of a cron job. More info:
     https://git.k8s.io/community/contributors/devel/sig-architecture/api-conventions.md#spec-and-status

E0114 17:38:48.370793   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ exit code: 0
Recording: run_swagger_tests
Running command: run_swagger_tests

+++ Running case: test-cmd.run_swagger_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_swagger_tests
+++ [0114 17:38:48] Testing swagger
E0114 17:38:48.522839   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ exit code: 0
Recording: run_kubectl_sort_by_tests
Running command: run_kubectl_sort_by_tests
E0114 17:38:48.619134   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource

+++ Running case: test-cmd.run_kubectl_sort_by_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_kubectl_sort_by_tests
+++ [0114 17:38:48] Testing kubectl --sort-by
get.sh:256: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
No resources found in namespace-1579023526-29860 namespace.
No resources found in namespace-1579023526-29860 namespace.
get.sh:264: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
pod/valid-pod created
E0114 17:38:49.264155   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
get.sh:268: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
E0114 17:38:49.372050   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:NAME        READY   STATUS    RESTARTS   AGE
valid-pod   0/1     Pending   0          0s
has:valid-pod
Successful
message:I0114 17:38:49.497416   86623 loader.go:375] Config loaded from file:  /tmp/tmp.VFlSiaN67c/.kube/config
... skipping 7 lines ...
I0114 17:38:49.511056   86623 round_trippers.go:452]     Cache-Control: no-cache, private
I0114 17:38:49.511070   86623 round_trippers.go:452]     Content-Type: application/json
I0114 17:38:49.511199   86623 request.go:1022] Response Body: {"kind":"Table","apiVersion":"meta.k8s.io/v1","metadata":{"selfLink":"/api/v1/namespaces/namespace-1579023526-29860/pods","resourceVersion":"3079"},"columnDefinitions":[{"name":"Name","type":"string","format":"name","description":"Name must be unique within a namespace. Is required when creating resources, although some resources may allow a client to request the generation of an appropriate name automatically. Name is primarily intended for creation idempotence and configuration definition. Cannot be updated. More info: http://kubernetes.io/docs/user-guide/identifiers#names","priority":0},{"name":"Ready","type":"string","format":"","description":"The aggregate readiness state of this pod for accepting traffic.","priority":0},{"name":"Status","type":"string","format":"","description":"The aggregate status of the containers in this pod.","priority":0},{"name":"Restarts","type":"integer","format":"","description":"The number of times the containers in this pod have been restarted.","priority":0},{"name":"Age"," [truncated 3569 chars]
NAME        READY   STATUS    RESTARTS   AGE
valid-pod   0/1     Pending   0          0s
has:as=Table
E0114 17:38:49.524234   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:I0114 17:38:49.497416   86623 loader.go:375] Config loaded from file:  /tmp/tmp.VFlSiaN67c/.kube/config
I0114 17:38:49.507922   86623 round_trippers.go:420] GET http://localhost:8080/api/v1/namespaces/namespace-1579023526-29860/pods?includeObject=Object
I0114 17:38:49.507952   86623 round_trippers.go:427] Request Headers:
I0114 17:38:49.507960   86623 round_trippers.go:431]     Accept: application/json;as=Table;v=v1;g=meta.k8s.io,application/json;as=Table;v=v1beta1;g=meta.k8s.io,application/json
I0114 17:38:49.507966   86623 round_trippers.go:431]     User-Agent: kubectl/v1.18.0 (linux/amd64) kubernetes/13ea65e
... skipping 3 lines ...
I0114 17:38:49.511056   86623 round_trippers.go:452]     Cache-Control: no-cache, private
I0114 17:38:49.511070   86623 round_trippers.go:452]     Content-Type: application/json
I0114 17:38:49.511199   86623 request.go:1022] Response Body: {"kind":"Table","apiVersion":"meta.k8s.io/v1","metadata":{"selfLink":"/api/v1/namespaces/namespace-1579023526-29860/pods","resourceVersion":"3079"},"columnDefinitions":[{"name":"Name","type":"string","format":"name","description":"Name must be unique within a namespace. Is required when creating resources, although some resources may allow a client to request the generation of an appropriate name automatically. Name is primarily intended for creation idempotence and configuration definition. Cannot be updated. More info: http://kubernetes.io/docs/user-guide/identifiers#names","priority":0},{"name":"Ready","type":"string","format":"","description":"The aggregate readiness state of this pod for accepting traffic.","priority":0},{"name":"Status","type":"string","format":"","description":"The aggregate status of the containers in this pod.","priority":0},{"name":"Restarts","type":"integer","format":"","description":"The number of times the containers in this pod have been restarted.","priority":0},{"name":"Age"," [truncated 3569 chars]
NAME        READY   STATUS    RESTARTS   AGE
valid-pod   0/1     Pending   0          0s
has:includeObject=Object
E0114 17:38:49.620179   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
get.sh:279: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
pod "valid-pod" force deleted
get.sh:283: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
get.sh:288: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
pod/sorted-pod1 created
E0114 17:38:50.265335   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
get.sh:292: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: sorted-pod1:
E0114 17:38:50.373319   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
pod/sorted-pod2 created
E0114 17:38:50.525594   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:38:50.621488   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
get.sh:296: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: sorted-pod1:sorted-pod2:
pod/sorted-pod3 created
get.sh:300: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: sorted-pod1:sorted-pod2:sorted-pod3:
Successful
message:sorted-pod1:sorted-pod2:sorted-pod3:
has:sorted-pod1:sorted-pod2:sorted-pod3:
Successful
message:sorted-pod3:sorted-pod2:sorted-pod1:
has:sorted-pod3:sorted-pod2:sorted-pod1:
Successful
message:sorted-pod2:sorted-pod1:sorted-pod3:
has:sorted-pod2:sorted-pod1:sorted-pod3:
E0114 17:38:51.266610   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:sorted-pod1:sorted-pod2:sorted-pod3:
has:sorted-pod1:sorted-pod2:sorted-pod3:
E0114 17:38:51.375132   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:I0114:I0114:I0114:I0114:I0114:I0114:I0114:I0114:I0114:I0114:NAME:sorted-pod2:sorted-pod1:sorted-pod3:
has:sorted-pod2:sorted-pod1:sorted-pod3:
Successful
message:I0114 17:38:51.368998   86886 loader.go:375] Config loaded from file:  /tmp/tmp.VFlSiaN67c/.kube/config
I0114 17:38:51.379650   86886 round_trippers.go:420] GET http://localhost:8080/api/v1/namespaces/namespace-1579023526-29860/pods
... skipping 9 lines ...
NAME          AGE
sorted-pod2   1s
sorted-pod1   1s
sorted-pod3   1s
has not:Table
get.sh:325: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: sorted-pod1:sorted-pod2:sorted-pod3:
E0114 17:38:51.526812   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
pod "sorted-pod1" force deleted
pod "sorted-pod2" force deleted
pod "sorted-pod3" force deleted
E0114 17:38:51.622810   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
get.sh:329: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
+++ exit code: 0
Recording: run_kubectl_all_namespace_tests
Running command: run_kubectl_all_namespace_tests

+++ Running case: test-cmd.run_kubectl_all_namespace_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_kubectl_all_namespace_tests
+++ [0114 17:38:51] Testing kubectl --all-namespace
get.sh:342: Successful get namespaces {{range.items}}{{if eq .metadata.name \"default\"}}{{.metadata.name}}:{{end}}{{end}}: default:
get.sh:346: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
pod/valid-pod created
E0114 17:38:52.267914   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
get.sh:350: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
E0114 17:38:52.376305   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
NAMESPACE                    NAME        READY   STATUS    RESTARTS   AGE
namespace-1579023526-29860   valid-pod   0/1     Pending   0          0s
namespace/all-ns-test-1 created
E0114 17:38:52.527801   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
serviceaccount/test created
E0114 17:38:52.624050   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/all-ns-test-2 created
serviceaccount/test created
Successful
message:NAMESPACE                    NAME      SECRETS   AGE
all-ns-test-1                default   0         0s
all-ns-test-1                test      0         0s
... skipping 115 lines ...
namespace-1579023515-10948   default   0         17s
namespace-1579023517-2079    default   0         14s
namespace-1579023526-29860   default   0         6s
some-other-random            default   0         7s
has:all-ns-test-2
namespace "all-ns-test-1" deleted
E0114 17:38:53.269182   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
... skipping 20 lines ...
namespace "all-ns-test-2" deleted
E0114 17:38:58.383939   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
... skipping 18 lines ...
I0114 17:39:03.183411   54630 namespace_controller.go:185] Namespace has been deleted all-ns-test-1
E0114 17:39:03.283051   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:39:03.389742   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
get.sh:376: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
E0114 17:39:03.542442   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
pod "valid-pod" force deleted
E0114 17:39:03.638739   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
get.sh:380: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
get.sh:384: Successful get nodes {{range.items}}{{.metadata.name}}:{{end}}: 127.0.0.1:
Successful
message:NAME        STATUS     ROLES    AGE     VERSION
127.0.0.1   NotReady   <none>   4m53s   
has not:NAMESPACE
... skipping 5 lines ...
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_template_output_tests
+++ [0114 17:39:04] Testing --template support on commands
+++ [0114 17:39:04] Creating namespace namespace-1579023544-23478
namespace/namespace-1579023544-23478 created
Context "test" modified.
E0114 17:39:04.284423   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
template-output.sh:29: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 17:39:04.390995   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
pod/valid-pod created
E0114 17:39:04.543699   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
{
    "apiVersion": "v1",
    "items": [
        {
            "apiVersion": "v1",
            "kind": "Pod",
... skipping 45 lines ...
    "kind": "List",
    "metadata": {
        "resourceVersion": "",
        "selfLink": ""
    }
}
E0114 17:39:04.640090   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
template-output.sh:35: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
Successful
message:valid-pod:
has:valid-pod:
Successful
message:valid-pod:
has:valid-pod:
Successful
message:valid-pod:
has:valid-pod:
Successful
message:valid-pod:
has:valid-pod:
E0114 17:39:05.285780   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 17:39:05.392466   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:valid-pod:
has:valid-pod:
Successful
message:scale-1:
has:scale-1:
E0114 17:39:05.544872   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:redis-slave:
has:redis-slave:
E0114 17:39:05.641403   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
kubectl convert is DEPRECATED and will be removed in a future version.
In order to convert, kubectl apply the object to the cluster, then kubectl get at the desired version.
Successful
message:nginx:
has:nginx:
kubectl run --generator=job/v1 is DEPRECATED and will be removed in a future version. Use kubectl run --generator=run-pod/v1 or kubectl create instead.
... skipping 7 lines ...
replicationcontroller/cassandra created
I0114 17:39:06.155451   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579023544-23478", Name:"cassandra", UID:"fd9bf6ee-3266-4eab-8b9e-356614be5452", APIVersion:"v1", ResourceVersion:"3133", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: cassandra-kqxp9
I0114 17:39:06.159185   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579023544-23478", Name:"cassandra", UID:"fd9bf6ee-3266-4eab-8b9e-356614be5452", APIVersion:"v1", ResourceVersion:"3133", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: cassandra-726jt
Successful
message:cassandra:
has:cassandra:
E0114 17:39:06.286984   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
	reconciliation required create
	missing rules added:
		{Verbs:[create delete deletecollection get list patch update watch] APIGroups:[] Resources:[pods] ResourceNames:[] NonResourceURLs:[]}
	reconciliation required create
	missing subjects added:
		{Kind:Group APIGroup:rbac.authorization.k8s.io Name:system:masters Namespace:}
... skipping 3 lines ...
	reconciliation required create
	missing rules added:
		{Verbs:[get list watch] APIGroups:[] Resources:[configmaps] ResourceNames:[] NonResourceURLs:[]}
Successful
message:testing-CR:testing-CRB:testing-RB:testing-R:
has:testing-CR:testing-CRB:testing-RB:testing-R:
E0114 17:39:06.393762   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:myclusterrole:
has:myclusterrole:
E0114 17:39:06.546021   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:foo:
has:foo:
E0114 17:39:06.642615   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:cm:
has:cm:
I0114 17:39:06.739704   54630 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579023544-23478", Name:"deploy", UID:"b4e9bc7b-d674-419d-ab79-7e1d917f6016", APIVersion:"apps/v1", ResourceVersion:"3144", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set deploy-74bcc58696 to 1
I0114 17:39:06.743429   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023544-23478", Name:"deploy-74bcc58696", UID:"a4e69122-946e-4f9e-8496-5413e365c2d5", APIVersion:"apps/v1", ResourceVersion:"3145", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: deploy-74bcc58696-vv54w
Successful
... skipping 6 lines ...
Successful
message:bar:
has:bar:
Successful
message:foo:
has:foo:
E0114 17:39:07.288291   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:myrole:
has:myrole:
E0114 17:39:07.394889   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:foo:
has:foo:
Successful
message:foo:
has:foo:
E0114 17:39:07.547378   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:foo:
has:foo:
E0114 17:39:07.643929   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:foo:
has:foo:
Successful
message:valid-pod:
has:valid-pod:
... skipping 9 lines ...
Successful
message:valid-pod:
has:valid-pod:
Successful
message:foo:
has:foo:
E0114 17:39:08.289586   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:foo:
has:foo:
I0114 17:39:08.386513   54630 namespace_controller.go:185] Namespace has been deleted all-ns-test-2
E0114 17:39:08.395985   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:foo:
has:foo:
Successful
message:foo:
has:foo:
E0114 17:39:08.548593   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:foo:
has:foo:
E0114 17:39:08.644904   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:foo:
has:foo:
Successful
message:foo:
has:foo:
... skipping 24 lines ...
Successful
message:deploy:
has:deploy:
Successful
message:deploy:
has:deploy:
E0114 17:39:09.291134   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:deploy:
has:deploy:
E0114 17:39:09.397216   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:Config:
has:Config
Successful
message:apiVersion: v1
kind: ConfigMap
metadata:
  creationTimestamp: null
  name: cm
has:kind: ConfigMap
E0114 17:39:09.549972   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
cronjob.batch "pi" deleted
E0114 17:39:09.646109   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
pod "cassandra-726jt" deleted
I0114 17:39:09.735312   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579023544-23478", Name:"cassandra", UID:"fd9bf6ee-3266-4eab-8b9e-356614be5452", APIVersion:"v1", ResourceVersion:"3139", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: cassandra-jm87b
pod "cassandra-kqxp9" deleted
I0114 17:39:09.745117   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579023544-23478", Name:"cassandra", UID:"fd9bf6ee-3266-4eab-8b9e-356614be5452", APIVersion:"v1", ResourceVersion:"3139", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: cassandra-j42kb
I0114 17:39:09.751858   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023544-23478", Name:"deploy-74bcc58696", UID:"a4e69122-946e-4f9e-8496-5413e365c2d5", APIVersion:"apps/v1", ResourceVersion:"3152", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: deploy-74bcc58696-lccz2
pod "deploy-74bcc58696-vv54w" deleted
... skipping 4 lines ...
deployment.apps "deploy" deleted
+++ exit code: 0
Recording: run_certificates_tests
Running command: run_certificates_tests

+++ Running case: test-cmd.run_certificates_tests 
E0114 17:39:10.292368   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_certificates_tests
+++ [0114 17:39:10] Testing certificates
E0114 17:39:10.398466   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificatesigningrequest.certificates.k8s.io/foo created
E0114 17:39:10.551234   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificate.sh:29: Successful get csr/foo {{range.status.conditions}}{{.type}}{{end}}: 
E0114 17:39:10.647332   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificatesigningrequest.certificates.k8s.io/foo approved
{
    "apiVersion": "v1",
    "items": [
        {
            "apiVersion": "certificates.k8s.io/v1beta1",
... skipping 53 lines ...
        "selfLink": ""
    }
}
certificate.sh:32: Successful get csr/foo {{range.status.conditions}}{{.type}}{{end}}: Approved
certificatesigningrequest.certificates.k8s.io "foo" deleted
certificate.sh:34: Successful get csr {{range.items}}{{.metadata.name}}{{end}}: 
E0114 17:39:11.293729   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificatesigningrequest.certificates.k8s.io/foo created
E0114 17:39:11.399826   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificate.sh:37: Successful get csr/foo {{range.status.conditions}}{{.type}}{{end}}: 
certificatesigningrequest.certificates.k8s.io/foo approved
E0114 17:39:11.552508   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
{
    "apiVersion": "v1",
    "items": [
        {
            "apiVersion": "certificates.k8s.io/v1beta1",
            "kind": "CertificateSigningRequest",
... skipping 31 lines ...
    "kind": "List",
    "metadata": {
        "resourceVersion": "",
        "selfLink": ""
    }
}
E0114 17:39:11.648533   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificate.sh:40: Successful get csr/foo {{range.status.conditions}}{{.type}}{{end}}: Approved
certificatesigningrequest.certificates.k8s.io "foo" deleted
certificate.sh:42: Successful get csr {{range.items}}{{.metadata.name}}{{end}}: 
certificatesigningrequest.certificates.k8s.io/foo created
certificate.sh:46: Successful get csr/foo {{range.status.conditions}}{{.type}}{{end}}: 
E0114 17:39:12.295186   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificatesigningrequest.certificates.k8s.io/foo denied
E0114 17:39:12.401162   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
{
    "apiVersion": "v1",
    "items": [
        {
            "apiVersion": "certificates.k8s.io/v1beta1",
            "kind": "CertificateSigningRequest",
... skipping 31 lines ...
    "kind": "List",
    "metadata": {
        "resourceVersion": "",
        "selfLink": ""
    }
}
E0114 17:39:12.553873   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificate.sh:49: Successful get csr/foo {{range.status.conditions}}{{.type}}{{end}}: Denied
certificatesigningrequest.certificates.k8s.io "foo" deleted
E0114 17:39:12.649628   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificate.sh:51: Successful get csr {{range.items}}{{.metadata.name}}{{end}}: 
certificatesigningrequest.certificates.k8s.io/foo created
certificate.sh:54: Successful get csr/foo {{range.status.conditions}}{{.type}}{{end}}: 
certificatesigningrequest.certificates.k8s.io/foo denied
{
    "apiVersion": "v1",
... skipping 53 lines ...
    "kind": "List",
    "metadata": {
        "resourceVersion": "",
        "selfLink": ""
    }
}
E0114 17:39:13.296468   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificate.sh:57: Successful get csr/foo {{range.status.conditions}}{{.type}}{{end}}: Denied
E0114 17:39:13.402706   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificatesigningrequest.certificates.k8s.io "foo" deleted
E0114 17:39:13.555237   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificate.sh:59: Successful get csr {{range.items}}{{.metadata.name}}{{end}}: 
+++ exit code: 0
Recording: run_cluster_management_tests
Running command: run_cluster_management_tests
E0114 17:39:13.650929   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource

+++ Running case: test-cmd.run_cluster_management_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_cluster_management_tests
+++ [0114 17:39:13] Testing cluster-management commands
node-management.sh:27: Successful get nodes {{range.items}}{{.metadata.name}}:{{end}}: 127.0.0.1:
pod/test-pod-1 created
pod/test-pod-2 created
E0114 17:39:14.297688   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node-management.sh:76: Successful get nodes 127.0.0.1 {{range .spec.taints}}{{if eq .key \"dedicated\"}}{{.key}}={{.value}}:{{.effect}}{{end}}{{end}}: 
node/127.0.0.1 tainted
E0114 17:39:14.403984   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node-management.sh:79: Successful get nodes 127.0.0.1 {{range .spec.taints}}{{if eq .key \"dedicated\"}}{{.key}}={{.value}}:{{.effect}}{{end}}{{end}}: dedicated=foo:PreferNoSchedule
E0114 17:39:14.556317   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node/127.0.0.1 untainted
E0114 17:39:14.652096   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node-management.sh:83: Successful get nodes 127.0.0.1 {{range .spec.taints}}{{if eq .key \"dedicated\"}}{{.key}}={{.value}}:{{.effect}}{{end}}{{end}}: 
node-management.sh:87: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
node/127.0.0.1 cordoned (dry run)
node-management.sh:89: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
node-management.sh:93: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
node/127.0.0.1 cordoned (dry run)
node/127.0.0.1 drained (dry run)
E0114 17:39:15.299022   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node-management.sh:96: Successful get nodes {{range.items}}{{.metadata.name}}:{{end}}: 127.0.0.1:
node-management.sh:97: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
E0114 17:39:15.405128   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node-management.sh:101: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
E0114 17:39:15.557806   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node-management.sh:103: Successful get pods {{range .items}}{{.metadata.name}},{{end}}: test-pod-1,test-pod-2,
E0114 17:39:15.653353   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node/127.0.0.1 cordoned
node/127.0.0.1 drained
node-management.sh:106: Successful get pods/test-pod-2 {{.metadata.name}}: test-pod-2
pod "test-pod-2" deleted
node/127.0.0.1 uncordoned
node-management.sh:111: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
node-management.sh:115: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
E0114 17:39:16.300315   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:node/127.0.0.1 already uncordoned (dry run)
has:already uncordoned
E0114 17:39:16.406729   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node-management.sh:119: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
node/127.0.0.1 labeled
E0114 17:39:16.559046   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node-management.sh:124: Successful get nodes 127.0.0.1 {{.metadata.labels.test}}: label
E0114 17:39:16.654769   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:error: cannot specify both a node name and a --selector option
See 'kubectl drain -h' for help and examples
has:cannot specify both a node name
Successful
message:error: USAGE: cordon NODE [flags]
See 'kubectl cordon -h' for help and examples
has:error\: USAGE\: cordon NODE
node/127.0.0.1 already uncordoned
Successful
message:error: You must provide one or more resources by argument or filename.
Example resource specifications include:
   '-f rsrc.yaml'
   '--filename=rsrc.json'
   '<resource> <name>'
   '<resource>'
has:must provide one or more resources
... skipping 2 lines ...
has:node/127.0.0.1 cordoned
Successful
message:
has not:cordoned
node-management.sh:145: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: true
+++ exit code: 0
E0114 17:39:17.301565   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Recording: run_plugins_tests
Running command: run_plugins_tests

+++ Running case: test-cmd.run_plugins_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_plugins_tests
+++ [0114 17:39:17] Testing kubectl plugins
E0114 17:39:17.408001   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:The following compatible plugins are available:

test/fixtures/pkg/kubectl/plugins/version/kubectl-version
  - warning: kubectl-version overwrites existing command: "kubectl version"

error: one plugin warning was found
has:kubectl-version overwrites existing command: "kubectl version"
E0114 17:39:17.560406   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:The following compatible plugins are available:

test/fixtures/pkg/kubectl/plugins/kubectl-foo
test/fixtures/pkg/kubectl/plugins/foo/kubectl-foo
  - warning: test/fixtures/pkg/kubectl/plugins/foo/kubectl-foo is overshadowed by a similarly named plugin: test/fixtures/pkg/kubectl/plugins/kubectl-foo

error: one plugin warning was found
has:test/fixtures/pkg/kubectl/plugins/foo/kubectl-foo is overshadowed by a similarly named plugin
Successful
message:The following compatible plugins are available:

test/fixtures/pkg/kubectl/plugins/kubectl-foo
has:plugins are available
E0114 17:39:17.656073   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:Unable read directory "test/fixtures/pkg/kubectl/plugins/empty" from your PATH: open test/fixtures/pkg/kubectl/plugins/empty: no such file or directory. Skipping...
error: unable to find any kubectl plugins in your PATH
has:unable to find any kubectl plugins in your PATH
Successful
message:I am plugin foo
has:plugin foo
Successful
message:I am plugin bar called with args test/fixtures/pkg/kubectl/plugins/bar/kubectl-bar arg1
... skipping 10 lines ...

+++ Running case: test-cmd.run_impersonation_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_impersonation_tests
+++ [0114 17:39:18] Testing impersonation
Successful
message:error: requesting groups or user-extra for  without impersonating a user
has:without impersonating a user
E0114 17:39:18.302990   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificatesigningrequest.certificates.k8s.io/foo created
E0114 17:39:18.409272   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
authorization.sh:68: Successful get csr/foo {{.spec.username}}: user1
E0114 17:39:18.561735   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
authorization.sh:69: Successful get csr/foo {{range .spec.groups}}{{.}}{{end}}: system:authenticated
E0114 17:39:18.657337   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificatesigningrequest.certificates.k8s.io "foo" deleted
certificatesigningrequest.certificates.k8s.io/foo created
authorization.sh:74: Successful get csr/foo {{len .spec.groups}}: 3
authorization.sh:75: Successful get csr/foo {{range .spec.groups}}{{.}} {{end}}: group2 group1 ,,,chameleon 
certificatesigningrequest.certificates.k8s.io "foo" deleted
+++ exit code: 0
E0114 17:39:19.304274   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Recording: run_wait_tests
Running command: run_wait_tests

+++ Running case: test-cmd.run_wait_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_wait_tests
+++ [0114 17:39:19] Testing kubectl wait
+++ [0114 17:39:19] Creating namespace namespace-1579023559-8803
E0114 17:39:19.410652   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/namespace-1579023559-8803 created
Context "test" modified.
E0114 17:39:19.563068   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/test-1 created
I0114 17:39:19.634883   54630 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579023559-8803", Name:"test-1", UID:"7e768c4b-48a0-47f7-ba95-54a17e858a8e", APIVersion:"apps/v1", ResourceVersion:"3237", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set test-1-6d98955cc9 to 1
I0114 17:39:19.643362   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023559-8803", Name:"test-1-6d98955cc9", UID:"255dc15d-9c36-47de-bea5-362b10790d33", APIVersion:"apps/v1", ResourceVersion:"3238", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test-1-6d98955cc9-lbcx7
E0114 17:39:19.658561   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/test-2 created
I0114 17:39:19.742903   54630 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579023559-8803", Name:"test-2", UID:"a73a9abd-9807-43e3-93dd-97ce08489e76", APIVersion:"apps/v1", ResourceVersion:"3247", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set test-2-65897ff84d to 1
I0114 17:39:19.746547   54630 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579023559-8803", Name:"test-2-65897ff84d", UID:"70588ad6-7819-4cd0-8fc6-18a015c1a5b9", APIVersion:"apps/v1", ResourceVersion:"3248", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test-2-65897ff84d-tqgnp
wait.sh:36: Successful get deployments {{range .items}}{{.metadata.name}},{{end}}: test-1,test-2,
E0114 17:39:20.305856   54630 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
... skipping 7 lines ...
deployment.apps "test-1" deleted
deployment.apps "test-2" deleted
Successful
message:deployment.apps/test-1 condition met
deployment.apps/test-2 condition met
has:test-1 condition met
... skipping 27 lines ...
I0114 17:39:22.285954   51184 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
W0114 17:39:22.286445   51184 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
... skipping 123 lines ...
W0114 17:39:22.289390   51184 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
I0114 17:39:22.289591   51184 secure_serving.go:222] Stopped listening on 127.0.0.1:8080
junit report dir: /logs/artifacts
+++ [0114 17:39:22] Clean up complete
+ make test-integration
W0114 17:39:23.286966   51184 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
[previous message repeated 65 times, 17:39:23.286966 – 17:39:23.290610]
W0114 17:39:24.579360   51184 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
[previous message repeated 51 times, 17:39:24.590290 – 17:39:25.033860]
W0114 17:39:25.048079   51184 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0114 17:39:25.049675   51184 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0114 17:39:25.067955   51184 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0114 17:39:25.085028   51184 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0114 17:39:25.085953   51184 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0114 17:39:25.096859   51184 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0114 17:39:25.112930   51184 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0114 17:39:25.121044   51184 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0114 17:39:25.138637   51184 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0114 17:39:25.147830   51184 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0114 17:39:25.147830   51184 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0114 17:39:25.156515   51184 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0114 17:39:25.175238   51184 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0114 17:39:25.179590   51184 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0114 17:39:26.762725   51184 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0114 17:39:26.801062   51184 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0114 17:39:26.827709   51184 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0114 17:39:26.941651   51184 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0114 17:39:26.961207   51184 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0114 17:39:26.961581   51184 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0114 17:39:26.985500   51184 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://12