Result: FAILURE
Tests: 1 failed / 2610 succeeded
Started: 2020-01-13 10:21
Elapsed: 26m42s
Revision: master
resultstore: https://source.cloud.google.com/results/invocations/b4911dbe-2933-455c-b899-ebd2501576dd/targets/test

Test Failures


k8s.io/kubernetes/test/integration/client TestDynamicClient 7.13s

go test -v k8s.io/kubernetes/test/integration/client -run TestDynamicClient$
=== RUN   TestDynamicClient
I0113 10:38:55.406292  106155 controller.go:123] Shutting down OpenAPI controller
I0113 10:38:55.406340  106155 cluster_authentication_trust_controller.go:463] Shutting down cluster_authentication_trust_controller controller
I0113 10:38:55.406363  106155 autoregister_controller.go:164] Shutting down autoregister controller
I0113 10:38:55.406382  106155 apiapproval_controller.go:196] Shutting down KubernetesAPIApprovalPolicyConformantConditionController
I0113 10:38:55.406398  106155 nonstructuralschema_controller.go:197] Shutting down NonStructuralSchemaConditionController
I0113 10:38:55.406414  106155 establishing_controller.go:85] Shutting down EstablishingController
I0113 10:38:55.406427  106155 naming_controller.go:300] Shutting down NamingConditionController
I0113 10:38:55.406434  106155 dynamic_cafile_content.go:181] Shutting down request-header::/tmp/kubernetes-kube-apiserver037197015/proxy-ca.crt
I0113 10:38:55.406439  106155 customresource_discovery_controller.go:220] Shutting down DiscoveryController
I0113 10:38:55.406452  106155 dynamic_cafile_content.go:181] Shutting down client-ca-bundle::/tmp/kubernetes-kube-apiserver037197015/client-ca.crt
I0113 10:38:55.406464  106155 crd_finalizer.go:276] Shutting down CRDFinalizer
I0113 10:38:55.406458  106155 crdregistration_controller.go:142] Shutting down crd-autoregister controller
I0113 10:38:55.406471  106155 apiservice_controller.go:106] Shutting down APIServiceRegistrationController
I0113 10:38:55.406479  106155 controller.go:87] Shutting down OpenAPI AggregationController
I0113 10:38:55.406468  106155 dynamic_cafile_content.go:181] Shutting down request-header::/tmp/kubernetes-kube-apiserver037197015/proxy-ca.crt
I0113 10:38:55.406496  106155 available_controller.go:398] Shutting down AvailableConditionController
I0113 10:38:55.406581  106155 tlsconfig.go:256] Shutting down DynamicServingCertificateController
I0113 10:38:55.406604  106155 secure_serving.go:222] Stopped listening on 127.0.0.1:42937
I0113 10:38:55.406615  106155 dynamic_cafile_content.go:181] Shutting down client-ca-bundle::/tmp/kubernetes-kube-apiserver037197015/client-ca.crt
I0113 10:38:55.406573  106155 dynamic_serving_content.go:144] Shutting down serving-cert::/tmp/kubernetes-kube-apiserver037197015/apiserver.crt::/tmp/kubernetes-kube-apiserver037197015/apiserver.key
I0113 10:38:56.413533  106155 serving.go:307] Generated self-signed cert (/tmp/kubernetes-kube-apiserver961258570/apiserver.crt, /tmp/kubernetes-kube-apiserver961258570/apiserver.key)
I0113 10:38:56.413558  106155 server.go:596] external host was not specified, using 127.0.0.1
W0113 10:38:56.413569  106155 authentication.go:439] AnonymousAuth is not allowed with the AlwaysAllow authorizer. Resetting AnonymousAuth to false. You should use a different authorizer
W0113 10:38:56.712045  106155 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0113 10:38:56.712083  106155 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0113 10:38:56.712098  106155 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0113 10:38:56.712303  106155 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0113 10:38:56.713101  106155 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0113 10:38:56.713144  106155 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0113 10:38:56.713171  106155 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0113 10:38:56.713197  106155 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0113 10:38:56.713399  106155 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0113 10:38:56.713574  106155 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0113 10:38:56.713621  106155 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0113 10:38:56.713689  106155 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
I0113 10:38:56.713708  106155 plugins.go:158] Loaded 9 mutating admission controller(s) successfully in the following order: NamespaceLifecycle,LimitRanger,TaintNodesByCondition,Priority,DefaultTolerationSeconds,DefaultStorageClass,StorageObjectInUseProtection,MutatingAdmissionWebhook,RuntimeClass.
I0113 10:38:56.713726  106155 plugins.go:161] Loaded 6 validating admission controller(s) successfully in the following order: LimitRanger,Priority,PersistentVolumeClaimResize,ValidatingAdmissionWebhook,RuntimeClass,ResourceQuota.
I0113 10:38:56.714846  106155 plugins.go:158] Loaded 9 mutating admission controller(s) successfully in the following order: NamespaceLifecycle,LimitRanger,TaintNodesByCondition,Priority,DefaultTolerationSeconds,DefaultStorageClass,StorageObjectInUseProtection,MutatingAdmissionWebhook,RuntimeClass.
I0113 10:38:56.714865  106155 plugins.go:161] Loaded 6 validating admission controller(s) successfully in the following order: LimitRanger,Priority,PersistentVolumeClaimResize,ValidatingAdmissionWebhook,RuntimeClass,ResourceQuota.
I0113 10:38:56.716446  106155 client.go:361] parsed scheme: "endpoint"
I0113 10:38:56.716484  106155 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 10:38:56.717581  106155 client.go:361] parsed scheme: "endpoint"
I0113 10:38:56.717640  106155 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
W0113 10:38:56.745103  106155 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
I0113 10:38:56.746065  106155 master.go:264] Using reconciler: lease
I0113 10:38:56.746303  106155 client.go:361] parsed scheme: "endpoint"
I0113 10:38:56.746329  106155 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 10:38:56.748520  106155 client.go:361] parsed scheme: "endpoint"
I0113 10:38:56.748551  106155 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 10:38:56.749569  106155 client.go:361] parsed scheme: "endpoint"
I0113 10:38:56.749607  106155 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 10:38:56.750532  106155 client.go:361] parsed scheme: "endpoint"
I0113 10:38:56.750570  106155 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 10:38:56.751692  106155 client.go:361] parsed scheme: "endpoint"
I0113 10:38:56.751853  106155 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 10:38:56.752866  106155 client.go:361] parsed scheme: "endpoint"
I0113 10:38:56.752892  106155 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 10:38:56.754048  106155 client.go:361] parsed scheme: "endpoint"
I0113 10:38:56.754084  106155 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 10:38:56.755132  106155 client.go:361] parsed scheme: "endpoint"
I0113 10:38:56.755156  106155 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 10:38:56.756741  106155 client.go:361] parsed scheme: "endpoint"
I0113 10:38:56.756778  106155 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 10:38:56.757766  106155 client.go:361] parsed scheme: "endpoint"
I0113 10:38:56.757794  106155 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 10:38:56.759067  106155 client.go:361] parsed scheme: "endpoint"
I0113 10:38:56.759096  106155 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 10:38:56.760157  106155 client.go:361] parsed scheme: "endpoint"
I0113 10:38:56.760206  106155 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 10:38:56.761187  106155 client.go:361] parsed scheme: "endpoint"
I0113 10:38:56.761219  106155 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 10:38:56.762272  106155 client.go:361] parsed scheme: "endpoint"
I0113 10:38:56.762311  106155 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 10:38:56.763196  106155 client.go:361] parsed scheme: "endpoint"
I0113 10:38:56.763228  106155 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 10:38:56.764250  106155 client.go:361] parsed scheme: "endpoint"
I0113 10:38:56.764280  106155 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 10:38:56.765090  106155 client.go:361] parsed scheme: "endpoint"
I0113 10:38:56.765122  106155 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 10:38:56.765974  106155 client.go:361] parsed scheme: "endpoint"
I0113 10:38:56.766001  106155 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 10:38:56.766733  106155 rest.go:113] the default service ipfamily for this cluster is: IPv4
I0113 10:38:56.883476  106155 client.go:361] parsed scheme: "endpoint"
I0113 10:38:56.883550  106155 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 10:38:56.884785  106155 client.go:361] parsed scheme: "endpoint"
I0113 10:38:56.884822  106155 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 10:38:56.886956  106155 client.go:361] parsed scheme: "endpoint"
I0113 10:38:56.886996  106155 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 10:38:56.888100  106155 client.go:361] parsed scheme: "endpoint"
I0113 10:38:56.888139  106155 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 10:38:56.889168  106155 client.go:361] parsed scheme: "endpoint"
I0113 10:38:56.889200  106155 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 10:38:56.890397  106155 client.go:361] parsed scheme: "endpoint"
I0113 10:38:56.890433  106155 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 10:38:56.891548  106155 client.go:361] parsed scheme: "endpoint"
I0113 10:38:56.891576  106155 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 10:38:56.892742  106155 client.go:361] parsed scheme: "endpoint"
I0113 10:38:56.892775  106155 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 10:38:56.893790  106155 client.go:361] parsed scheme: "endpoint"
I0113 10:38:56.893821  106155 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 10:38:56.894763  106155 client.go:361] parsed scheme: "endpoint"
I0113 10:38:56.894802  106155 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 10:38:56.896159  106155 client.go:361] parsed scheme: "endpoint"
I0113 10:38:56.896198  106155 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 10:38:56.897040  106155 client.go:361] parsed scheme: "endpoint"
I0113 10:38:56.897063  106155 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 10:38:56.898030  106155 client.go:361] parsed scheme: "endpoint"
I0113 10:38:56.898065  106155 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 10:38:56.899203  106155 client.go:361] parsed scheme: "endpoint"
I0113 10:38:56.899236  106155 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 10:38:56.900557  106155 client.go:361] parsed scheme: "endpoint"
I0113 10:38:56.900591  106155 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 10:38:56.901504  106155 client.go:361] parsed scheme: "endpoint"
I0113 10:38:56.901537  106155 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 10:38:56.902567  106155 client.go:361] parsed scheme: "endpoint"
I0113 10:38:56.902601  106155 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 10:38:56.903521  106155 client.go:361] parsed scheme: "endpoint"
I0113 10:38:56.903555  106155 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 10:38:56.904544  106155 client.go:361] parsed scheme: "endpoint"
I0113 10:38:56.904572  106155 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 10:38:56.905491  106155 client.go:361] parsed scheme: "endpoint"
I0113 10:38:56.905519  106155 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 10:38:56.906445  106155 client.go:361] parsed scheme: "endpoint"
I0113 10:38:56.906481  106155 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 10:38:56.907528  106155 client.go:361] parsed scheme: "endpoint"
I0113 10:38:56.907554  106155 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 10:38:56.908985  106155 client.go:361] parsed scheme: "endpoint"
I0113 10:38:56.909015  106155 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 10:38:56.910008  106155 client.go:361] parsed scheme: "endpoint"
I0113 10:38:56.910059  106155 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 10:38:56.910784  106155 client.go:361] parsed scheme: "endpoint"
I0113 10:38:56.910814  106155 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 10:38:56.911865  106155 client.go:361] parsed scheme: "endpoint"
I0113 10:38:56.911903  106155 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 10:38:56.912617  106155 client.go:361] parsed scheme: "endpoint"
I0113 10:38:56.912646  106155 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 10:38:56.913483  106155 client.go:361] parsed scheme: "endpoint"
I0113 10:38:56.913513  106155 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 10:38:56.914222  106155 client.go:361] parsed scheme: "endpoint"
I0113 10:38:56.914324  106155 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 10:38:56.915524  106155 client.go:361] parsed scheme: "endpoint"
I0113 10:38:56.915597  106155 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 10:38:56.918253  106155 client.go:361] parsed scheme: "endpoint"
I0113 10:38:56.918291  106155 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 10:38:56.919214  106155 client.go:361] parsed scheme: "endpoint"
I0113 10:38:56.919240  106155 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 10:38:56.920422  106155 client.go:361] parsed scheme: "endpoint"
I0113 10:38:56.920456  106155 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 10:38:56.921562  106155 client.go:361] parsed scheme: "endpoint"
I0113 10:38:56.921592  106155 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 10:38:56.922613  106155 client.go:361] parsed scheme: "endpoint"
I0113 10:38:56.922645  106155 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 10:38:56.923683  106155 client.go:361] parsed scheme: "endpoint"
I0113 10:38:56.923713  106155 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 10:38:56.924754  106155 client.go:361] parsed scheme: "endpoint"
I0113 10:38:56.924787  106155 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 10:38:56.925568  106155 client.go:361] parsed scheme: "endpoint"
I0113 10:38:56.925682  106155 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 10:38:56.926552  106155 client.go:361] parsed scheme: "endpoint"
I0113 10:38:56.926587  106155 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 10:38:56.927567  106155 client.go:361] parsed scheme: "endpoint"
I0113 10:38:56.928004  106155 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 10:38:56.928825  106155 client.go:361] parsed scheme: "endpoint"
I0113 10:38:56.928862  106155 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 10:38:56.929813  106155 client.go:361] parsed scheme: "endpoint"
I0113 10:38:56.929936  106155 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 10:38:56.930919  106155 client.go:361] parsed scheme: "endpoint"
I0113 10:38:56.930953  106155 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 10:38:56.932110  106155 client.go:361] parsed scheme: "endpoint"
I0113 10:38:56.932138  106155 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 10:38:56.933234  106155 client.go:361] parsed scheme: "endpoint"
I0113 10:38:56.933278  106155 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 10:38:56.934134  106155 client.go:361] parsed scheme: "endpoint"
I0113 10:38:56.934166  106155 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 10:38:56.934973  106155 client.go:361] parsed scheme: "endpoint"
I0113 10:38:56.935009  106155 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 10:38:56.936138  106155 client.go:361] parsed scheme: "endpoint"
I0113 10:38:56.936558  106155 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 10:38:56.937559  106155 client.go:361] parsed scheme: "endpoint"
I0113 10:38:56.937713  106155 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 10:38:56.939191  106155 client.go:361] parsed scheme: "endpoint"
I0113 10:38:56.939221  106155 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 10:38:56.940604  106155 client.go:361] parsed scheme: "endpoint"
I0113 10:38:56.940637  106155 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 10:38:56.941511  106155 client.go:361] parsed scheme: "endpoint"
I0113 10:38:56.941545  106155 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 10:38:56.942363  106155 client.go:361] parsed scheme: "endpoint"
I0113 10:38:56.942488  106155 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 10:38:56.943344  106155 client.go:361] parsed scheme: "endpoint"
I0113 10:38:56.943480  106155 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
E0113 10:38:57.105776  106155 controller.go:183] an error on the server ("") has prevented the request from succeeding (get endpoints kubernetes)
W0113 10:38:57.144110  106155 genericapiserver.go:404] Skipping API discovery.k8s.io/v1alpha1 because it has no resources.
W0113 10:38:57.352506  106155 genericapiserver.go:404] Skipping API apps/v1beta2 because it has no resources.
W0113 10:38:57.352622  106155 genericapiserver.go:404] Skipping API apps/v1beta1 because it has no resources.
I0113 10:38:57.385602  106155 plugins.go:158] Loaded 9 mutating admission controller(s) successfully in the following order: NamespaceLifecycle,LimitRanger,TaintNodesByCondition,Priority,DefaultTolerationSeconds,DefaultStorageClass,StorageObjectInUseProtection,MutatingAdmissionWebhook,RuntimeClass.
I0113 10:38:57.385677  106155 plugins.go:161] Loaded 6 validating admission controller(s) successfully in the following order: LimitRanger,Priority,PersistentVolumeClaimResize,ValidatingAdmissionWebhook,RuntimeClass,ResourceQuota.
W0113 10:38:57.387369  106155 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
I0113 10:38:57.387762  106155 client.go:361] parsed scheme: "endpoint"
I0113 10:38:57.387873  106155 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 10:38:57.388994  106155 client.go:361] parsed scheme: "endpoint"
I0113 10:38:57.389030  106155 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
W0113 10:38:57.392098  106155 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
I0113 10:38:57.392750  106155 aggregator.go:182] Skipping APIService creation for flowcontrol.apiserver.k8s.io/v1alpha1
I0113 10:38:57.711518  106155 client.go:361] parsed scheme: "endpoint"
I0113 10:38:57.711605  106155 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
W0113 10:39:00.407353  106155 reflector.go:340] k8s.io/client-go/informers/factory.go:135: watch of *v1beta1.RuntimeClass ended with: very short watch: k8s.io/client-go/informers/factory.go:135: Unexpected watch close - watch lasted less than a second and no items received
W0113 10:39:00.407441  106155 reflector.go:340] k8s.io/client-go/informers/factory.go:135: watch of *v1.ValidatingWebhookConfiguration ended with: very short watch: k8s.io/client-go/informers/factory.go:135: Unexpected watch close - watch lasted less than a second and no items received
W0113 10:39:00.407513  106155 reflector.go:340] k8s.io/client-go/informers/factory.go:135: watch of *v1.ResourceQuota ended with: very short watch: k8s.io/client-go/informers/factory.go:135: Unexpected watch close - watch lasted less than a second and no items received
W0113 10:39:00.407648  106155 reflector.go:340] k8s.io/client-go/informers/factory.go:135: watch of *v1.Secret ended with: very short watch: k8s.io/client-go/informers/factory.go:135: Unexpected watch close - watch lasted less than a second and no items received
W0113 10:39:00.407885  106155 reflector.go:340] k8s.io/client-go/informers/factory.go:135: watch of *v1.ServiceAccount ended with: very short watch: k8s.io/client-go/informers/factory.go:135: Unexpected watch close - watch lasted less than a second and no items received
W0113 10:39:00.408006  106155 reflector.go:340] k8s.io/client-go/informers/factory.go:135: watch of *v1.LimitRange ended with: very short watch: k8s.io/client-go/informers/factory.go:135: Unexpected watch close - watch lasted less than a second and no items received
W0113 10:39:00.408101  106155 reflector.go:340] k8s.io/client-go/informers/factory.go:135: watch of *v1.StorageClass ended with: very short watch: k8s.io/client-go/informers/factory.go:135: Unexpected watch close - watch lasted less than a second and no items received
W0113 10:39:00.408167  106155 reflector.go:340] k8s.io/client-go/informers/factory.go:135: watch of *v1.PriorityClass ended with: very short watch: k8s.io/client-go/informers/factory.go:135: Unexpected watch close - watch lasted less than a second and no items received
I0113 10:39:01.361387  106155 secure_serving.go:178] Serving securely on 127.0.0.1:34955
I0113 10:39:01.361492  106155 crd_finalizer.go:264] Starting CRDFinalizer
I0113 10:39:01.361570  106155 dynamic_cafile_content.go:166] Starting request-header::/tmp/kubernetes-kube-apiserver961258570/proxy-ca.crt
I0113 10:39:01.361605  106155 dynamic_serving_content.go:129] Starting serving-cert::/tmp/kubernetes-kube-apiserver961258570/apiserver.crt::/tmp/kubernetes-kube-apiserver961258570/apiserver.key
I0113 10:39:01.361636  106155 apiservice_controller.go:94] Starting APIServiceRegistrationController
I0113 10:39:01.361644  106155 cache.go:32] Waiting for caches to sync for APIServiceRegistrationController controller
I0113 10:39:01.361663  106155 available_controller.go:386] Starting AvailableConditionController
I0113 10:39:01.361669  106155 cache.go:32] Waiting for caches to sync for AvailableConditionController controller
I0113 10:39:01.361684  106155 controller.go:81] Starting OpenAPI AggregationController
I0113 10:39:01.361702  106155 controller.go:86] Starting OpenAPI controller
I0113 10:39:01.361728  106155 customresource_discovery_controller.go:209] Starting DiscoveryController
I0113 10:39:01.361745  106155 naming_controller.go:289] Starting NamingConditionController
I0113 10:39:01.361764  106155 establishing_controller.go:74] Starting EstablishingController
I0113 10:39:01.361782  106155 nonstructuralschema_controller.go:185] Starting NonStructuralSchemaConditionController
I0113 10:39:01.361799  106155 apiapproval_controller.go:184] Starting KubernetesAPIApprovalPolicyConformantConditionController
I0113 10:39:01.361818  106155 dynamic_cafile_content.go:166] Starting client-ca-bundle::/tmp/kubernetes-kube-apiserver961258570/client-ca.crt
I0113 10:39:01.362578  106155 tlsconfig.go:241] Starting DynamicServingCertificateController
I0113 10:39:01.362921  106155 autoregister_controller.go:140] Starting autoregister controller
I0113 10:39:01.362939  106155 cache.go:32] Waiting for caches to sync for autoregister controller
I0113 10:39:01.363045  106155 crdregistration_controller.go:111] Starting crd-autoregister controller
I0113 10:39:01.363063  106155 shared_informer.go:206] Waiting for caches to sync for crd-autoregister
W0113 10:39:01.363881  106155 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
I0113 10:39:01.364050  106155 cluster_authentication_trust_controller.go:440] Starting cluster_authentication_trust_controller controller
I0113 10:39:01.364059  106155 shared_informer.go:206] Waiting for caches to sync for cluster_authentication_trust_controller
I0113 10:39:01.376797  106155 dynamic_cafile_content.go:166] Starting client-ca-bundle::/tmp/kubernetes-kube-apiserver961258570/client-ca.crt
I0113 10:39:01.378145  106155 dynamic_cafile_content.go:166] Starting request-header::/tmp/kubernetes-kube-apiserver961258570/proxy-ca.crt
E0113 10:39:01.382714  106155 controller.go:151] Unable to remove old endpoints from kubernetes service: StorageError: key not found, Code: 1, Key: /0281ed16-78c4-48dd-a0c5-144bad494997/registry/masterleases/127.0.0.1, ResourceVersion: 0, AdditionalErrorMsg: 
E0113 10:39:01.384048  106155 structuredmerge.go:102] [SHOULD NOT HAPPEN] failed to create typed new object: .spec.rules: element 0: associative list without keys has an element that's a map type
E0113 10:39:01.402035  106155 structuredmerge.go:102] [SHOULD NOT HAPPEN] failed to create typed new object: .spec.rules: element 0: associative list without keys has an element that's a map type
E0113 10:39:01.435039  106155 structuredmerge.go:102] [SHOULD NOT HAPPEN] failed to create typed new object: .spec.rules: element 0: associative list without keys has an element that's a map type
I0113 10:39:01.461854  106155 cache.go:39] Caches are synced for AvailableConditionController controller
I0113 10:39:01.461831  106155 cache.go:39] Caches are synced for APIServiceRegistrationController controller
I0113 10:39:01.463097  106155 cache.go:39] Caches are synced for autoregister controller
I0113 10:39:01.463216  106155 shared_informer.go:213] Caches are synced for crd-autoregister 
I0113 10:39:01.464366  106155 shared_informer.go:213] Caches are synced for cluster_authentication_trust_controller 
I0113 10:39:02.360479  106155 controller.go:107] OpenAPI AggregationController: Processing item 
I0113 10:39:02.360516  106155 controller.go:130] OpenAPI AggregationController: action for item : Nothing (removed from the queue).
I0113 10:39:02.360532  106155 controller.go:130] OpenAPI AggregationController: action for item k8s_internal_local_delegation_chain_0000000000: Nothing (removed from the queue).
I0113 10:39:02.374951  106155 storage_scheduling.go:133] created PriorityClass system-node-critical with value 2000001000
I0113 10:39:02.381840  106155 storage_scheduling.go:133] created PriorityClass system-cluster-critical with value 2000000000
I0113 10:39:02.381866  106155 storage_scheduling.go:142] all system priority classes are created successfully or already exist.
W0113 10:39:02.437532  106155 lease.go:224] Resetting endpoints for master service "kubernetes" to [127.0.0.1]
E0113 10:39:02.439406  106155 controller.go:222] unable to sync kubernetes service: Endpoints "kubernetes" is invalid: subsets[0].addresses[0].ip: Invalid value: "127.0.0.1": may not be in the loopback range (127.0.0.0/8)
W0113 10:39:02.520257  106155 cacher.go:162] Terminating all watchers from cacher *apiextensions.CustomResourceDefinition
W0113 10:39:02.520875  106155 cacher.go:162] Terminating all watchers from cacher *core.LimitRange
W0113 10:39:02.521170  106155 cacher.go:162] Terminating all watchers from cacher *core.ResourceQuota
W0113 10:39:02.521282  106155 cacher.go:162] Terminating all watchers from cacher *core.Secret
W0113 10:39:02.521785  106155 cacher.go:162] Terminating all watchers from cacher *core.ConfigMap
W0113 10:39:02.522412  106155 cacher.go:162] Terminating all watchers from cacher *core.Namespace
W0113 10:39:02.522462  106155 cacher.go:162] Terminating all watchers from cacher *core.Endpoints
W0113 10:39:02.522881  106155 cacher.go:162] Terminating all watchers from cacher *core.Pod
W0113 10:39:02.523115  106155 cacher.go:162] Terminating all watchers from cacher *core.ServiceAccount
W0113 10:39:02.523395  106155 cacher.go:162] Terminating all watchers from cacher *core.Service
W0113 10:39:02.525652  106155 cacher.go:162] Terminating all watchers from cacher *node.RuntimeClass
W0113 10:39:02.529063  106155 cacher.go:162] Terminating all watchers from cacher *scheduling.PriorityClass
W0113 10:39:02.529978  106155 cacher.go:162] Terminating all watchers from cacher *storage.StorageClass
W0113 10:39:02.531492  106155 cacher.go:162] Terminating all watchers from cacher *admissionregistration.ValidatingWebhookConfiguration
W0113 10:39:02.531850  106155 cacher.go:162] Terminating all watchers from cacher *admissionregistration.MutatingWebhookConfiguration
W0113 10:39:02.532323  106155 cacher.go:162] Terminating all watchers from cacher *apiregistration.APIService
--- FAIL: TestDynamicClient (7.13s)
I0113 10:39:02.532896  106155 controller.go:87] Shutting down OpenAPI AggregationController
I0113 10:39:02.532926  106155 controller.go:123] Shutting down OpenAPI controller
I0113 10:39:02.532950  106155 cluster_authentication_trust_controller.go:463] Shutting down cluster_authentication_trust_controller controller
I0113 10:39:02.532967  106155 crdregistration_controller.go:142] Shutting down crd-autoregister controller
I0113 10:39:02.532983  106155 autoregister_controller.go:164] Shutting down autoregister controller
I0113 10:39:02.532999  106155 apiapproval_controller.go:196] Shutting down KubernetesAPIApprovalPolicyConformantConditionController
    testserver.go:181: runtime-config=map[api/all:true]
    testserver.go:182: Starting kube-apiserver on port 34955...
    testserver.go:198: Waiting for /healthz to be ok...
    dynamic_client_test.go:88: unexpected pod in list. wanted &v1.Pod{TypeMeta:v1.TypeMeta{Kind:"", APIVersion:""}, ObjectMeta:v1.ObjectMeta{Name:"testzrd7z", GenerateName:"test", Namespace:"default", SelfLink:"/api/v1/namespaces/default/pods/testzrd7z", UID:"072dc128-1612-4874-9a06-9d29564f39a9", ResourceVersion:"8396", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63714508742, loc:(*time.Location)(0x7541d00)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string(nil), Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry{v1.ManagedFieldsEntry{Manager:"client.test", Operation:"Update", APIVersion:"v1", Time:(*v1.Time)(0xc0085fa320), FieldsType:"FieldsV1", FieldsV1:(*v1.FieldsV1)(0xc0085fa340)}}}, Spec:v1.PodSpec{Volumes:[]v1.Volume(nil), InitContainers:[]v1.Container(nil), Containers:[]v1.Container{v1.Container{Name:"test", Image:"test-image", Command:[]string(nil), Args:[]string(nil), WorkingDir:"", Ports:[]v1.ContainerPort(nil), EnvFrom:[]v1.EnvFromSource(nil), Env:[]v1.EnvVar(nil), Resources:v1.ResourceRequirements{Limits:v1.ResourceList(nil), Requests:v1.ResourceList(nil)}, VolumeMounts:[]v1.VolumeMount(nil), VolumeDevices:[]v1.VolumeDevice(nil), LivenessProbe:(*v1.Probe)(nil), ReadinessProbe:(*v1.Probe)(nil), StartupProbe:(*v1.Probe)(nil), Lifecycle:(*v1.Lifecycle)(nil), TerminationMessagePath:"/dev/termination-log", TerminationMessagePolicy:"File", ImagePullPolicy:"Always", SecurityContext:(*v1.SecurityContext)(nil), Stdin:false, StdinOnce:false, TTY:false}}, EphemeralContainers:[]v1.EphemeralContainer(nil), RestartPolicy:"Always", TerminationGracePeriodSeconds:(*int64)(0xc03ffcd348), ActiveDeadlineSeconds:(*int64)(nil), DNSPolicy:"ClusterFirst", NodeSelector:map[string]string(nil), ServiceAccountName:"", DeprecatedServiceAccount:"", AutomountServiceAccountToken:(*bool)(nil), NodeName:"", HostNetwork:false, HostPID:false, HostIPC:false, ShareProcessNamespace:(*bool)(nil), SecurityContext:(*v1.PodSecurityContext)(0xc01c1b3080), ImagePullSecrets:[]v1.LocalObjectReference(nil), Hostname:"", Subdomain:"", Affinity:(*v1.Affinity)(nil), SchedulerName:"default-scheduler", Tolerations:[]v1.Toleration{v1.Toleration{Key:"node.kubernetes.io/not-ready", Operator:"Exists", Value:"", Effect:"NoExecute", TolerationSeconds:(*int64)(0xc03ffcd370)}, v1.Toleration{Key:"node.kubernetes.io/unreachable", Operator:"Exists", Value:"", Effect:"NoExecute", TolerationSeconds:(*int64)(0xc03ffcd390)}}, HostAliases:[]v1.HostAlias(nil), PriorityClassName:"", Priority:(*int32)(0xc03ffcd398), DNSConfig:(*v1.PodDNSConfig)(nil), ReadinessGates:[]v1.PodReadinessGate(nil), RuntimeClassName:(*string)(nil), EnableServiceLinks:(*bool)(0xc03ffcd39c), PreemptionPolicy:(*v1.PreemptionPolicy)(nil), Overhead:v1.ResourceList(nil), TopologySpreadConstraints:[]v1.TopologySpreadConstraint(nil)}, Status:v1.PodStatus{Phase:"Pending", Conditions:[]v1.PodCondition(nil), Message:"", Reason:"", NominatedNodeName:"", HostIP:"", PodIP:"", PodIPs:[]v1.PodIP(nil), StartTime:(*v1.Time)(nil), InitContainerStatuses:[]v1.ContainerStatus(nil), ContainerStatuses:[]v1.ContainerStatus(nil), QOSClass:"BestEffort", EphemeralContainerStatuses:[]v1.ContainerStatus(nil)}}, got &v1.Pod{TypeMeta:v1.TypeMeta{Kind:"", APIVersion:""}, ObjectMeta:v1.ObjectMeta{Name:"testzrd7z", GenerateName:"test", Namespace:"default", SelfLink:"/api/v1/namespaces/default/pods/testzrd7z", 
UID:"072dc128-1612-4874-9a06-9d29564f39a9", ResourceVersion:"8396", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63714508742, loc:(*time.Location)(0x7541d00)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string(nil), Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry{v1.ManagedFieldsEntry{Manager:"client.test", Operation:"Update", APIVersion:"v1", Time:(*v1.Time)(0xc00863ed00), FieldsType:"FieldsV1", FieldsV1:(*v1.FieldsV1)(0xc00863ece0)}}}, Spec:v1.PodSpec{Volumes:[]v1.Volume(nil), InitContainers:[]v1.Container(nil), Containers:[]v1.Container{v1.Container{Name:"test", Image:"test-image", Command:[]string(nil), Args:[]string(nil), WorkingDir:"", Ports:[]v1.ContainerPort(nil), EnvFrom:[]v1.EnvFromSource(nil), Env:[]v1.EnvVar(nil), Resources:v1.ResourceRequirements{Limits:v1.ResourceList(nil), Requests:v1.ResourceList(nil)}, VolumeMounts:[]v1.VolumeMount(nil), VolumeDevices:[]v1.VolumeDevice(nil), LivenessProbe:(*v1.Probe)(nil), ReadinessProbe:(*v1.Probe)(nil), StartupProbe:(*v1.Probe)(nil), Lifecycle:(*v1.Lifecycle)(nil), TerminationMessagePath:"/dev/termination-log", TerminationMessagePolicy:"File", ImagePullPolicy:"Always", SecurityContext:(*v1.SecurityContext)(nil), Stdin:false, StdinOnce:false, TTY:false}}, EphemeralContainers:[]v1.EphemeralContainer(nil), RestartPolicy:"Always", TerminationGracePeriodSeconds:(*int64)(0xc03b4e4678), ActiveDeadlineSeconds:(*int64)(nil), DNSPolicy:"ClusterFirst", NodeSelector:map[string]string(nil), ServiceAccountName:"", DeprecatedServiceAccount:"", AutomountServiceAccountToken:(*bool)(nil), NodeName:"", HostNetwork:false, HostPID:false, HostIPC:false, ShareProcessNamespace:(*bool)(nil), SecurityContext:(*v1.PodSecurityContext)(0xc01c205200), ImagePullSecrets:[]v1.LocalObjectReference(nil), Hostname:"", Subdomain:"", Affinity:(*v1.Affinity)(nil), SchedulerName:"default-scheduler", Tolerations:[]v1.Toleration{v1.Toleration{Key:"node.kubernetes.io/not-ready", Operator:"Exists", Value:"", Effect:"NoExecute", TolerationSeconds:(*int64)(0xc03b4e4700)}, v1.Toleration{Key:"node.kubernetes.io/unreachable", Operator:"Exists", Value:"", Effect:"NoExecute", TolerationSeconds:(*int64)(0xc03b4e4760)}}, HostAliases:[]v1.HostAlias(nil), PriorityClassName:"", Priority:(*int32)(0xc03b4e4648), DNSConfig:(*v1.PodDNSConfig)(nil), ReadinessGates:[]v1.PodReadinessGate(nil), RuntimeClassName:(*string)(nil), EnableServiceLinks:(*bool)(0xc03b4e4609), PreemptionPolicy:(*v1.PreemptionPolicy)(nil), Overhead:v1.ResourceList(nil), TopologySpreadConstraints:[]v1.TopologySpreadConstraint(nil)}, Status:v1.PodStatus{Phase:"Pending", Conditions:[]v1.PodCondition(nil), Message:"", Reason:"", NominatedNodeName:"", HostIP:"", PodIP:"", PodIPs:[]v1.PodIP(nil), StartTime:(*v1.Time)(nil), InitContainerStatuses:[]v1.ContainerStatus(nil), ContainerStatuses:[]v1.ContainerStatus(nil), QOSClass:"BestEffort", EphemeralContainerStatuses:[]v1.ContainerStatus(nil)}}
I0113 10:39:02.533026  106155 nonstructuralschema_controller.go:197] Shutting down NonStructuralSchemaConditionController
I0113 10:39:02.533044  106155 establishing_controller.go:85] Shutting down EstablishingController

				from junit_da39a3ee5e6b4b0d3255bfef95601890afd80709_20200113-103629.xml
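
The assertion at dynamic_client_test.go:88 above compares a pod listed through the typed clientset with the same pod listed through the dynamic client and converted back from unstructured form. Below is a minimal sketch of that kind of round-trip check, assuming a recent client-go (context-taking List methods), an illustrative kubeconfig path, and the default namespace; it is not the test's actual code.

package main

import (
	"context"
	"fmt"
	"reflect"

	corev1 "k8s.io/api/core/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/apimachinery/pkg/runtime"
	"k8s.io/apimachinery/pkg/runtime/schema"
	"k8s.io/client-go/dynamic"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Assumption: a kubeconfig at this path; the integration test instead wires
	// its clients to an in-process test apiserver started by testserver.go.
	config, err := clientcmd.BuildConfigFromFlags("", "/tmp/kubeconfig")
	if err != nil {
		panic(err)
	}
	typedClient := kubernetes.NewForConfigOrDie(config)
	dynamicClient, err := dynamic.NewForConfig(config)
	if err != nil {
		panic(err)
	}

	ctx := context.Background()
	gvr := schema.GroupVersionResource{Version: "v1", Resource: "pods"}

	// List pods through the dynamic (unstructured) client.
	unstructuredList, err := dynamicClient.Resource(gvr).Namespace("default").List(ctx, metav1.ListOptions{})
	if err != nil {
		panic(err)
	}

	// List the same pods through the typed clientset.
	typedList, err := typedClient.CoreV1().Pods("default").List(ctx, metav1.ListOptions{})
	if err != nil {
		panic(err)
	}

	// Convert each unstructured item back into a typed Pod and deep-compare it
	// with the typed result; a mismatch yields a "wanted ... got ..." failure of
	// the shape reported above. (Assumes both lists come back in the same order.)
	for i, item := range unstructuredList.Items {
		got := &corev1.Pod{}
		if err := runtime.DefaultUnstructuredConverter.FromUnstructured(item.Object, got); err != nil {
			panic(err)
		}
		want := &typedList.Items[i]
		if !reflect.DeepEqual(want, got) {
			fmt.Printf("unexpected pod in list. wanted %#v, got %#v\n", want, got)
		}
	}
}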




Error lines from build-log.txt

... skipping 55 lines ...
Recording: record_command_canary
Running command: record_command_canary

+++ Running case: test-cmd.record_command_canary 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: record_command_canary
/home/prow/go/src/k8s.io/kubernetes/test/cmd/legacy-script.sh: line 155: bogus-expected-to-fail: command not found
!!! [0113 10:25:58] Call tree:
!!! [0113 10:25:58]  1: /home/prow/go/src/k8s.io/kubernetes/test/cmd/../../third_party/forked/shell2junit/sh2ju.sh:47 record_command_canary(...)
!!! [0113 10:25:58]  2: /home/prow/go/src/k8s.io/kubernetes/test/cmd/../../third_party/forked/shell2junit/sh2ju.sh:112 eVal(...)
!!! [0113 10:25:58]  3: /home/prow/go/src/k8s.io/kubernetes/test/cmd/legacy-script.sh:131 juLog(...)
!!! [0113 10:25:58]  4: /home/prow/go/src/k8s.io/kubernetes/test/cmd/legacy-script.sh:159 record_command(...)
!!! [0113 10:25:58]  5: hack/make-rules/test-cmd.sh:35 source(...)
+++ exit code: 1
+++ error: 1
+++ [0113 10:25:58] Running kubeadm tests
+++ [0113 10:26:03] Building go targets for linux/amd64:
    cmd/kubeadm
hack/make-rules/test.sh: line 191: KUBE_TEST_API: unbound variable
+++ [0113 10:26:54] Running tests without code coverage
{"Time":"2020-01-13T10:28:21.229423759Z","Action":"output","Package":"k8s.io/kubernetes/cmd/kubeadm/test/cmd","Output":"ok  \tk8s.io/kubernetes/cmd/kubeadm/test/cmd\t45.439s\n"}
... skipping 303 lines ...
+++ [0113 10:30:10] Building kube-controller-manager
+++ [0113 10:30:15] Building go targets for linux/amd64:
    cmd/kube-controller-manager
+++ [0113 10:30:46] Starting controller-manager
Flag --port has been deprecated, see --secure-port instead.
I0113 10:30:46.854611   54599 serving.go:313] Generated self-signed cert in-memory
W0113 10:30:47.430087   54599 authentication.go:409] failed to read in-cluster kubeconfig for delegated authentication: open /var/run/secrets/kubernetes.io/serviceaccount/token: no such file or directory
W0113 10:30:47.430138   54599 authentication.go:267] No authentication-kubeconfig provided in order to lookup client-ca-file in configmap/extension-apiserver-authentication in kube-system, so client certificate authentication won't work.
W0113 10:30:47.430151   54599 authentication.go:291] No authentication-kubeconfig provided in order to lookup requestheader-client-ca-file in configmap/extension-apiserver-authentication in kube-system, so request-header client certificate authentication won't work.
W0113 10:30:47.430163   54599 authorization.go:177] failed to read in-cluster kubeconfig for delegated authorization: open /var/run/secrets/kubernetes.io/serviceaccount/token: no such file or directory
W0113 10:30:47.430176   54599 authorization.go:146] No authorization-kubeconfig provided, so SubjectAccessReview of authorization tokens won't work.
I0113 10:30:47.430194   54599 controllermanager.go:161] Version: v1.18.0-alpha.1.634+36e40fb8502930
I0113 10:30:47.431347   54599 secure_serving.go:178] Serving securely on [::]:10257
I0113 10:30:47.431394   54599 tlsconfig.go:241] Starting DynamicServingCertificateController
I0113 10:30:47.431808   54599 deprecated_insecure_serving.go:53] Serving insecurely on [::]:10252
I0113 10:30:47.431944   54599 leaderelection.go:242] attempting to acquire leader lease  kube-system/kube-controller-manager...
... skipping 116 lines ...
W0113 10:30:48.477916   54599 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0113 10:30:48.478064   54599 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
I0113 10:30:48.478261   54599 garbagecollector.go:129] Starting garbage collector controller
I0113 10:30:48.478288   54599 shared_informer.go:206] Waiting for caches to sync for garbage collector
I0113 10:30:48.478306   54599 controllermanager.go:533] Started "garbagecollector"
I0113 10:30:48.478330   54599 graph_builder.go:282] GraphBuilder running
E0113 10:30:48.478785   54599 core.go:90] Failed to start service controller: WARNING: no cloud provider provided, services of type LoadBalancer will fail
W0113 10:30:48.478845   54599 controllermanager.go:525] Skipping "service"
I0113 10:30:48.479221   54599 controllermanager.go:533] Started "clusterrole-aggregation"
W0113 10:30:48.479248   54599 controllermanager.go:525] Skipping "root-ca-cert-publisher"
I0113 10:30:48.479373   54599 clusterroleaggregation_controller.go:148] Starting ClusterRoleAggregator
I0113 10:30:48.479415   54599 shared_informer.go:206] Waiting for caches to sync for ClusterRoleAggregator
+++ [0113 10:30:48] Testing kubectl version
... skipping 28 lines ...
W0113 10:30:48.485017   54599 controllermanager.go:525] Skipping "endpointslice"
I0113 10:30:48.485127   54599 replica_set.go:180] Starting replicationcontroller controller
I0113 10:30:48.485155   54599 shared_informer.go:206] Waiting for caches to sync for ReplicationController
I0113 10:30:48.485236   54599 controllermanager.go:533] Started "csrcleaner"
I0113 10:30:48.485332   54599 cleaner.go:81] Starting CSR cleaner controller
I0113 10:30:48.485516   54599 node_lifecycle_controller.go:77] Sending events to api server
E0113 10:30:48.485549   54599 core.go:231] failed to start cloud node lifecycle controller: no cloud provider provided
W0113 10:30:48.485558   54599 controllermanager.go:525] Skipping "cloud-node-lifecycle"
W0113 10:30:48.486823   54599 probe.go:268] Flexvolume plugin directory at /usr/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
I0113 10:30:48.487954   54599 controllermanager.go:533] Started "attachdetach"
I0113 10:30:48.488054   54599 attach_detach_controller.go:342] Starting attach detach controller
I0113 10:30:48.488070   54599 shared_informer.go:206] Waiting for caches to sync for attach detach
I0113 10:30:48.488432   54599 controllermanager.go:533] Started "persistentvolume-expander"
... skipping 5 lines ...
I0113 10:30:48.489404   54599 controllermanager.go:533] Started "deployment"
I0113 10:30:48.489606   54599 deployment_controller.go:152] Starting deployment controller
I0113 10:30:48.489629   54599 shared_informer.go:206] Waiting for caches to sync for deployment
I0113 10:30:48.490906   54599 controllermanager.go:533] Started "pv-protection"
I0113 10:30:48.491952   54599 pv_protection_controller.go:81] Starting PV protection controller
I0113 10:30:48.491983   54599 shared_informer.go:206] Waiting for caches to sync for PV protection
W0113 10:30:48.544538   54599 actual_state_of_world.go:506] Failed to update statusUpdateNeeded field in actual state of world: Failed to set statusUpdateNeeded to needed true, because nodeName="127.0.0.1" does not exist
I0113 10:30:48.571434   54599 shared_informer.go:213] Caches are synced for namespace 
I0113 10:30:48.579716   54599 shared_informer.go:213] Caches are synced for ClusterRoleAggregator 
I0113 10:30:48.580632   54599 shared_informer.go:213] Caches are synced for job 
I0113 10:30:48.581509   54599 shared_informer.go:213] Caches are synced for stateful set 
I0113 10:30:48.582124   54599 shared_informer.go:213] Caches are synced for certificate-csrapproving 
I0113 10:30:48.583516   54599 shared_informer.go:213] Caches are synced for HPA 
I0113 10:30:48.585351   54599 shared_informer.go:213] Caches are synced for ReplicationController 
E0113 10:30:48.588439   54599 clusterroleaggregation_controller.go:180] edit failed with : Operation cannot be fulfilled on clusterroles.rbac.authorization.k8s.io "edit": the object has been modified; please apply your changes to the latest version and try again
E0113 10:30:48.588440   54599 clusterroleaggregation_controller.go:180] view failed with : Operation cannot be fulfilled on clusterroles.rbac.authorization.k8s.io "view": the object has been modified; please apply your changes to the latest version and try again
I0113 10:30:48.588805   54599 shared_informer.go:213] Caches are synced for expand 
I0113 10:30:48.589366   54599 shared_informer.go:213] Caches are synced for daemon sets 
I0113 10:30:48.592503   54599 shared_informer.go:213] Caches are synced for PV protection 
{
  "major": "1",
  "minor": "18+",
  "gitVersion": "v1.18.0-alpha.1.634+36e40fb8502930",
  "gitCommit": "36e40fb850293076b415ae3d376f5f81dc897105",
  "gitTreeState": "clean",
  "buildDate": "2020-01-12T23:07:37Z",
  "goVersion": "go1.13.5",
  "compiler": "gc",
  "platform": "linux/amd64"
}
E0113 10:30:48.596660   54599 clusterroleaggregation_controller.go:180] admin failed with : Operation cannot be fulfilled on clusterroles.rbac.authorization.k8s.io "admin": the object has been modified; please apply your changes to the latest version and try again
E0113 10:30:48.598228   54599 clusterroleaggregation_controller.go:180] edit failed with : Operation cannot be fulfilled on clusterroles.rbac.authorization.k8s.io "edit": the object has been modified; please apply your changes to the latest version and try again
I0113 10:30:48.605349   54599 shared_informer.go:213] Caches are synced for TTL 
I0113 10:30:48.605952   54599 shared_informer.go:213] Caches are synced for taint 
I0113 10:30:48.606085   54599 taint_manager.go:186] Starting NoExecuteTaintManager
I0113 10:30:48.606114   54599 node_lifecycle_controller.go:1443] Initializing eviction metric for zone: 
I0113 10:30:48.606305   54599 node_lifecycle_controller.go:1209] Controller detected that all Nodes are not-Ready. Entering master disruption mode.
I0113 10:30:48.606365   54599 event.go:278] Event(v1.ObjectReference{Kind:"Node", Namespace:"", Name:"127.0.0.1", UID:"e516cbfc-12fc-4edb-8a6d-911c011a9a51", APIVersion:"", ResourceVersion:"", FieldPath:""}): type: 'Normal' reason: 'RegisteredNode' Node 127.0.0.1 event: Registered Node 127.0.0.1 in Controller
... skipping 74 lines ...
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_RESTMapper_evaluation_tests
+++ [0113 10:30:52] Creating namespace namespace-1578911452-12383
namespace/namespace-1578911452-12383 created
Context "test" modified.
+++ [0113 10:30:52] Testing RESTMapper
+++ [0113 10:30:53] "kubectl get unknownresourcetype" returns error as expected: error: the server doesn't have a resource type "unknownresourcetype"
+++ exit code: 0
NAME                              SHORTNAMES   APIGROUP                       NAMESPACED   KIND
bindings                                                                      true         Binding
componentstatuses                 cs                                          false        ComponentStatus
configmaps                        cm                                          true         ConfigMap
endpoints                         ep                                          true         Endpoints
... skipping 650 lines ...
has:valid-pod
Successful
message:NAME        READY   STATUS    RESTARTS   AGE
valid-pod   0/1     Pending   0          0s
has:valid-pod
core.sh:186: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
error: resource(s) were provided, but no name, label selector, or --all flag specified
core.sh:190: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
core.sh:194: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
error: setting 'all' parameter but found a non empty selector. 
core.sh:198: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
core.sh:202: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
pod "valid-pod" force deleted
core.sh:206: Successful get pods -l'name in (valid-pod)' {{range.items}}{{.metadata.name}}:{{end}}: 
core.sh:211: Successful get namespaces {{range.items}}{{ if eq .metadata.name \"test-kubectl-describe-pod\" }}found{{end}}{{end}}:: :
... skipping 12 lines ...
poddisruptionbudget.policy/test-pdb-2 created
core.sh:245: Successful get pdb/test-pdb-2 --namespace=test-kubectl-describe-pod {{.spec.minAvailable}}: 50%
poddisruptionbudget.policy/test-pdb-3 created
core.sh:251: Successful get pdb/test-pdb-3 --namespace=test-kubectl-describe-pod {{.spec.maxUnavailable}}: 2
poddisruptionbudget.policy/test-pdb-4 created
core.sh:255: Successful get pdb/test-pdb-4 --namespace=test-kubectl-describe-pod {{.spec.maxUnavailable}}: 50%
error: min-available and max-unavailable cannot be both specified
core.sh:261: Successful get pods --namespace=test-kubectl-describe-pod {{range.items}}{{.metadata.name}}:{{end}}: 
pod/env-test-pod created
matched TEST_CMD_1
matched <set to the key 'key-1' in secret 'test-secret'>
matched TEST_CMD_2
matched <set to the key 'key-2' of config map 'test-configmap'>
... skipping 188 lines ...
pod/valid-pod patched
core.sh:470: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: changed-with-yaml:
pod/valid-pod patched
core.sh:475: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:3.1:
pod/valid-pod patched
core.sh:491: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: nginx:
+++ [0113 10:31:37] "kubectl patch with resourceVersion 530" returns error as expected: Error from server (Conflict): Operation cannot be fulfilled on pods "valid-pod": the object has been modified; please apply your changes to the latest version and try again
pod "valid-pod" deleted
pod/valid-pod replaced
core.sh:515: Successful get pod valid-pod {{(index .spec.containers 0).name}}: replaced-k8s-serve-hostname
Successful
message:error: --grace-period must have --force specified
has:\-\-grace-period must have \-\-force specified
Successful
message:error: --timeout must have --force specified
has:\-\-timeout must have \-\-force specified
W0113 10:31:38.560482   54599 actual_state_of_world.go:506] Failed to update statusUpdateNeeded field in actual state of world: Failed to set statusUpdateNeeded to needed true, because nodeName="node-v1-test" does not exist
node/node-v1-test created
I0113 10:31:38.609844   54599 event.go:278] Event(v1.ObjectReference{Kind:"Node", Namespace:"", Name:"node-v1-test", UID:"75965bd5-de37-43dd-902c-9b38ada2e3e0", APIVersion:"", ResourceVersion:"", FieldPath:""}): type: 'Normal' reason: 'RegisteredNode' Node node-v1-test event: Registered Node node-v1-test in Controller
node/node-v1-test replaced
core.sh:552: Successful get node node-v1-test {{.metadata.annotations.a}}: b
(Bnode "node-v1-test" deleted
core.sh:559: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: nginx:
... skipping 24 lines ...
spec:
  containers:
  - image: k8s.gcr.io/pause:2.0
    name: kubernetes-pause
has:localonlyvalue
core.sh:585: Successful get pod valid-pod {{.metadata.labels.name}}: valid-pod
error: 'name' already has a value (valid-pod), and --overwrite is false
core.sh:589: Successful get pod valid-pod {{.metadata.labels.name}}: valid-pod
core.sh:593: Successful get pod valid-pod {{.metadata.labels.name}}: valid-pod
pod/valid-pod labeled
core.sh:597: Successful get pod valid-pod {{.metadata.labels.name}}: valid-pod-super-sayan
core.sh:601: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
... skipping 86 lines ...
+++ Running case: test-cmd.run_kubectl_create_error_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_kubectl_create_error_tests
+++ [0113 10:31:50] Creating namespace namespace-1578911510-32188
namespace/namespace-1578911510-32188 created
Context "test" modified.
+++ [0113 10:31:50] Testing kubectl create with error
Error: must specify one of -f and -k

Create a resource from a file or from stdin.

 JSON and YAML formats are accepted.

Examples:
... skipping 41 lines ...

Usage:
  kubectl create -f FILENAME [options]

Use "kubectl <command> --help" for more information about a given command.
Use "kubectl options" for a list of global command-line options (applies to all commands).
+++ [0113 10:31:51] "kubectl create with empty string list returns error as expected: error: error validating "hack/testdata/invalid-rc-with-empty-args.yaml": error validating data: ValidationError(ReplicationController.spec.template.spec.containers[0].args): unknown object type "nil" in ReplicationController.spec.template.spec.containers[0].args[0]; if you choose to ignore these errors, turn validation off with --validate=false
kubectl convert is DEPRECATED and will be removed in a future version.
In order to convert, kubectl apply the object to the cluster, then kubectl get at the desired version.
+++ exit code: 0
Recording: run_kubectl_apply_tests
Running command: run_kubectl_apply_tests

... skipping 17 lines ...
pod "test-pod" deleted
customresourcedefinition.apiextensions.k8s.io/resources.mygroup.example.com created
I0113 10:31:54.636849   51163 client.go:361] parsed scheme: "endpoint"
I0113 10:31:54.636894   51163 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 10:31:54.641206   51163 controller.go:606] quota admission added evaluator for: resources.mygroup.example.com
kind.mygroup.example.com/myobj serverside-applied (server dry run)
Error from server (NotFound): resources.mygroup.example.com "myobj" not found
customresourcedefinition.apiextensions.k8s.io "resources.mygroup.example.com" deleted
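The 'serverside-applied (server dry run)' line followed by NotFound shows a server-side dry run being evaluated but never persisted. A rough sketch of the idea (the manifest name cr.yaml and the exact flag combination are assumptions; --server-dry-run was the spelling in this era of kubectl, newer releases use --dry-run=server):
  kubectl apply --server-side --server-dry-run -f cr.yaml   # admission and defaulting run on the server, nothing is stored
  kubectl get resources.mygroup.example.com myobj           # NotFound, because the dry run persisted nothing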
+++ exit code: 0
Recording: run_kubectl_run_tests
Running command: run_kubectl_run_tests

+++ Running case: test-cmd.run_kubectl_run_tests 
... skipping 102 lines ...
Context "test" modified.
+++ [0113 10:31:57] Testing kubectl create filter
create.sh:30: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
pod/selector-test-pod created
create.sh:34: Successful get pods selector-test-pod {{.metadata.labels.name}}: selector-test-pod
Successful
message:Error from server (NotFound): pods "selector-test-pod-dont-apply" not found
has:pods "selector-test-pod-dont-apply" not found
pod "selector-test-pod" deleted
+++ exit code: 0
Recording: run_kubectl_apply_deployments_tests
Running command: run_kubectl_apply_deployments_tests

... skipping 30 lines ...
I0113 10:32:01.417423   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911518-6526", Name:"nginx-8484dd655", UID:"4ee2743d-2897-4b8d-9eb8-0351aa065388", APIVersion:"apps/v1", ResourceVersion:"630", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-8484dd655-mg8cf
I0113 10:32:01.420752   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911518-6526", Name:"nginx-8484dd655", UID:"4ee2743d-2897-4b8d-9eb8-0351aa065388", APIVersion:"apps/v1", ResourceVersion:"630", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-8484dd655-rxndv
I0113 10:32:01.430020   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911518-6526", Name:"nginx-8484dd655", UID:"4ee2743d-2897-4b8d-9eb8-0351aa065388", APIVersion:"apps/v1", ResourceVersion:"630", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-8484dd655-m5ft4
apps.sh:148: Successful get deployment nginx {{.metadata.name}}: nginx
I0113 10:32:04.969922   54599 horizontal.go:353] Horizontal Pod Autoscaler frontend has been deleted in namespace-1578911507-16427
Successful
message:Error from server (Conflict): error when applying patch:
{"metadata":{"annotations":{"kubectl.kubernetes.io/last-applied-configuration":"{\"apiVersion\":\"apps/v1\",\"kind\":\"Deployment\",\"metadata\":{\"annotations\":{},\"labels\":{\"name\":\"nginx\"},\"name\":\"nginx\",\"namespace\":\"namespace-1578911518-6526\",\"resourceVersion\":\"99\"},\"spec\":{\"replicas\":3,\"selector\":{\"matchLabels\":{\"name\":\"nginx2\"}},\"template\":{\"metadata\":{\"labels\":{\"name\":\"nginx2\"}},\"spec\":{\"containers\":[{\"image\":\"k8s.gcr.io/nginx:test-cmd\",\"name\":\"nginx\",\"ports\":[{\"containerPort\":80}]}]}}}}\n"},"resourceVersion":"99"},"spec":{"selector":{"matchLabels":{"name":"nginx2"}},"template":{"metadata":{"labels":{"name":"nginx2"}}}}}
to:
Resource: "apps/v1, Resource=deployments", GroupVersionKind: "apps/v1, Kind=Deployment"
Name: "nginx", Namespace: "namespace-1578911518-6526"
for: "hack/testdata/deployment-label-change2.yaml": Operation cannot be fulfilled on deployments.apps "nginx": the object has been modified; please apply your changes to the latest version and try again
has:Error from server (Conflict)
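The Conflict above is the apiserver's optimistic-concurrency check: the last-applied configuration embeds resourceVersion "99", which no longer matches the live Deployment. A sketch of the usual ways out (same manifest and cluster assumed):
  kubectl get deployment nginx -o jsonpath='{.metadata.resourceVersion}'   # inspect the version the update must be based on
  kubectl apply -f hack/testdata/deployment-label-change2.yaml --force     # or delete-and-recreate when the change (here a new selector) cannot be merged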
deployment.apps/nginx configured
I0113 10:32:11.121263   54599 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578911518-6526", Name:"nginx", UID:"ebe84801-20a9-4be9-ad9c-a842021597a9", APIVersion:"apps/v1", ResourceVersion:"671", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-668b6c7744 to 3
I0113 10:32:11.123424   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911518-6526", Name:"nginx-668b6c7744", UID:"430b4eb5-443c-41ed-ba99-61561c41b471", APIVersion:"apps/v1", ResourceVersion:"672", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-668b6c7744-5v7gt
I0113 10:32:11.132886   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911518-6526", Name:"nginx-668b6c7744", UID:"430b4eb5-443c-41ed-ba99-61561c41b471", APIVersion:"apps/v1", ResourceVersion:"672", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-668b6c7744-9kpc7
I0113 10:32:11.133238   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911518-6526", Name:"nginx-668b6c7744", UID:"430b4eb5-443c-41ed-ba99-61561c41b471", APIVersion:"apps/v1", ResourceVersion:"672", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-668b6c7744-k4g4j
Successful
message:        "name": "nginx2"
          "name": "nginx2"
has:"name": "nginx2"
E0113 10:32:15.542885   54599 replica_set.go:534] sync "namespace-1578911518-6526/nginx-668b6c7744" failed with Operation cannot be fulfilled on replicasets.apps "nginx-668b6c7744": StorageError: invalid object, Code: 4, Key: /registry/replicasets/namespace-1578911518-6526/nginx-668b6c7744, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 430b4eb5-443c-41ed-ba99-61561c41b471, UID in object meta: 
I0113 10:32:16.515249   54599 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578911518-6526", Name:"nginx", UID:"9e6d9c1f-52ec-4c18-97b6-d0340057275a", APIVersion:"apps/v1", ResourceVersion:"706", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-668b6c7744 to 3
I0113 10:32:16.520324   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911518-6526", Name:"nginx-668b6c7744", UID:"2be509b1-4671-46ba-a191-2e71a42d309b", APIVersion:"apps/v1", ResourceVersion:"707", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-668b6c7744-9bl9s
I0113 10:32:16.524993   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911518-6526", Name:"nginx-668b6c7744", UID:"2be509b1-4671-46ba-a191-2e71a42d309b", APIVersion:"apps/v1", ResourceVersion:"707", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-668b6c7744-z2vw7
I0113 10:32:16.531900   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911518-6526", Name:"nginx-668b6c7744", UID:"2be509b1-4671-46ba-a191-2e71a42d309b", APIVersion:"apps/v1", ResourceVersion:"707", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-668b6c7744-7qdkm
Successful
message:The Deployment "nginx" is invalid: spec.template.metadata.labels: Invalid value: map[string]string{"name":"nginx3"}: `selector` does not match template `labels`
... skipping 160 lines ...
+++ [0113 10:32:18] Creating namespace namespace-1578911538-19809
namespace/namespace-1578911538-19809 created
Context "test" modified.
+++ [0113 10:32:19] Testing kubectl get
get.sh:29: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:Error from server (NotFound): pods "abc" not found
has:pods "abc" not found
get.sh:37: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:Error from server (NotFound): pods "abc" not found
has:pods "abc" not found
get.sh:45: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:{
    "apiVersion": "v1",
    "items": [],
... skipping 23 lines ...
has not:No resources found
Successful
message:NAME
has not:No resources found
get.sh:73: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:error: the server doesn't have a resource type "foobar"
has not:No resources found
Successful
message:No resources found in namespace-1578911538-19809 namespace.
has:No resources found
Successful
message:
has not:No resources found
Successful
message:No resources found in namespace-1578911538-19809 namespace.
has:No resources found
get.sh:93: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:Error from server (NotFound): pods "abc" not found
has:pods "abc" not found
Successful
message:Error from server (NotFound): pods "abc" not found
has not:List
Successful
message:I0113 10:32:21.571486   65088 loader.go:375] Config loaded from file:  /tmp/tmp.TJfwolnO0m/.kube/config
I0113 10:32:21.573085   65088 round_trippers.go:443] GET http://127.0.0.1:8080/version?timeout=32s 200 OK in 0 milliseconds
I0113 10:32:21.608568   65088 round_trippers.go:443] GET http://127.0.0.1:8080/api/v1/namespaces/default/pods 200 OK in 2 milliseconds
I0113 10:32:21.610463   65088 round_trippers.go:443] GET http://127.0.0.1:8080/api/v1/namespaces/default/replicationcontrollers 200 OK in 1 milliseconds
... skipping 479 lines ...
Successful
message:NAME    DATA   AGE
one     0      0s
three   0      0s
two     0      0s
STATUS    REASON          MESSAGE
Failure   InternalError   an error on the server ("unable to decode an event from the watch stream: net/http: request canceled (Client.Timeout exceeded while reading body)") has prevented the request from succeeding
has not:watch is only supported on individual resources
Successful
message:STATUS    REASON          MESSAGE
Failure   InternalError   an error on the server ("unable to decode an event from the watch stream: net/http: request canceled (Client.Timeout exceeded while reading body)") has prevented the request from succeeding
has not:watch is only supported on individual resources
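Both InternalError statuses above are expected: the suite opens a watch with a short client-side timeout, so the stream is cut while still open. A sketch of the pattern (the exact resource and duration are assumptions):
  kubectl get configmaps --watch --request-timeout=1s   # the client timeout tears down the watch, producing the decode error seen above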
+++ [0113 10:32:28] Creating namespace namespace-1578911548-19528
namespace/namespace-1578911548-19528 created
Context "test" modified.
get.sh:153: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
pod/valid-pod created
... skipping 105 lines ...
}
get.sh:158: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
<no value>Successful
message:valid-pod:
has:valid-pod:
Successful
message:error: error executing jsonpath "{.missing}": Error executing template: missing is not found. Printing more information for debugging the template:
	template was:
		{.missing}
	object given to jsonpath engine was:
		map[string]interface {}{"apiVersion":"v1", "kind":"Pod", "metadata":map[string]interface {}{"creationTimestamp":"2020-01-13T10:32:29Z", "labels":map[string]interface {}{"name":"valid-pod"}, "managedFields":[]interface {}{map[string]interface {}{"apiVersion":"v1", "fieldsType":"FieldsV1", "fieldsV1":map[string]interface {}{"f:metadata":map[string]interface {}{"f:labels":map[string]interface {}{".":map[string]interface {}{}, "f:name":map[string]interface {}{}}}, "f:spec":map[string]interface {}{"f:containers":map[string]interface {}{"k:{\"name\":\"kubernetes-serve-hostname\"}":map[string]interface {}{".":map[string]interface {}{}, "f:image":map[string]interface {}{}, "f:imagePullPolicy":map[string]interface {}{}, "f:name":map[string]interface {}{}, "f:resources":map[string]interface {}{".":map[string]interface {}{}, "f:limits":map[string]interface {}{".":map[string]interface {}{}, "f:cpu":map[string]interface {}{}, "f:memory":map[string]interface {}{}}, "f:requests":map[string]interface {}{".":map[string]interface {}{}, "f:cpu":map[string]interface {}{}, "f:memory":map[string]interface {}{}}}, "f:terminationMessagePath":map[string]interface {}{}, "f:terminationMessagePolicy":map[string]interface {}{}}}, "f:dnsPolicy":map[string]interface {}{}, "f:enableServiceLinks":map[string]interface {}{}, "f:priority":map[string]interface {}{}, "f:restartPolicy":map[string]interface {}{}, "f:schedulerName":map[string]interface {}{}, "f:securityContext":map[string]interface {}{}, "f:terminationGracePeriodSeconds":map[string]interface {}{}}}, "manager":"kubectl", "operation":"Update", "time":"2020-01-13T10:32:29Z"}}, "name":"valid-pod", "namespace":"namespace-1578911548-19528", "resourceVersion":"755", "selfLink":"/api/v1/namespaces/namespace-1578911548-19528/pods/valid-pod", "uid":"fb7dbf5d-1853-4280-875d-a08aca84c0c4"}, "spec":map[string]interface {}{"containers":[]interface {}{map[string]interface {}{"image":"k8s.gcr.io/serve_hostname", "imagePullPolicy":"Always", "name":"kubernetes-serve-hostname", "resources":map[string]interface {}{"limits":map[string]interface {}{"cpu":"1", "memory":"512Mi"}, "requests":map[string]interface {}{"cpu":"1", "memory":"512Mi"}}, "terminationMessagePath":"/dev/termination-log", "terminationMessagePolicy":"File"}}, "dnsPolicy":"ClusterFirst", "enableServiceLinks":true, "priority":0, "restartPolicy":"Always", "schedulerName":"default-scheduler", "securityContext":map[string]interface {}{}, "terminationGracePeriodSeconds":30}, "status":map[string]interface {}{"phase":"Pending", "qosClass":"Guaranteed"}}
has:missing is not found
error: error executing template "{{.missing}}": template: output:1:2: executing "output" at <.missing>: map has no entry for key "missing"
Successful
message:Error executing template: template: output:1:2: executing "output" at <.missing>: map has no entry for key "missing". Printing more information for debugging the template:
	template was:
		{{.missing}}
	raw data was:
		{"apiVersion":"v1","kind":"Pod","metadata":{"creationTimestamp":"2020-01-13T10:32:29Z","labels":{"name":"valid-pod"},"managedFields":[{"apiVersion":"v1","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:labels":{".":{},"f:name":{}}},"f:spec":{"f:containers":{"k:{\"name\":\"kubernetes-serve-hostname\"}":{".":{},"f:image":{},"f:imagePullPolicy":{},"f:name":{},"f:resources":{".":{},"f:limits":{".":{},"f:cpu":{},"f:memory":{}},"f:requests":{".":{},"f:cpu":{},"f:memory":{}}},"f:terminationMessagePath":{},"f:terminationMessagePolicy":{}}},"f:dnsPolicy":{},"f:enableServiceLinks":{},"f:priority":{},"f:restartPolicy":{},"f:schedulerName":{},"f:securityContext":{},"f:terminationGracePeriodSeconds":{}}},"manager":"kubectl","operation":"Update","time":"2020-01-13T10:32:29Z"}],"name":"valid-pod","namespace":"namespace-1578911548-19528","resourceVersion":"755","selfLink":"/api/v1/namespaces/namespace-1578911548-19528/pods/valid-pod","uid":"fb7dbf5d-1853-4280-875d-a08aca84c0c4"},"spec":{"containers":[{"image":"k8s.gcr.io/serve_hostname","imagePullPolicy":"Always","name":"kubernetes-serve-hostname","resources":{"limits":{"cpu":"1","memory":"512Mi"},"requests":{"cpu":"1","memory":"512Mi"}},"terminationMessagePath":"/dev/termination-log","terminationMessagePolicy":"File"}],"dnsPolicy":"ClusterFirst","enableServiceLinks":true,"priority":0,"restartPolicy":"Always","schedulerName":"default-scheduler","securityContext":{},"terminationGracePeriodSeconds":30},"status":{"phase":"Pending","qosClass":"Guaranteed"}}
	object given to template engine was:
		map[apiVersion:v1 kind:Pod metadata:map[creationTimestamp:2020-01-13T10:32:29Z labels:map[name:valid-pod] managedFields:[map[apiVersion:v1 fieldsType:FieldsV1 fieldsV1:map[f:metadata:map[f:labels:map[.:map[] f:name:map[]]] f:spec:map[f:containers:map[k:{"name":"kubernetes-serve-hostname"}:map[.:map[] f:image:map[] f:imagePullPolicy:map[] f:name:map[] f:resources:map[.:map[] f:limits:map[.:map[] f:cpu:map[] f:memory:map[]] f:requests:map[.:map[] f:cpu:map[] f:memory:map[]]] f:terminationMessagePath:map[] f:terminationMessagePolicy:map[]]] f:dnsPolicy:map[] f:enableServiceLinks:map[] f:priority:map[] f:restartPolicy:map[] f:schedulerName:map[] f:securityContext:map[] f:terminationGracePeriodSeconds:map[]]] manager:kubectl operation:Update time:2020-01-13T10:32:29Z]] name:valid-pod namespace:namespace-1578911548-19528 resourceVersion:755 selfLink:/api/v1/namespaces/namespace-1578911548-19528/pods/valid-pod uid:fb7dbf5d-1853-4280-875d-a08aca84c0c4] spec:map[containers:[map[image:k8s.gcr.io/serve_hostname imagePullPolicy:Always name:kubernetes-serve-hostname resources:map[limits:map[cpu:1 memory:512Mi] requests:map[cpu:1 memory:512Mi]] terminationMessagePath:/dev/termination-log terminationMessagePolicy:File]] dnsPolicy:ClusterFirst enableServiceLinks:true priority:0 restartPolicy:Always schedulerName:default-scheduler securityContext:map[] terminationGracePeriodSeconds:30] status:map[phase:Pending qosClass:Guaranteed]]
has:map has no entry for key "missing"
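The two template failures above contrast the jsonpath and go-template output paths; both fail loudly on a key that does not exist in the object. Against the same pod:
  kubectl get pod valid-pod -o jsonpath='{.metadata.name}'        # prints: valid-pod
  kubectl get pod valid-pod -o jsonpath='{.missing}'              # error: missing is not found
  kubectl get pod valid-pod -o go-template='{{.metadata.name}}'   # prints: valid-pod
  kubectl get pod valid-pod -o go-template='{{.missing}}'         # error: map has no entry for key "missing"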
Successful
message:NAME        READY   STATUS    RESTARTS   AGE
valid-pod   0/1     Pending   0          0s
STATUS      REASON          MESSAGE
Failure     InternalError   an error on the server ("unable to decode an event from the watch stream: net/http: request canceled (Client.Timeout exceeded while reading body)") has prevented the request from succeeding
has:STATUS
Successful
message:NAME        READY   STATUS    RESTARTS   AGE
valid-pod   0/1     Pending   0          0s
STATUS      REASON          MESSAGE
Failure     InternalError   an error on the server ("unable to decode an event from the watch stream: net/http: request canceled (Client.Timeout exceeded while reading body)") has prevented the request from succeeding
has:valid-pod
Successful
message:pod/valid-pod
status/<unknown>
has not:STATUS
Successful
... skipping 82 lines ...
      (Client.Timeout exceeded while reading body)'
    reason: UnexpectedServerResponse
  - message: 'unable to decode an event from the watch stream: net/http: request canceled
      (Client.Timeout exceeded while reading body)'
    reason: ClientWatchDecoding
kind: Status
message: 'an error on the server ("unable to decode an event from the watch stream:
  net/http: request canceled (Client.Timeout exceeded while reading body)") has prevented
  the request from succeeding'
metadata: {}
reason: InternalError
status: Failure
has not:STATUS
... skipping 79 lines ...
      (Client.Timeout exceeded while reading body)'
    reason: UnexpectedServerResponse
  - message: 'unable to decode an event from the watch stream: net/http: request canceled
      (Client.Timeout exceeded while reading body)'
    reason: ClientWatchDecoding
kind: Status
message: 'an error on the server ("unable to decode an event from the watch stream:
  net/http: request canceled (Client.Timeout exceeded while reading body)") has prevented
  the request from succeeding'
metadata: {}
reason: InternalError
status: Failure
has:name: valid-pod
Successful
message:Error from server (NotFound): pods "invalid-pod" not found
has:"invalid-pod" not found
pod "valid-pod" deleted
get.sh:196: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
pod/redis-master created
pod/valid-pod created
Successful
... skipping 35 lines ...
+++ command: run_kubectl_exec_pod_tests
+++ [0113 10:32:35] Creating namespace namespace-1578911555-14024
namespace/namespace-1578911555-14024 created
Context "test" modified.
+++ [0113 10:32:35] Testing kubectl exec POD COMMAND
Successful
message:Error from server (NotFound): pods "abc" not found
has:pods "abc" not found
pod/test-pod created
Successful
message:Error from server (BadRequest): pod test-pod does not have a host assigned
has not:pods "test-pod" not found
Successful
message:Error from server (BadRequest): pod test-pod does not have a host assigned
has not:pod or type/name must be specified
pod "test-pod" deleted
+++ exit code: 0
Recording: run_kubectl_exec_resource_name_tests
Running command: run_kubectl_exec_resource_name_tests

... skipping 2 lines ...
+++ command: run_kubectl_exec_resource_name_tests
+++ [0113 10:32:35] Creating namespace namespace-1578911555-4160
namespace/namespace-1578911555-4160 created
Context "test" modified.
+++ [0113 10:32:36] Testing kubectl exec TYPE/NAME COMMAND
Successful
message:error: the server doesn't have a resource type "foo"
has:error:
Successful
message:Error from server (NotFound): deployments.apps "bar" not found
has:"bar" not found
pod/test-pod created
replicaset.apps/frontend created
I0113 10:32:36.928082   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911555-4160", Name:"frontend", UID:"505d1ffa-7e86-4b3f-9dd3-bbef15e0f717", APIVersion:"apps/v1", ResourceVersion:"813", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-ddmgv
I0113 10:32:36.937832   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911555-4160", Name:"frontend", UID:"505d1ffa-7e86-4b3f-9dd3-bbef15e0f717", APIVersion:"apps/v1", ResourceVersion:"813", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-c9sx5
I0113 10:32:36.937889   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911555-4160", Name:"frontend", UID:"505d1ffa-7e86-4b3f-9dd3-bbef15e0f717", APIVersion:"apps/v1", ResourceVersion:"813", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-xxxg4
configmap/test-set-env-config created
Successful
message:error: cannot attach to *v1.ConfigMap: selector for *v1.ConfigMap not implemented
has:not implemented
Successful
message:Error from server (BadRequest): pod test-pod does not have a host assigned
has not:not found
Successful
message:Error from server (BadRequest): pod test-pod does not have a host assigned
has not:pod or type/name must be specified
Successful
message:Error from server (BadRequest): pod frontend-c9sx5 does not have a host assigned
has not:not found
Successful
message:Error from server (BadRequest): pod frontend-c9sx5 does not have a host assigned
has not:pod or type/name must be specified
pod "test-pod" deleted
replicaset.apps "frontend" deleted
configmap "test-set-env-config" deleted
+++ exit code: 0
Recording: run_create_secret_tests
Running command: run_create_secret_tests

+++ Running case: test-cmd.run_create_secret_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_create_secret_tests
Successful
message:Error from server (NotFound): secrets "mysecret" not found
has:secrets "mysecret" not found
Successful
message:Error from server (NotFound): secrets "mysecret" not found
has:secrets "mysecret" not found
Successful
message:user-specified
has:user-specified
Successful
{"kind":"ConfigMap","apiVersion":"v1","metadata":{"name":"tester-update-cm","namespace":"default","selfLink":"/api/v1/namespaces/default/configmaps/tester-update-cm","uid":"8aa49ba7-f40d-4fd1-9145-74b460388568","resourceVersion":"833","creationTimestamp":"2020-01-13T10:32:38Z"}}
... skipping 2 lines ...
has:uid
Successful
message:{"kind":"ConfigMap","apiVersion":"v1","metadata":{"name":"tester-update-cm","namespace":"default","selfLink":"/api/v1/namespaces/default/configmaps/tester-update-cm","uid":"8aa49ba7-f40d-4fd1-9145-74b460388568","resourceVersion":"836","creationTimestamp":"2020-01-13T10:32:38Z"},"data":{"key1":"config1"}}
has:config1
{"kind":"Status","apiVersion":"v1","metadata":{},"status":"Success","details":{"name":"tester-update-cm","kind":"configmaps","uid":"8aa49ba7-f40d-4fd1-9145-74b460388568"}}
Successful
message:Error from server (NotFound): configmaps "tester-update-cm" not found
has:configmaps "tester-update-cm" not found
+++ exit code: 0
Recording: run_kubectl_create_kustomization_directory_tests
Running command: run_kubectl_create_kustomization_directory_tests

+++ Running case: test-cmd.run_kubectl_create_kustomization_directory_tests 
... skipping 110 lines ...
valid-pod   0/1     Pending   0          1s
has:valid-pod
Successful
message:NAME        READY   STATUS    RESTARTS   AGE
valid-pod   0/1     Pending   0          1s
STATUS      REASON          MESSAGE
Failure     InternalError   an error on the server ("unable to decode an event from the watch stream: net/http: request canceled (Client.Timeout exceeded while reading body)") has prevented the request from succeeding
has:Timeout exceeded while reading body
Successful
message:NAME        READY   STATUS    RESTARTS   AGE
valid-pod   0/1     Pending   0          2s
has:valid-pod
Successful
message:error: Invalid timeout value. Timeout must be a single integer in seconds, or an integer followed by a corresponding time unit (e.g. 1s | 2m | 3h)
has:Invalid timeout value
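The 'Invalid timeout value' error is flag parsing rather than a server response: the timeout must be a bare integer (seconds) or an integer with a unit. A sketch, assuming the flag being exercised here is --request-timeout:
  kubectl get pod valid-pod --request-timeout=5s   # accepted
  kubectl get pod valid-pod --request-timeout=5x   # error: Invalid timeout value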
pod "valid-pod" deleted
+++ exit code: 0
Recording: run_crd_tests
Running command: run_crd_tests

... skipping 170 lines ...
I0113 10:32:50.818002   54599 resource_quota_monitor.go:228] QuotaMonitor created object count evaluator for resources.mygroup.example.com
I0113 10:32:50.818253   54599 shared_informer.go:206] Waiting for caches to sync for resource quota
I0113 10:32:50.855922   51163 client.go:361] parsed scheme: "endpoint"
I0113 10:32:50.855975   51163 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 10:32:50.918559   54599 shared_informer.go:213] Caches are synced for resource quota 
crd.sh:240: Successful get foos/test {{.patched}}: <no value>
+++ [0113 10:32:51] "kubectl patch --local" returns error as expected for CustomResource: error: cannot apply strategic merge patch for company.com/v1, Kind=Foo locally, try --type merge
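Strategic merge patches need the Go struct schema of the target type, which kubectl does not have for custom resources, hence the "try --type merge" hint. The recorded change-cause a few lines below shows the working form, roughly:
  kubectl patch foos/test --type=merge -p '{"patched":null}'   # a JSON merge patch works for CRs; the default strategic type does not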
{
    "apiVersion": "company.com/v1",
    "kind": "Foo",
    "metadata": {
        "annotations": {
            "kubernetes.io/change-cause": "kubectl patch foos/test --server=http://127.0.0.1:8080 --match-server-version=true --patch={\"patched\":null} --type=merge --record=true"
... skipping 182 lines ...
crd.sh:450: Successful get bars {{range.items}}{{.metadata.name}}:{{end}}: 
namespace/non-native-resources created
bar.company.com/test created
crd.sh:455: Successful get bars {{len .items}}: 1
namespace "non-native-resources" deleted
crd.sh:458: Successful get bars {{len .items}}: 0
Error from server (NotFound): namespaces "non-native-resources" not found
customresourcedefinition.apiextensions.k8s.io "foos.company.com" deleted
customresourcedefinition.apiextensions.k8s.io "bars.company.com" deleted
customresourcedefinition.apiextensions.k8s.io "resources.mygroup.example.com" deleted
customresourcedefinition.apiextensions.k8s.io "validfoos.company.com" deleted
+++ exit code: 0
Recording: run_cmd_with_img_tests
... skipping 11 lines ...
I0113 10:33:09.039202   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911588-15478", Name:"test1-6cdffdb5b8", UID:"720e2725-2b84-4309-961c-c40c4858840c", APIVersion:"apps/v1", ResourceVersion:"1002", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test1-6cdffdb5b8-v96rq
Successful
message:deployment.apps/test1 created
has:deployment.apps/test1 created
deployment.apps "test1" deleted
W0113 10:33:09.156593   51163 cacher.go:162] Terminating all watchers from cacher *unstructured.Unstructured
E0113 10:33:09.157690   54599 reflector.go:320] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to watch *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:error: Invalid image name "InvalidImageName": invalid reference format
has:error: Invalid image name "InvalidImageName": invalid reference format
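Image references are validated client-side against the container image reference grammar (lowercase repository names, optional tag or digest), which is why "InvalidImageName" is rejected. A sketch of both cases (the generator command used by the suite is an assumption):
  kubectl create deployment test1 --image=k8s.gcr.io/pause:2.0   # valid reference
  kubectl create deployment test2 --image=InvalidImageName       # error: invalid reference format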
+++ exit code: 0
W0113 10:33:09.295221   51163 cacher.go:162] Terminating all watchers from cacher *unstructured.Unstructured
E0113 10:33:09.296555   54599 reflector.go:320] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to watch *v1.PartialObjectMetadata: the server could not find the requested resource
+++ [0113 10:33:09] Testing recursive resources
+++ [0113 10:33:09] Creating namespace namespace-1578911589-21614
namespace/namespace-1578911589-21614 created
W0113 10:33:09.418603   51163 cacher.go:162] Terminating all watchers from cacher *unstructured.Unstructured
E0113 10:33:09.419401   54599 reflector.go:320] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to watch *v1.PartialObjectMetadata: the server could not find the requested resource
Context "test" modified.
W0113 10:33:09.548021   51163 cacher.go:162] Terminating all watchers from cacher *unstructured.Unstructured
E0113 10:33:09.549331   54599 reflector.go:320] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to watch *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:202: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
generic-resources.sh:206: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
Successful
message:pod/busybox0 created
pod/busybox1 created
error: error validating "hack/testdata/recursive/pod/pod/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
has:error validating data: kind not set
E0113 10:33:10.158996   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:211: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
E0113 10:33:10.297886   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:220: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: busybox:busybox:
Successful
message:error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
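Every "Object 'Kind' is missing" failure in this recursive section comes from the same broken fixture, which spells the field "ind" instead of "kind". For contrast, a minimal valid manifest with the required apiVersion/kind/metadata trio (hypothetical, piped straight to kubectl) would look like:
  cat <<'EOF' | kubectl apply -f -
  apiVersion: v1
  kind: Pod
  metadata:
    name: busybox2
    labels:
      app: busybox2
  spec:
    containers:
    - name: busybox
      image: busybox
      command: ["sleep", "3600"]
  EOF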
E0113 10:33:10.420908   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:10.550465   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:227: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
generic-resources.sh:231: Successful get pods {{range.items}}{{.metadata.labels.status}}:{{end}}: replaced:replaced:
Successful
message:pod/busybox0 replaced
pod/busybox1 replaced
error: error validating "hack/testdata/recursive/pod-modify/pod/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
has:error validating data: kind not set
generic-resources.sh:236: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
E0113 10:33:11.160623   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:Name:         busybox0
Namespace:    namespace-1578911589-21614
Priority:     0
Node:         <none>
Labels:       app=busybox0
... skipping 153 lines ...
QoS Class:        BestEffort
Node-Selectors:   <none>
Tolerations:      <none>
Events:           <none>
unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
E0113 10:33:11.299239   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:246: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
E0113 10:33:11.422250   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:11.551804   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:250: Successful get pods {{range.items}}{{.metadata.annotations.annotatekey}}:{{end}}: annotatevalue:annotatevalue:
Successful
message:pod/busybox0 annotated
pod/busybox1 annotated
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
generic-resources.sh:255: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
generic-resources.sh:259: Successful get pods {{range.items}}{{.metadata.labels.status}}:{{end}}: replaced:replaced:
Successful
message:error: error validating "hack/testdata/recursive/pod-modify/pod/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
has:error validating data: kind not set
E0113 10:33:12.161869   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:265: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
E0113 10:33:12.300546   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:12.423691   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx created
I0113 10:33:12.512616   54599 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578911589-21614", Name:"nginx", UID:"96f572e0-b965-4ad0-adeb-c78aebd6a10e", APIVersion:"apps/v1", ResourceVersion:"1026", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-f87d999f7 to 3
I0113 10:33:12.517990   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911589-21614", Name:"nginx-f87d999f7", UID:"965acfbf-aeed-45b9-b75f-98dbfba08f4e", APIVersion:"apps/v1", ResourceVersion:"1027", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-f87d999f7-tgl9l
I0113 10:33:12.521245   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911589-21614", Name:"nginx-f87d999f7", UID:"965acfbf-aeed-45b9-b75f-98dbfba08f4e", APIVersion:"apps/v1", ResourceVersion:"1027", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-f87d999f7-9r7qc
I0113 10:33:12.523371   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911589-21614", Name:"nginx-f87d999f7", UID:"965acfbf-aeed-45b9-b75f-98dbfba08f4e", APIVersion:"apps/v1", ResourceVersion:"1027", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-f87d999f7-54rsj
E0113 10:33:12.552715   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:269: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx:
generic-resources.sh:270: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
I0113 10:33:12.811427   54599 namespace_controller.go:185] Namespace has been deleted non-native-resources
kubectl convert is DEPRECATED and will be removed in a future version.
In order to convert, kubectl apply the object to the cluster, then kubectl get at the desired version.
generic-resources.sh:274: Successful get deployment nginx {{ .apiVersion }}: apps/v1
... skipping 38 lines ...
      schedulerName: default-scheduler
      securityContext: {}
      terminationGracePeriodSeconds: 30
status: {}
has:extensions/v1beta1
deployment.apps "nginx" deleted
E0113 10:33:13.163249   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:281: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
E0113 10:33:13.301872   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:13.425081   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:285: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
Successful
message:kubectl convert is DEPRECATED and will be removed in a future version.
In order to convert, kubectl apply the object to the cluster, then kubectl get at the desired version.
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
E0113 10:33:13.554031   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:290: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
Successful
message:busybox0:busybox1:error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:busybox0:busybox1:
Successful
message:busybox0:busybox1:error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
generic-resources.sh:299: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
pod/busybox0 labeled
pod/busybox1 labeled
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
generic-resources.sh:304: Successful get pods {{range.items}}{{.metadata.labels.mylabel}}:{{end}}: myvalue:myvalue:
Successful
message:pod/busybox0 labeled
pod/busybox1 labeled
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
generic-resources.sh:309: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
E0113 10:33:14.164671   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
pod/busybox0 patched
pod/busybox1 patched
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
E0113 10:33:14.303136   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:314: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: prom/busybox:prom/busybox:
Successful
message:pod/busybox0 patched
pod/busybox1 patched
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
E0113 10:33:14.426403   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:319: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
E0113 10:33:14.555326   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:323: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
pod "busybox0" force deleted
pod "busybox1" force deleted
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
generic-resources.sh:328: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
replicationcontroller/busybox0 created
I0113 10:33:15.047340   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578911589-21614", Name:"busybox0", UID:"04a219b9-4045-48b0-a205-462a3f64d9f4", APIVersion:"v1", ResourceVersion:"1060", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox0-kcwpv
replicationcontroller/busybox1 created
error: error validating "hack/testdata/recursive/rc/rc/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
I0113 10:33:15.052797   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578911589-21614", Name:"busybox1", UID:"0df00cf7-68d0-4902-b1aa-1b15157893fa", APIVersion:"v1", ResourceVersion:"1062", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox1-95ldr
E0113 10:33:15.165792   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:332: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
generic-resources.sh:337: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
E0113 10:33:15.304386   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:338: Successful get rc busybox0 {{.spec.replicas}}: 1
E0113 10:33:15.428097   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:339: Successful get rc busybox1 {{.spec.replicas}}: 1
E0113 10:33:15.556819   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:344: Successful get hpa busybox0 {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 1 2 80
generic-resources.sh:345: Successful get hpa busybox1 {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 1 2 80
Successful
message:horizontalpodautoscaler.autoscaling/busybox0 autoscaled
horizontalpodautoscaler.autoscaling/busybox1 autoscaled
error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
has:Object 'Kind' is missing
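The min/max/target values asserted at generic-resources.sh:344 and :345 correspond to an autoscale call run over the recursive fixture directory; per object the equivalent is roughly (the recursive -f form used by the suite is an assumption):
  kubectl autoscale rc busybox0 --min=1 --max=2 --cpu-percent=80   # creates the HPA whose spec is checked above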
horizontalpodautoscaler.autoscaling "busybox0" deleted
horizontalpodautoscaler.autoscaling "busybox1" deleted
E0113 10:33:16.166996   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:353: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
E0113 10:33:16.305562   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:354: Successful get rc busybox0 {{.spec.replicas}}: 1
generic-resources.sh:355: Successful get rc busybox1 {{.spec.replicas}}: 1
E0113 10:33:16.429426   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:16.558140   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:359: Successful get service busybox0 {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: <no value> 80
generic-resources.sh:360: Successful get service busybox1 {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: <no value> 80
Successful
message:service/busybox0 exposed
service/busybox1 exposed
error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
has:Object 'Kind' is missing
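The unnamed port 80 asserted at generic-resources.sh:359 and :360 matches a plain expose of each replication controller, roughly:
  kubectl expose rc busybox0 --port=80   # service/busybox0 with a single unnamed port, as checked above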
generic-resources.sh:366: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
generic-resources.sh:367: Successful get rc busybox0 {{.spec.replicas}}: 1
generic-resources.sh:368: Successful get rc busybox1 {{.spec.replicas}}: 1
E0113 10:33:17.168272   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0113 10:33:17.263074   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578911589-21614", Name:"busybox0", UID:"04a219b9-4045-48b0-a205-462a3f64d9f4", APIVersion:"v1", ResourceVersion:"1082", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox0-vctfs
I0113 10:33:17.274215   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578911589-21614", Name:"busybox1", UID:"0df00cf7-68d0-4902-b1aa-1b15157893fa", APIVersion:"v1", ResourceVersion:"1086", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox1-2rjn8
E0113 10:33:17.306978   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:372: Successful get rc busybox0 {{.spec.replicas}}: 2
E0113 10:33:17.430681   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:373: Successful get rc busybox1 {{.spec.replicas}}: 2
Successful
message:replicationcontroller/busybox0 scaled
replicationcontroller/busybox1 scaled
error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
has:Object 'Kind' is missing
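The jump from 1 to 2 replicas asserted at generic-resources.sh:372 and :373 is a scale applied across the recursive directory; per object it is equivalent to roughly:
  kubectl scale --replicas=2 rc/busybox0 rc/busybox1   # both controllers report .spec.replicas: 2 afterwards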
E0113 10:33:17.559609   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:378: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
generic-resources.sh:382: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
replicationcontroller "busybox0" force deleted
replicationcontroller "busybox1" force deleted
error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
has:Object 'Kind' is missing
generic-resources.sh:387: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
deployment.apps/nginx1-deployment created
I0113 10:33:18.152603   54599 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578911589-21614", Name:"nginx1-deployment", UID:"439c7ad1-c7eb-4d12-8beb-7c2893f1023a", APIVersion:"apps/v1", ResourceVersion:"1102", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx1-deployment-7bdbbfb5cf to 2
deployment.apps/nginx0-deployment created
error: error validating "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
I0113 10:33:18.157655   54599 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578911589-21614", Name:"nginx0-deployment", UID:"64eba580-ab93-4e0c-af59-adcec61ca564", APIVersion:"apps/v1", ResourceVersion:"1104", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx0-deployment-57c6bff7f6 to 2
I0113 10:33:18.157682   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911589-21614", Name:"nginx1-deployment-7bdbbfb5cf", UID:"51f2aaff-0112-4942-a4a8-e5f7c40804a8", APIVersion:"apps/v1", ResourceVersion:"1103", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx1-deployment-7bdbbfb5cf-zgtqn
E0113 10:33:18.170474   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0113 10:33:18.170550   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911589-21614", Name:"nginx0-deployment-57c6bff7f6", UID:"d108f849-3eff-4e56-89a4-be0b8b615ce1", APIVersion:"apps/v1", ResourceVersion:"1108", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx0-deployment-57c6bff7f6-t2ltg
I0113 10:33:18.170585   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911589-21614", Name:"nginx1-deployment-7bdbbfb5cf", UID:"51f2aaff-0112-4942-a4a8-e5f7c40804a8", APIVersion:"apps/v1", ResourceVersion:"1103", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx1-deployment-7bdbbfb5cf-s88bz
I0113 10:33:18.174832   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911589-21614", Name:"nginx0-deployment-57c6bff7f6", UID:"d108f849-3eff-4e56-89a4-be0b8b615ce1", APIVersion:"apps/v1", ResourceVersion:"1108", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx0-deployment-57c6bff7f6-sxx6d
generic-resources.sh:391: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx0-deployment:nginx1-deployment:
E0113 10:33:18.308555   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:392: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:k8s.gcr.io/nginx:1.7.9:
E0113 10:33:18.432258   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:18.560940   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:396: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:k8s.gcr.io/nginx:1.7.9:
Successful
message:deployment.apps/nginx1-deployment skipped rollback (current template already matches revision 1)
deployment.apps/nginx0-deployment skipped rollback (current template already matches revision 1)
error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
has:Object 'Kind' is missing
deployment.apps/nginx1-deployment paused
deployment.apps/nginx0-deployment paused
generic-resources.sh:404: Successful get deployment {{range.items}}{{.spec.paused}}:{{end}}: true:true:
Successful
message:unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
has:Object 'Kind' is missing
deployment.apps/nginx1-deployment resumed
deployment.apps/nginx0-deployment resumed
generic-resources.sh:410: Successful get deployment {{range.items}}{{.spec.paused}}:{{end}}: <no value>:<no value>:
Successful
message:unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
has:Object 'Kind' is missing
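Note: the pause/resume output above is consistent with acting on the whole fixture directory recursively, so the two valid deployments are paused and resumed while the broken file only yields the decode error, and .spec.paused later reads <no value> once resume removes the field. A minimal sketch of that kind of invocation; the exact flags the test script uses are an assumption here:

kubectl rollout pause -f hack/testdata/recursive/deployment --recursive   # pauses nginx0/nginx1, reports the decode error for nginx-broken.yaml
kubectl rollout resume -f hack/testdata/recursive/deployment --recursive  # resumes them; .spec.paused is cleared, hence the <no value> readout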
E0113 10:33:19.171989   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:deployment.apps/nginx1-deployment 
REVISION  CHANGE-CAUSE
1         <none>

deployment.apps/nginx0-deployment 
REVISION  CHANGE-CAUSE
1         <none>

error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
has:nginx0-deployment
Successful
message:deployment.apps/nginx1-deployment 
REVISION  CHANGE-CAUSE
1         <none>

deployment.apps/nginx0-deployment 
REVISION  CHANGE-CAUSE
1         <none>

error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
has:nginx1-deployment
Successful
message:deployment.apps/nginx1-deployment 
REVISION  CHANGE-CAUSE
1         <none>

deployment.apps/nginx0-deployment 
REVISION  CHANGE-CAUSE
1         <none>

error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
has:Object 'Kind' is missing
E0113 10:33:19.309835   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
deployment.apps "nginx1-deployment" force deleted
deployment.apps "nginx0-deployment" force deleted
error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
E0113 10:33:19.433504   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:19.562339   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:20.173332   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:20.311242   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:20.434917   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
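The recurring decode failure in this block comes from the broken fixture itself: in the JSON echoed above the field is spelled "ind" rather than "kind", so the object cannot be decoded. A minimal sketch of the corrected manifest, assuming a reachable test cluster (the apply command is illustrative, not the test's own invocation):

cat <<'EOF' | kubectl apply -f -
apiVersion: apps/v1
kind: Deployment        # the broken fixture spells this field "ind"
metadata:
  name: nginx2-deployment
  labels:
    app: nginx2-deployment
spec:
  replicas: 2
  selector:
    matchLabels:
      app: nginx2
  template:
    metadata:
      labels:
        app: nginx2
    spec:
      containers:
      - name: nginx
        image: k8s.gcr.io/nginx:1.7.9
        ports:
        - containerPort: 80
EOF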
generic-resources.sh:426: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
E0113 10:33:20.563933   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/busybox0 created
I0113 10:33:20.745686   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578911589-21614", Name:"busybox0", UID:"bdeca860-6459-4e3c-9ce1-fa084581052d", APIVersion:"v1", ResourceVersion:"1153", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox0-zslkz
replicationcontroller/busybox1 created
error: error validating "hack/testdata/recursive/rc/rc/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
I0113 10:33:20.752494   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578911589-21614", Name:"busybox1", UID:"a37fe93d-bdb7-44c4-879f-252854e76fda", APIVersion:"v1", ResourceVersion:"1155", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox1-5dblq
generic-resources.sh:430: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
Successful
message:no rollbacker has been implemented for "ReplicationController"
no rollbacker has been implemented for "ReplicationController"
unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
... skipping 2 lines ...
message:no rollbacker has been implemented for "ReplicationController"
no rollbacker has been implemented for "ReplicationController"
unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
has:Object 'Kind' is missing
Successful
message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
error: replicationcontrollers "busybox0" pausing is not supported
error: replicationcontrollers "busybox1" pausing is not supported
has:Object 'Kind' is missing
Successful
message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
error: replicationcontrollers "busybox0" pausing is not supported
error: replicationcontrollers "busybox1" pausing is not supported
has:replicationcontrollers "busybox0" pausing is not supported
Successful
message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
error: replicationcontrollers "busybox0" pausing is not supported
error: replicationcontrollers "busybox1" pausing is not supported
has:replicationcontrollers "busybox1" pausing is not supported
I0113 10:33:21.170554   54599 shared_informer.go:206] Waiting for caches to sync for resource quota
I0113 10:33:21.170604   54599 shared_informer.go:213] Caches are synced for resource quota 
E0113 10:33:21.174616   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
error: replicationcontrollers "busybox0" resuming is not supported
error: replicationcontrollers "busybox1" resuming is not supported
has:Object 'Kind' is missing
Successful
message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
error: replicationcontrollers "busybox0" resuming is not supported
error: replicationcontrollers "busybox1" resuming is not supported
has:replicationcontrollers "busybox0" resuming is not supported
Successful
message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
error: replicationcontrollers "busybox0" resuming is not supported
error: replicationcontrollers "busybox1" resuming is not supported
has:replicationcontrollers "busybox1" resuming is not supported
E0113 10:33:21.312504   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
replicationcontroller "busybox0" force deleted
replicationcontroller "busybox1" force deleted
error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
E0113 10:33:21.436328   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:21.565591   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0113 10:33:21.686969   54599 shared_informer.go:206] Waiting for caches to sync for garbage collector
I0113 10:33:21.687045   54599 shared_informer.go:213] Caches are synced for garbage collector 
E0113 10:33:22.176230   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:22.313850   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Recording: run_namespace_tests
Running command: run_namespace_tests
E0113 10:33:22.437692   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource

+++ Running case: test-cmd.run_namespace_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_namespace_tests
+++ [0113 10:33:22] Testing kubectl(v1:namespaces)
E0113 10:33:22.567179   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/my-namespace created
core.sh:1314: Successful get namespaces/my-namespace {{.metadata.name}}: my-namespace
namespace "my-namespace" deleted
E0113 10:33:23.177558   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:23.315195   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:23.439106   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:23.568801   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:24.179145   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:24.316782   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:24.440798   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:24.570167   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:25.180512   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:25.318116   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:25.442110   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:25.571507   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:26.181762   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:26.319342   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:26.443442   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:26.572906   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:27.183208   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:27.320956   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:27.444977   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:27.574221   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/my-namespace condition met
Successful
message:Error from server (NotFound): namespaces "my-namespace" not found
has: not found
namespace/my-namespace created
E0113 10:33:28.184450   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1323: Successful get namespaces/my-namespace {{.metadata.name}}: my-namespace
E0113 10:33:28.322257   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:28.446099   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:warning: deleting cluster-scoped resources, not scoped to the provided namespace
namespace "kube-node-lease" deleted
namespace "my-namespace" deleted
namespace "namespace-1578911449-14627" deleted
namespace "namespace-1578911452-12383" deleted
... skipping 26 lines ...
namespace "namespace-1578911560-737" deleted
namespace "namespace-1578911561-3328" deleted
namespace "namespace-1578911563-17037" deleted
namespace "namespace-1578911565-29899" deleted
namespace "namespace-1578911588-15478" deleted
namespace "namespace-1578911589-21614" deleted
Error from server (Forbidden): namespaces "default" is forbidden: this namespace may not be deleted
Error from server (Forbidden): namespaces "kube-public" is forbidden: this namespace may not be deleted
Error from server (Forbidden): namespaces "kube-system" is forbidden: this namespace may not be deleted
has:warning: deleting cluster-scoped resources
Successful
message:warning: deleting cluster-scoped resources, not scoped to the provided namespace
namespace "kube-node-lease" deleted
namespace "my-namespace" deleted
namespace "namespace-1578911449-14627" deleted
... skipping 27 lines ...
namespace "namespace-1578911560-737" deleted
namespace "namespace-1578911561-3328" deleted
namespace "namespace-1578911563-17037" deleted
namespace "namespace-1578911565-29899" deleted
namespace "namespace-1578911588-15478" deleted
namespace "namespace-1578911589-21614" deleted
Error from server (Forbidden): namespaces "default" is forbidden: this namespace may not be deleted
Error from server (Forbidden): namespaces "kube-public" is forbidden: this namespace may not be deleted
Error from server (Forbidden): namespaces "kube-system" is forbidden: this namespace may not be deleted
has:namespace "my-namespace" deleted
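The bulk deletion output above (every test namespace deleted plus Forbidden errors for default, kube-public and kube-system) is consistent with a blanket delete such as the sketch below; the exact command and the admission rule protecting the system namespaces are assumptions, not taken from the log:

kubectl delete namespaces --all   # hypothetical reconstruction of the command behind the output above
# default, kube-public and kube-system are rejected by the server in this setup,
# so the warning and the three Forbidden errors appear alongside the deletions.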
E0113 10:33:28.575507   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1335: Successful get namespaces {{range.items}}{{ if eq .metadata.name \"other\" }}found{{end}}{{end}}:: :
namespace/other created
core.sh:1339: Successful get namespaces/other {{.metadata.name}}: other
core.sh:1343: Successful get pods --namespace=other {{range.items}}{{.metadata.name}}:{{end}}: 
E0113 10:33:29.185761   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
pod/valid-pod created
E0113 10:33:29.323578   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1347: Successful get pods --namespace=other {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
E0113 10:33:29.447424   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1349: Successful get pods -n other {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
Successful
message:error: a resource cannot be retrieved by name across all namespaces
has:a resource cannot be retrieved by name across all namespaces
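The error above is what kubectl prints when a resource is requested by name together with --all-namespaces. A minimal sketch of an invocation that produces it; the pod name comes from the surrounding test, the command itself is an assumption:

kubectl get pods valid-pod --all-namespaces   # fails: a name cannot be combined with --all-namespaces
kubectl get pods valid-pod --namespace=other  # works: the name is looked up in a single namespace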
E0113 10:33:29.576891   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1356: Successful get pods --namespace=other {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
pod "valid-pod" force deleted
core.sh:1360: Successful get pods --namespace=other {{range.items}}{{.metadata.name}}:{{end}}: 
namespace "other" deleted
E0113 10:33:30.187448   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:30.325075   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:30.448941   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:30.578276   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0113 10:33:30.622466   54599 horizontal.go:353] Horizontal Pod Autoscaler busybox0 has been deleted in namespace-1578911589-21614
I0113 10:33:30.627326   54599 horizontal.go:353] Horizontal Pod Autoscaler busybox1 has been deleted in namespace-1578911589-21614
E0113 10:33:31.189323   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:31.326461   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:31.450297   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:31.579627   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:32.190696   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:32.328003   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:32.451941   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:32.580954   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:33.192316   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:33.329622   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:33.453849   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:33.585115   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:34.194219   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:34.330592   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:34.455158   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:34.586659   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ exit code: 0
E0113 10:33:35.195472   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Recording: run_secrets_test
Running command: run_secrets_test

+++ Running case: test-cmd.run_secrets_test 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_secrets_test
+++ [0113 10:33:35] Creating namespace namespace-1578911615-2318
E0113 10:33:35.332011   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/namespace-1578911615-2318 created
E0113 10:33:35.456588   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Context "test" modified.
+++ [0113 10:33:35] Testing secrets
I0113 10:33:35.549793   71558 loader.go:375] Config loaded from file:  /tmp/tmp.TJfwolnO0m/.kube/config
Successful
message:apiVersion: v1
data:
... skipping 27 lines ...
  key1: dmFsdWUx
kind: Secret
metadata:
  creationTimestamp: null
  name: test
has not:example.com
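The YAML above looks like the client-side rendering of a generic secret: key1 holds dmFsdWUx, the base64 encoding of value1, and the assertion only checks that the output does not mention example.com (presumably the server address of the kubeconfig in use). A minimal sketch that renders an equivalent manifest, assuming a kubectl of this era where plain --dry-run still means a client-side dry run:

kubectl create secret generic test --from-literal=key1=value1 --dry-run -o yaml
# data.key1 comes out as dmFsdWUx, i.e. base64("value1"); nothing needs to reach the server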
E0113 10:33:35.587998   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:725: Successful get namespaces {{range.items}}{{ if eq .metadata.name \"test-secrets\" }}found{{end}}{{end}}:: :
namespace/test-secrets created
core.sh:729: Successful get namespaces/test-secrets {{.metadata.name}}: test-secrets
core.sh:733: Successful get secrets --namespace=test-secrets {{range.items}}{{.metadata.name}}:{{end}}: 
secret/test-secret created
E0113 10:33:36.197524   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:737: Successful get secret/test-secret --namespace=test-secrets {{.metadata.name}}: test-secret
core.sh:738: Successful get secret/test-secret --namespace=test-secrets {{.type}}: test-type
E0113 10:33:36.333185   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:36.457806   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
secret "test-secret" deleted
E0113 10:33:36.589190   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:748: Successful get secrets --namespace=test-secrets {{range.items}}{{.metadata.name}}:{{end}}: 
secret/test-secret created
core.sh:752: Successful get secret/test-secret --namespace=test-secrets {{.metadata.name}}: test-secret
core.sh:753: Successful get secret/test-secret --namespace=test-secrets {{.type}}: kubernetes.io/dockerconfigjson
secret "test-secret" deleted
E0113 10:33:37.198921   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:763: Successful get secrets --namespace=test-secrets {{range.items}}{{.metadata.name}}:{{end}}: 
E0113 10:33:37.334444   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
secret/test-secret created
E0113 10:33:37.459088   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:766: Successful get secret/test-secret --namespace=test-secrets {{.metadata.name}}: test-secret
E0113 10:33:37.590656   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:767: Successful get secret/test-secret --namespace=test-secrets {{.type}}: kubernetes.io/tls
(Bsecret "test-secret" deleted
secret/test-secret created
core.sh:773: Successful get secret/test-secret --namespace=test-secrets {{.metadata.name}}: test-secret
core.sh:774: Successful get secret/test-secret --namespace=test-secrets {{.type}}: kubernetes.io/tls
I0113 10:33:38.078113   54599 namespace_controller.go:185] Namespace has been deleted my-namespace
secret "test-secret" deleted
E0113 10:33:38.200560   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
secret/secret-string-data created
E0113 10:33:38.335886   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:796: Successful get secret/secret-string-data --namespace=test-secrets  {{.data}}: map[k1:djE= k2:djI=]
E0113 10:33:38.460750   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:797: Successful get secret/secret-string-data --namespace=test-secrets  {{.data}}: map[k1:djE= k2:djI=]
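djE= and djI= are simply the base64 encodings of v1 and v2: a secret's stringData convenience field is stored under data in base64 form, which is why both assertions above read back the same map. A quick check with standard coreutils:

echo -n v1 | base64            # prints djE=
echo djI= | base64 --decode    # prints v2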
I0113 10:33:38.590132   54599 namespace_controller.go:185] Namespace has been deleted kube-node-lease
E0113 10:33:38.592196   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0113 10:33:38.619573   54599 namespace_controller.go:185] Namespace has been deleted namespace-1578911449-14627
I0113 10:33:38.620786   54599 namespace_controller.go:185] Namespace has been deleted namespace-1578911452-12383
I0113 10:33:38.630542   54599 namespace_controller.go:185] Namespace has been deleted namespace-1578911476-20513
I0113 10:33:38.638313   54599 namespace_controller.go:185] Namespace has been deleted namespace-1578911469-16428
I0113 10:33:38.638315   54599 namespace_controller.go:185] Namespace has been deleted namespace-1578911470-7726
I0113 10:33:38.639047   54599 namespace_controller.go:185] Namespace has been deleted namespace-1578911474-31533
... skipping 22 lines ...
I0113 10:33:39.139252   54599 namespace_controller.go:185] Namespace has been deleted namespace-1578911555-14024
I0113 10:33:39.155932   54599 namespace_controller.go:185] Namespace has been deleted namespace-1578911560-13784
namespace "test-secrets" deleted
I0113 10:33:39.179324   54599 namespace_controller.go:185] Namespace has been deleted namespace-1578911560-737
I0113 10:33:39.182636   54599 namespace_controller.go:185] Namespace has been deleted namespace-1578911555-4160
I0113 10:33:39.188873   54599 namespace_controller.go:185] Namespace has been deleted namespace-1578911518-6526
E0113 10:33:39.202007   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0113 10:33:39.265495   54599 namespace_controller.go:185] Namespace has been deleted namespace-1578911561-3328
I0113 10:33:39.286075   54599 namespace_controller.go:185] Namespace has been deleted namespace-1578911565-29899
I0113 10:33:39.290468   54599 namespace_controller.go:185] Namespace has been deleted namespace-1578911563-17037
I0113 10:33:39.301617   54599 namespace_controller.go:185] Namespace has been deleted namespace-1578911588-15478
E0113 10:33:39.337371   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0113 10:33:39.379357   54599 namespace_controller.go:185] Namespace has been deleted namespace-1578911589-21614
E0113 10:33:39.462307   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:39.593628   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0113 10:33:40.144818   54599 namespace_controller.go:185] Namespace has been deleted other
E0113 10:33:40.203448   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:40.338689   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:40.463592   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:40.594882   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:41.204880   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:41.340041   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:41.465135   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:41.596334   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:42.206290   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:42.341290   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:42.466250   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:42.597818   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:43.207480   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:43.342513   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:43.467476   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:43.599045   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:44.208708   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ exit code: 0
E0113 10:33:44.343906   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Recording: run_configmap_tests
Running command: run_configmap_tests

+++ Running case: test-cmd.run_configmap_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_configmap_tests
+++ [0113 10:33:44] Creating namespace namespace-1578911624-23651
E0113 10:33:44.468841   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/namespace-1578911624-23651 created
E0113 10:33:44.600330   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Context "test" modified.
+++ [0113 10:33:44] Testing configmaps
configmap/test-configmap created
core.sh:28: Successful get configmap/test-configmap {{.metadata.name}}: test-configmap
configmap "test-configmap" deleted
core.sh:33: Successful get namespaces {{range.items}}{{ if eq .metadata.name \"test-configmaps\" }}found{{end}}{{end}}:: :
E0113 10:33:45.209959   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/test-configmaps created
E0113 10:33:45.345265   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:37: Successful get namespaces/test-configmaps {{.metadata.name}}: test-configmaps
E0113 10:33:45.470102   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:41: Successful get configmaps {{range.items}}{{ if eq .metadata.name \"test-configmap\" }}found{{end}}{{end}}:: :
E0113 10:33:45.601693   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:42: Successful get configmaps {{range.items}}{{ if eq .metadata.name \"test-binary-configmap\" }}found{{end}}{{end}}:: :
configmap/test-configmap created
configmap/test-binary-configmap created
core.sh:48: Successful get configmap/test-configmap --namespace=test-configmaps {{.metadata.name}}: test-configmap
core.sh:49: Successful get configmap/test-binary-configmap --namespace=test-configmaps {{.metadata.name}}: test-binary-configmap
E0113 10:33:46.211110   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
configmap "test-configmap" deleted
E0113 10:33:46.346459   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
configmap "test-binary-configmap" deleted
E0113 10:33:46.471439   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace "test-configmaps" deleted
E0113 10:33:46.602977   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:47.212422   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:47.348041   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:47.472995   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:47.604582   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:48.213662   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:48.349575   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:48.474369   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:48.605967   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:49.214880   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0113 10:33:49.304970   54599 namespace_controller.go:185] Namespace has been deleted test-secrets
E0113 10:33:49.350883   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:49.475613   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:49.607307   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:50.216306   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:50.352377   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:50.476907   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:50.608708   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:51.217590   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:51.353706   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:51.478176   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:51.609749   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ exit code: 0
Recording: run_client_config_tests
Running command: run_client_config_tests

+++ Running case: test-cmd.run_client_config_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_client_config_tests
+++ [0113 10:33:51] Creating namespace namespace-1578911631-13837
namespace/namespace-1578911631-13837 created
Context "test" modified.
+++ [0113 10:33:51] Testing client config
Successful
message:error: stat missing: no such file or directory
has:missing: no such file or directory
Successful
message:error: stat missing: no such file or directory
has:missing: no such file or directory
E0113 10:33:52.218826   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:error: stat missing: no such file or directory
has:missing: no such file or directory
Successful
message:Error in configuration: context was not found for specified context: missing-context
has:context was not found for specified context: missing-context
E0113 10:33:52.355068   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:error: no server found for cluster "missing-cluster"
has:no server found for cluster "missing-cluster"
E0113 10:33:52.479563   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:error: auth info "missing-user" does not exist
has:auth info "missing-user" does not exist
E0113 10:33:52.611078   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:error: error loading config file "/tmp/newconfig.yaml": no kind "Config" is registered for version "v-1" in scheme "k8s.io/client-go/tools/clientcmd/api/latest/latest.go:50"
has:error loading config file
Successful
message:error: stat missing-config: no such file or directory
has:no such file or directory
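Each failure above exercises one way of pointing kubectl at configuration that does not exist. The sketch below shows invocations that would produce these messages; the commands are reconstructions (the verb and resource are arbitrary), only the flags and the error strings come from the log:

kubectl get pods --kubeconfig=missing              # error: stat missing: no such file or directory
kubectl get pods --context=missing-context         # context was not found for specified context: missing-context
kubectl get pods --cluster=missing-cluster         # error: no server found for cluster "missing-cluster"
kubectl get pods --user=missing-user               # error: auth info "missing-user" does not exist
kubectl get pods --kubeconfig=/tmp/newconfig.yaml  # fails when the file declares an unregistered apiVersion such as "v-1"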
+++ exit code: 0
Recording: run_service_accounts_tests
Running command: run_service_accounts_tests

+++ Running case: test-cmd.run_service_accounts_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_service_accounts_tests
+++ [0113 10:33:52] Creating namespace namespace-1578911632-4093
namespace/namespace-1578911632-4093 created
Context "test" modified.
+++ [0113 10:33:53] Testing service accounts
E0113 10:33:53.220393   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:828: Successful get namespaces {{range.items}}{{ if eq .metadata.name \"test-service-accounts\" }}found{{end}}{{end}}:: :
namespace/test-service-accounts created
E0113 10:33:53.356487   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:832: Successful get namespaces/test-service-accounts {{.metadata.name}}: test-service-accounts
E0113 10:33:53.480955   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
serviceaccount/test-service-account created
E0113 10:33:53.612328   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:838: Successful get serviceaccount/test-service-account --namespace=test-service-accounts {{.metadata.name}}: test-service-account
serviceaccount "test-service-account" deleted
namespace "test-service-accounts" deleted
E0113 10:33:54.221949   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:54.358044   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:54.482289   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:54.614036   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:55.223429   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:55.359572   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:55.484180   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:55.615623   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:56.225025   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:56.361375   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:56.485845   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:56.617272   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0113 10:33:56.627329   54599 namespace_controller.go:185] Namespace has been deleted test-configmaps
E0113 10:33:57.226650   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:57.362804   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:57.487513   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:57.618856   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:58.228318   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:58.364524   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:58.489330   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:33:58.620563   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ exit code: 0
Recording: run_job_tests
Running command: run_job_tests

+++ Running case: test-cmd.run_job_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_job_tests
+++ [0113 10:33:59] Creating namespace namespace-1578911639-838
namespace/namespace-1578911639-838 created
E0113 10:33:59.229763   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Context "test" modified.
+++ [0113 10:33:59] Testing job
E0113 10:33:59.366062   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
batch.sh:30: Successful get namespaces {{range.items}}{{ if eq .metadata.name \"test-jobs\" }}found{{end}}{{end}}:: :
E0113 10:33:59.490801   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/test-jobs created
E0113 10:33:59.622016   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
batch.sh:34: Successful get namespaces/test-jobs {{.metadata.name}}: test-jobs
kubectl run --generator=cronjob/v1beta1 is DEPRECATED and will be removed in a future version. Use kubectl run --generator=run-pod/v1 or kubectl create instead.
cronjob.batch/pi created
batch.sh:39: Successful get cronjob/pi --namespace=test-jobs {{.metadata.name}}: pi
NAME   SCHEDULE       SUSPEND   ACTIVE   LAST SCHEDULE   AGE
pi     59 23 31 2 *   False     0        <none>          0s
... skipping 2 lines ...
Labels:                        run=pi
Annotations:                   <none>
Schedule:                      59 23 31 2 *
Concurrency Policy:            Allow
Suspend:                       False
Successful Job History Limit:  3
Failed Job History Limit:      1
Starting Deadline Seconds:     <unset>
Selector:                      <unset>
Parallelism:                   <unset>
Completions:                   <unset>
Pod Template:
  Labels:  run=pi
... skipping 13 lines ...
    Environment:     <none>
    Mounts:          <none>
  Volumes:           <none>
Last Schedule Time:  <unset>
Active Jobs:         <none>
Events:              <none>
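Note: the cronjob described above is presumably created with a command along the lines of the sketch below. The --generator and the schedule come from the deprecation warning and describe output above; the image, restart policy, and container args are assumptions added for illustration only.
  kubectl run pi --generator=cronjob/v1beta1 --schedule="59 23 31 2 *" --namespace=test-jobs \
    --image=k8s.gcr.io/perl --restart=OnFailure -- perl -Mbignum=bpi -wle 'print bpi(20)'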
E0113 10:34:00.231307   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:job.batch/test-job
has:job.batch/test-job
E0113 10:34:00.367467   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
batch.sh:48: Successful get jobs {{range.items}}{{.metadata.name}}{{end}}: 
E0113 10:34:00.492241   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0113 10:34:00.516795   54599 event.go:278] Event(v1.ObjectReference{Kind:"Job", Namespace:"test-jobs", Name:"test-job", UID:"9253098f-0f18-4c34-a76e-dce17cd4f631", APIVersion:"batch/v1", ResourceVersion:"1497", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test-job-vt56p
job.batch/test-job created
E0113 10:34:00.624499   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
batch.sh:53: Successful get job/test-job --namespace=test-jobs {{.metadata.name}}: test-job
NAME       COMPLETIONS   DURATION   AGE
test-job   0/1           0s         0s
Name:           test-job
Namespace:      test-jobs
Selector:       controller-uid=9253098f-0f18-4c34-a76e-dce17cd4f631
... skipping 2 lines ...
                run=pi
Annotations:    cronjob.kubernetes.io/instantiate: manual
Controlled By:  CronJob/pi
Parallelism:    1
Completions:    1
Start Time:     Mon, 13 Jan 2020 10:34:00 +0000
Pods Statuses:  1 Running / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  controller-uid=9253098f-0f18-4c34-a76e-dce17cd4f631
           job-name=test-job
           run=pi
  Containers:
   pi:
... skipping 14 lines ...
Events:
  Type    Reason            Age   From            Message
  ----    ------            ----  ----            -------
  Normal  SuccessfulCreate  0s    job-controller  Created pod: test-job-vt56p
job.batch "test-job" deleted
cronjob.batch "pi" deleted
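Note: the "Controlled By:  CronJob/pi" field and the cronjob.kubernetes.io/instantiate: manual annotation in the job describe output above indicate the job was instantiated from the cronjob, presumably roughly as follows (a reconstruction, not copied from batch.sh):
  kubectl create job test-job --from=cronjob/pi --namespace=test-jobs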
E0113 10:34:01.233054   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace "test-jobs" deleted
E0113 10:34:01.369536   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:34:01.493731   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:34:01.626082   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:34:02.234947   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:34:02.371185   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:34:02.495660   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:34:02.628038   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:34:03.236801   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:34:03.373067   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:34:03.497102   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:34:03.629556   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0113 10:34:03.937275   54599 namespace_controller.go:185] Namespace has been deleted test-service-accounts
E0113 10:34:04.238344   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:34:04.374688   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:34:04.498575   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:34:04.630933   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:34:05.240183   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:34:05.376192   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:34:05.500013   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:34:05.632522   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:34:06.241527   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:34:06.377381   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ exit code: 0
Recording: run_create_job_tests
Running command: run_create_job_tests
E0113 10:34:06.501286   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource

+++ Running case: test-cmd.run_create_job_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_create_job_tests
+++ [0113 10:34:06] Creating namespace namespace-1578911646-29869
E0113 10:34:06.633732   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/namespace-1578911646-29869 created
Context "test" modified.
I0113 10:34:06.825370   54599 event.go:278] Event(v1.ObjectReference{Kind:"Job", Namespace:"namespace-1578911646-29869", Name:"test-job", UID:"c486acf5-179f-42e7-a71f-3ff3b5e1d001", APIVersion:"batch/v1", ResourceVersion:"1518", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test-job-lwz55
job.batch/test-job created
create.sh:86: Successful get job test-job {{(index .spec.template.spec.containers 0).image}}: k8s.gcr.io/nginx:test-cmd
job.batch "test-job" deleted
I0113 10:34:07.149867   54599 event.go:278] Event(v1.ObjectReference{Kind:"Job", Namespace:"namespace-1578911646-29869", Name:"test-job-pi", UID:"ea37b4ca-67a1-4a99-b530-7aa34c2aeb46", APIVersion:"batch/v1", ResourceVersion:"1525", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test-job-pi-4blmm
job.batch/test-job-pi created
E0113 10:34:07.243401   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
create.sh:92: Successful get job test-job-pi {{(index .spec.template.spec.containers 0).image}}: k8s.gcr.io/perl
E0113 10:34:07.378710   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
job.batch "test-job-pi" deleted
kubectl run --generator=cronjob/v1beta1 is DEPRECATED and will be removed in a future version. Use kubectl run --generator=run-pod/v1 or kubectl create instead.
cronjob.batch/test-pi created
E0113 10:34:07.502686   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0113 10:34:07.604123   54599 event.go:278] Event(v1.ObjectReference{Kind:"Job", Namespace:"namespace-1578911646-29869", Name:"my-pi", UID:"f3416ca0-49a4-40ab-a076-1ace6300b3e5", APIVersion:"batch/v1", ResourceVersion:"1535", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: my-pi-2ghrg
job.batch/my-pi created
E0113 10:34:07.634974   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:[perl -Mbignum=bpi -wle print bpi(10)]
has:perl -Mbignum=bpi -wle print bpi(10)
job.batch "my-pi" deleted
cronjob.batch "test-pi" deleted
+++ exit code: 0
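Note on run_create_job_tests above: the asserted images (create.sh:86, create.sh:92) and the bpi(10) container args suggest the three jobs are created roughly as follows; the exact invocations are an assumption reconstructed from the output, not copied from the script.
  kubectl create job test-job --image=k8s.gcr.io/nginx:test-cmd
  kubectl create job test-job-pi --image=k8s.gcr.io/perl -- perl -Mbignum=bpi -wle 'print bpi(10)'
  kubectl create job my-pi --from=cronjob/test-pi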
... skipping 2 lines ...

+++ Running case: test-cmd.run_pod_templates_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_pod_templates_tests
+++ [0113 10:34:08] Creating namespace namespace-1578911648-17016
namespace/namespace-1578911648-17016 created
E0113 10:34:08.244880   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Context "test" modified.
+++ [0113 10:34:08] Testing pod templates
core.sh:1421: Successful get podtemplates {{range.items}}{{.metadata.name}}:{{end}}: 
E0113 10:34:08.380098   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:34:08.504086   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0113 10:34:08.577709   51163 controller.go:606] quota admission added evaluator for: podtemplates
podtemplate/nginx created
E0113 10:34:08.636341   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1425: Successful get podtemplates {{range.items}}{{.metadata.name}}:{{end}}: nginx:
NAME    CONTAINERS   IMAGES   POD LABELS
nginx   nginx        nginx    name=nginx
core.sh:1433: Successful get podtemplates {{range.items}}{{.metadata.name}}:{{end}}: nginx:
podtemplate "nginx" deleted
E0113 10:34:09.246260   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1437: Successful get podtemplate {{range.items}}{{.metadata.name}}:{{end}}: 
+++ exit code: 0
Recording: run_service_tests
Running command: run_service_tests

+++ Running case: test-cmd.run_service_tests 
E0113 10:34:09.381432   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_service_tests
Context "test" modified.
+++ [0113 10:34:09] Testing kubectl(v1:services)
E0113 10:34:09.505331   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:858: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
E0113 10:34:09.637625   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/redis-master created
core.sh:862: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:
matched Name:
matched Labels:
matched Selector:
matched IP:
... skipping 28 lines ...
Port:              <unset>  6379/TCP
TargetPort:        6379/TCP
Endpoints:         <none>
Session Affinity:  None
Events:            <none>
E0113 10:34:10.249329   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:868: Successful describe
Name:              redis-master
Namespace:         default
Labels:            app=redis
                   role=master
                   tier=backend
... skipping 3 lines ...
IP:                10.0.0.98
Port:              <unset>  6379/TCP
TargetPort:        6379/TCP
Endpoints:         <none>
Session Affinity:  None
E0113 10:34:10.382635   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:870: Successful describe
Name:              redis-master
Namespace:         default
Labels:            app=redis
                   role=master
                   tier=backend
... skipping 4 lines ...
Port:              <unset>  6379/TCP
TargetPort:        6379/TCP
Endpoints:         <none>
Session Affinity:  None
Events:            <none>
E0113 10:34:10.506664   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
matched Name:
matched Labels:
matched Selector:
matched IP:
matched Port:
matched Endpoints:
... skipping 25 lines ...
IP:                10.0.0.98
Port:              <unset>  6379/TCP
TargetPort:        6379/TCP
Endpoints:         <none>
Session Affinity:  None
Events:            <none>
E0113 10:34:10.639065   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful describe
Name:              kubernetes
Namespace:         default
Labels:            component=apiserver
                   provider=kubernetes
Annotations:       <none>
... skipping 94 lines ...
  - port: 6379
    targetPort: 6379
  selector:
    role: padawan
status:
  loadBalancer: {}
E0113 10:34:11.250729   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apiVersion: v1
kind: Service
metadata:
  creationTimestamp: "2020-01-13T10:34:09Z"
  labels:
    app: redis
... skipping 14 lines ...
    role: padawan
  sessionAffinity: None
  type: ClusterIP
status:
  loadBalancer: {}
I0113 10:34:11.382870   54599 namespace_controller.go:185] Namespace has been deleted test-jobs
E0113 10:34:11.383959   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/redis-master selector updated
E0113 10:34:11.508271   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:890: Successful get services redis-master {{range.spec.selector}}{{.}}:{{end}}: padawan:
E0113 10:34:11.640497   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/redis-master selector updated
core.sh:894: Successful get services redis-master {{range.spec.selector}}{{.}}:{{end}}: redis:master:backend:
apiVersion: v1
kind: Service
metadata:
  creationTimestamp: "2020-01-13T10:34:09Z"
... skipping 15 lines ...
  selector:
    role: padawan
  sessionAffinity: None
  type: ClusterIP
status:
  loadBalancer: {}
error: you must specify resources by --filename when --local is set.
Example resource specifications include:
   '-f rsrc.yaml'
   '--filename=rsrc.json'
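Note: the "selector updated" messages and the role: padawan selector in the YAML dumps above come from kubectl set selector; the error just above is the negative case where --local is given without --filename. A plausible reconstruction (not verbatim from core.sh):
  # Update the selector on the live service ("service/redis-master selector updated").
  kubectl set selector services redis-master role=padawan
  # Negative case: --local requires --filename, which produces the error shown above.
  kubectl set selector services redis-master role=padawan --local -o yaml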
core.sh:898: Successful get services redis-master {{range.spec.selector}}{{.}}:{{end}}: redis:master:backend:
E0113 10:34:12.252340   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/redis-master selector updated
E0113 10:34:12.385140   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:Error from server (Conflict): Operation cannot be fulfilled on services "redis-master": the object has been modified; please apply your changes to the latest version and try again
has:Conflict
E0113 10:34:12.509637   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:34:12.642041   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:911: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:
service "redis-master" deleted
core.sh:918: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
core.sh:922: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
service/redis-master created
E0113 10:34:13.253334   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:926: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:
E0113 10:34:13.386491   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:930: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:
E0113 10:34:13.511097   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:34:13.643282   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/service-v1-test created
core.sh:951: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:service-v1-test:
service/service-v1-test replaced
core.sh:958: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:service-v1-test:
service "redis-master" deleted
E0113 10:34:14.254725   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service "service-v1-test" deleted
E0113 10:34:14.388536   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:966: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
E0113 10:34:14.512689   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:970: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
E0113 10:34:14.644876   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/redis-master created
service/redis-slave created
core.sh:975: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:redis-slave:
E0113 10:34:15.256487   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:NAME           RSRC
kubernetes     144
redis-master   1577
redis-slave    1580
has:redis-master
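Note: the two-column NAME/RSRC listing above looks like custom-columns output; the exact flag string is not in the log, but it is presumably something like:
  kubectl get services -o custom-columns=NAME:.metadata.name,RSRC:.metadata.resourceVersion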
E0113 10:34:15.389994   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:985: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:redis-slave:
E0113 10:34:15.514060   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service "redis-master" deleted
service "redis-slave" deleted
E0113 10:34:15.646112   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:992: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
core.sh:996: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
service/beep-boop created
core.sh:1000: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: beep-boop:kubernetes:
core.sh:1004: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: beep-boop:kubernetes:
service "beep-boop" deleted
E0113 10:34:16.257933   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1011: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
E0113 10:34:16.391431   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1015: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
E0113 10:34:16.515904   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
kubectl run --generator=deployment/apps.v1 is DEPRECATED and will be removed in a future version. Use kubectl run --generator=run-pod/v1 or kubectl create instead.
I0113 10:34:16.557358   54599 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"default", Name:"testmetadata", UID:"e14a5c10-366f-4082-b479-8a891642323b", APIVersion:"apps/v1", ResourceVersion:"1594", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set testmetadata-bd968f46 to 2
I0113 10:34:16.570812   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"default", Name:"testmetadata-bd968f46", UID:"1c82b7f8-01a0-4b46-b6bb-29223a415673", APIVersion:"apps/v1", ResourceVersion:"1595", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: testmetadata-bd968f46-dvjhr
I0113 10:34:16.576991   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"default", Name:"testmetadata-bd968f46", UID:"1c82b7f8-01a0-4b46-b6bb-29223a415673", APIVersion:"apps/v1", ResourceVersion:"1595", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: testmetadata-bd968f46-l6vgn
service/testmetadata created
deployment.apps/testmetadata created
E0113 10:34:16.647554   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1019: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: testmetadata:
core.sh:1020: Successful get service testmetadata {{.metadata.annotations}}: map[zone-context:home]
service/exposemetadata exposed
core.sh:1026: Successful get service exposemetadata {{.metadata.annotations}}: map[zone-context:work]
service "exposemetadata" deleted
service "testmetadata" deleted
E0113 10:34:17.259366   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps "testmetadata" deleted
+++ exit code: 0
E0113 10:34:17.392992   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Recording: run_daemonset_tests
Running command: run_daemonset_tests

+++ Running case: test-cmd.run_daemonset_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_daemonset_tests
+++ [0113 10:34:17] Creating namespace namespace-1578911657-9649
E0113 10:34:17.517251   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/namespace-1578911657-9649 created
E0113 10:34:17.649093   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Context "test" modified.
+++ [0113 10:34:17] Testing kubectl(v1:daemonsets)
apps.sh:30: Successful get daemonsets {{range.items}}{{.metadata.name}}:{{end}}: 
I0113 10:34:18.028910   51163 controller.go:606] quota admission added evaluator for: daemonsets.apps
daemonset.apps/bind created
I0113 10:34:18.038898   51163 controller.go:606] quota admission added evaluator for: controllerrevisions.apps
apps.sh:34: Successful get daemonsets bind {{.metadata.generation}}: 1
E0113 10:34:18.261085   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:34:18.394501   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
daemonset.apps/bind configured
E0113 10:34:18.519302   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:37: Successful get daemonsets bind {{.metadata.generation}}: 1
E0113 10:34:18.651999   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
daemonset.apps/bind image updated
apps.sh:40: Successful get daemonsets bind {{.metadata.generation}}: 2
daemonset.apps/bind env updated
E0113 10:34:19.262979   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:42: Successful get daemonsets bind {{.metadata.generation}}: 3
E0113 10:34:19.396368   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
daemonset.apps/bind resource requirements updated
E0113 10:34:19.521870   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:44: Successful get daemonsets bind {{.metadata.generation}}: 4
E0113 10:34:19.654048   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
daemonset.apps/bind restarted
apps.sh:48: Successful get daemonsets bind {{.metadata.generation}}: 5
daemonset.apps "bind" deleted
+++ exit code: 0
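Note on run_daemonset_tests above: the generation assertions (apps.sh:34 through apps.sh:48, .metadata.generation going 1 -> 5) track one spec change per command. A plausible reconstruction of the sequence is sketched below; the concrete images and values are assumptions, only the kind of change per step is taken from the log.
  kubectl set image daemonsets bind '*=k8s.gcr.io/pause:latest'            # "image updated", generation 2
  kubectl set env daemonsets bind foo=bar                                  # "env updated", generation 3
  kubectl set resources daemonsets bind --limits=cpu=200m,memory=512Mi     # "resource requirements updated", generation 4
  kubectl rollout restart daemonset bind                                   # "restarted", generation 5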
E0113 10:34:20.264886   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Recording: run_daemonset_history_tests
Running command: run_daemonset_history_tests

E0113 10:34:20.399888   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ Running case: test-cmd.run_daemonset_history_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_daemonset_history_tests
+++ [0113 10:34:20] Creating namespace namespace-1578911660-5765
E0113 10:34:20.523661   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/namespace-1578911660-5765 created
E0113 10:34:20.656826   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Context "test" modified.
+++ [0113 10:34:20] Testing kubectl(v1:daemonsets, v1:controllerrevisions)
apps.sh:66: Successful get daemonsets {{range.items}}{{.metadata.name}}:{{end}}: 
daemonset.apps/bind created
E0113 10:34:21.266991   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:34:21.401877   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:70: Successful get controllerrevisions {{range.items}}{{.metadata.annotations}}:{{end}}: map[deprecated.daemonset.template.generation:1 kubectl.kubernetes.io/last-applied-configuration:{"apiVersion":"apps/v1","kind":"DaemonSet","metadata":{"annotations":{"kubernetes.io/change-cause":"kubectl apply --filename=hack/testdata/rollingupdate-daemonset.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true"},"labels":{"service":"bind"},"name":"bind","namespace":"namespace-1578911660-5765"},"spec":{"selector":{"matchLabels":{"service":"bind"}},"template":{"metadata":{"labels":{"service":"bind"}},"spec":{"affinity":{"podAntiAffinity":{"requiredDuringSchedulingIgnoredDuringExecution":[{"labelSelector":{"matchExpressions":[{"key":"service","operator":"In","values":["bind"]}]},"namespaces":[],"topologyKey":"kubernetes.io/hostname"}]}},"containers":[{"image":"k8s.gcr.io/pause:2.0","name":"kubernetes-pause"}]}},"updateStrategy":{"rollingUpdate":{"maxUnavailable":"10%"},"type":"RollingUpdate"}}}
 kubernetes.io/change-cause:kubectl apply --filename=hack/testdata/rollingupdate-daemonset.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true]:
E0113 10:34:21.525404   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:34:21.658554   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
daemonset.apps/bind skipped rollback (current template already matches revision 1)
apps.sh:73: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:2.0:
apps.sh:74: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
E0113 10:34:22.268790   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
daemonset.apps/bind configured
E0113 10:34:22.403448   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:34:22.527269   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:77: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:latest:
E0113 10:34:22.660832   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:78: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
apps.sh:79: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 2
(Bapps.sh:80: Successful get controllerrevisions {{range.items}}{{.metadata.annotations}}:{{end}}: map[deprecated.daemonset.template.generation:1 kubectl.kubernetes.io/last-applied-configuration:{"apiVersion":"apps/v1","kind":"DaemonSet","metadata":{"annotations":{"kubernetes.io/change-cause":"kubectl apply --filename=hack/testdata/rollingupdate-daemonset.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true"},"labels":{"service":"bind"},"name":"bind","namespace":"namespace-1578911660-5765"},"spec":{"selector":{"matchLabels":{"service":"bind"}},"template":{"metadata":{"labels":{"service":"bind"}},"spec":{"affinity":{"podAntiAffinity":{"requiredDuringSchedulingIgnoredDuringExecution":[{"labelSelector":{"matchExpressions":[{"key":"service","operator":"In","values":["bind"]}]},"namespaces":[],"topologyKey":"kubernetes.io/hostname"}]}},"containers":[{"image":"k8s.gcr.io/pause:2.0","name":"kubernetes-pause"}]}},"updateStrategy":{"rollingUpdate":{"maxUnavailable":"10%"},"type":"RollingUpdate"}}}
 kubernetes.io/change-cause:kubectl apply --filename=hack/testdata/rollingupdate-daemonset.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true]:map[deprecated.daemonset.template.generation:2 kubectl.kubernetes.io/last-applied-configuration:{"apiVersion":"apps/v1","kind":"DaemonSet","metadata":{"annotations":{"kubernetes.io/change-cause":"kubectl apply --filename=hack/testdata/rollingupdate-daemonset-rv2.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true"},"labels":{"service":"bind"},"name":"bind","namespace":"namespace-1578911660-5765"},"spec":{"selector":{"matchLabels":{"service":"bind"}},"template":{"metadata":{"labels":{"service":"bind"}},"spec":{"affinity":{"podAntiAffinity":{"requiredDuringSchedulingIgnoredDuringExecution":[{"labelSelector":{"matchExpressions":[{"key":"service","operator":"In","values":["bind"]}]},"namespaces":[],"topologyKey":"kubernetes.io/hostname"}]}},"containers":[{"image":"k8s.gcr.io/pause:latest","name":"kubernetes-pause"},{"image":"k8s.gcr.io/nginx:test-cmd","name":"app"}]}},"updateStrategy":{"rollingUpdate":{"maxUnavailable":"10%"},"type":"RollingUpdate"}}}
 kubernetes.io/change-cause:kubectl apply --filename=hack/testdata/rollingupdate-daemonset-rv2.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true]:
E0113 10:34:23.270871   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:34:23.405429   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
daemonset.apps/bind will roll back to Pod Template:
  Labels:	service=bind
  Containers:
   kubernetes-pause:
    Image:	k8s.gcr.io/pause:2.0
    Port:	<none>
    Host Port:	<none>
    Environment:	<none>
    Mounts:	<none>
  Volumes:	<none>
 (dry run)
E0113 10:34:23.529434   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:83: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:latest:
E0113 10:34:23.662130   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:84: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
apps.sh:85: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 2
daemonset.apps/bind rolled back
E0113 10:34:24.066229   54599 daemon_controller.go:291] namespace-1578911660-5765/bind failed with : error storing status for daemon set &v1.DaemonSet{TypeMeta:v1.TypeMeta{Kind:"", APIVersion:""}, ObjectMeta:v1.ObjectMeta{Name:"bind", GenerateName:"", Namespace:"namespace-1578911660-5765", SelfLink:"/apis/apps/v1/namespaces/namespace-1578911660-5765/daemonsets/bind", UID:"5ecfe57c-e5bc-4786-aac7-dbec87be0dcf", ResourceVersion:"1666", Generation:3, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63714508461, loc:(*time.Location)(0x6b12b60)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"service":"bind"}, Annotations:map[string]string{"deprecated.daemonset.template.generation":"3", "kubectl.kubernetes.io/last-applied-configuration":"{\"apiVersion\":\"apps/v1\",\"kind\":\"DaemonSet\",\"metadata\":{\"annotations\":{\"kubernetes.io/change-cause\":\"kubectl apply --filename=hack/testdata/rollingupdate-daemonset-rv2.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true\"},\"labels\":{\"service\":\"bind\"},\"name\":\"bind\",\"namespace\":\"namespace-1578911660-5765\"},\"spec\":{\"selector\":{\"matchLabels\":{\"service\":\"bind\"}},\"template\":{\"metadata\":{\"labels\":{\"service\":\"bind\"}},\"spec\":{\"affinity\":{\"podAntiAffinity\":{\"requiredDuringSchedulingIgnoredDuringExecution\":[{\"labelSelector\":{\"matchExpressions\":[{\"key\":\"service\",\"operator\":\"In\",\"values\":[\"bind\"]}]},\"namespaces\":[],\"topologyKey\":\"kubernetes.io/hostname\"}]}},\"containers\":[{\"image\":\"k8s.gcr.io/pause:latest\",\"name\":\"kubernetes-pause\"},{\"image\":\"k8s.gcr.io/nginx:test-cmd\",\"name\":\"app\"}]}},\"updateStrategy\":{\"rollingUpdate\":{\"maxUnavailable\":\"10%\"},\"type\":\"RollingUpdate\"}}}\n", "kubernetes.io/change-cause":"kubectl apply --filename=hack/testdata/rollingupdate-daemonset-rv2.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true"}, OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v1.DaemonSetSpec{Selector:(*v1.LabelSelector)(0xc001c3c600), Template:v1.PodTemplateSpec{ObjectMeta:v1.ObjectMeta{Name:"", GenerateName:"", Namespace:"", SelfLink:"", UID:"", ResourceVersion:"", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:0, loc:(*time.Location)(nil)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"service":"bind"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v1.PodSpec{Volumes:[]v1.Volume(nil), InitContainers:[]v1.Container(nil), Containers:[]v1.Container{v1.Container{Name:"kubernetes-pause", Image:"k8s.gcr.io/pause:2.0", Command:[]string(nil), Args:[]string(nil), WorkingDir:"", Ports:[]v1.ContainerPort(nil), EnvFrom:[]v1.EnvFromSource(nil), Env:[]v1.EnvVar(nil), Resources:v1.ResourceRequirements{Limits:v1.ResourceList(nil), Requests:v1.ResourceList(nil)}, VolumeMounts:[]v1.VolumeMount(nil), VolumeDevices:[]v1.VolumeDevice(nil), LivenessProbe:(*v1.Probe)(nil), ReadinessProbe:(*v1.Probe)(nil), StartupProbe:(*v1.Probe)(nil), Lifecycle:(*v1.Lifecycle)(nil), TerminationMessagePath:"/dev/termination-log", TerminationMessagePolicy:"File", ImagePullPolicy:"IfNotPresent", SecurityContext:(*v1.SecurityContext)(nil), Stdin:false, StdinOnce:false, TTY:false}}, 
EphemeralContainers:[]v1.EphemeralContainer(nil), RestartPolicy:"Always", TerminationGracePeriodSeconds:(*int64)(0xc002e89858), ActiveDeadlineSeconds:(*int64)(nil), DNSPolicy:"ClusterFirst", NodeSelector:map[string]string(nil), ServiceAccountName:"", DeprecatedServiceAccount:"", AutomountServiceAccountToken:(*bool)(nil), NodeName:"", HostNetwork:false, HostPID:false, HostIPC:false, ShareProcessNamespace:(*bool)(nil), SecurityContext:(*v1.PodSecurityContext)(0xc00233cc00), ImagePullSecrets:[]v1.LocalObjectReference(nil), Hostname:"", Subdomain:"", Affinity:(*v1.Affinity)(0xc001c3c640), SchedulerName:"default-scheduler", Tolerations:[]v1.Toleration(nil), HostAliases:[]v1.HostAlias(nil), PriorityClassName:"", Priority:(*int32)(nil), DNSConfig:(*v1.PodDNSConfig)(nil), ReadinessGates:[]v1.PodReadinessGate(nil), RuntimeClassName:(*string)(nil), EnableServiceLinks:(*bool)(nil), PreemptionPolicy:(*v1.PreemptionPolicy)(nil), Overhead:v1.ResourceList(nil), TopologySpreadConstraints:[]v1.TopologySpreadConstraint(nil)}}, UpdateStrategy:v1.DaemonSetUpdateStrategy{Type:"RollingUpdate", RollingUpdate:(*v1.RollingUpdateDaemonSet)(0xc0000e2778)}, MinReadySeconds:0, RevisionHistoryLimit:(*int32)(0xc002e898ac)}, Status:v1.DaemonSetStatus{CurrentNumberScheduled:0, NumberMisscheduled:0, DesiredNumberScheduled:0, NumberReady:0, ObservedGeneration:2, UpdatedNumberScheduled:0, NumberAvailable:0, NumberUnavailable:0, CollisionCount:(*int32)(nil), Conditions:[]v1.DaemonSetCondition(nil)}}: Operation cannot be fulfilled on daemonsets.apps "bind": the object has been modified; please apply your changes to the latest version and try again
apps.sh:88: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:2.0:
E0113 10:34:24.272363   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:89: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
E0113 10:34:24.406618   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:error: unable to find specified revision 1000000 in history
has:unable to find specified revision
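Note: the rollback messages in this history test map onto kubectl rollout undo; the revision numbers are taken from the log, and the apply commands that created the revisions are quoted verbatim in the change-cause annotations at apps.sh:70 and apps.sh:80 above. A plausible mapping (a reconstruction, not copied from apps.sh):
  kubectl rollout undo daemonset bind --to-revision=1         # "skipped rollback (current template already matches revision 1)"
  kubectl rollout undo daemonset bind --dry-run               # prints "will roll back to Pod Template ... (dry run)"
  kubectl rollout undo daemonset bind                         # "daemonset.apps/bind rolled back"
  kubectl rollout undo daemonset bind --to-revision=1000000   # "unable to find specified revision 1000000 in history"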
E0113 10:34:24.530746   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:93: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:2.0:
E0113 10:34:24.663599   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:94: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
daemonset.apps/bind rolled back
E0113 10:34:24.850843   54599 daemon_controller.go:291] namespace-1578911660-5765/bind failed with : error storing status for daemon set &v1.DaemonSet{TypeMeta:v1.TypeMeta{Kind:"", APIVersion:""}, ObjectMeta:v1.ObjectMeta{Name:"bind", GenerateName:"", Namespace:"namespace-1578911660-5765", SelfLink:"/apis/apps/v1/namespaces/namespace-1578911660-5765/daemonsets/bind", UID:"5ecfe57c-e5bc-4786-aac7-dbec87be0dcf", ResourceVersion:"1669", Generation:4, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63714508461, loc:(*time.Location)(0x6b12b60)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"service":"bind"}, Annotations:map[string]string{"deprecated.daemonset.template.generation":"4", "kubectl.kubernetes.io/last-applied-configuration":"{\"apiVersion\":\"apps/v1\",\"kind\":\"DaemonSet\",\"metadata\":{\"annotations\":{\"kubernetes.io/change-cause\":\"kubectl apply --filename=hack/testdata/rollingupdate-daemonset-rv2.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true\"},\"labels\":{\"service\":\"bind\"},\"name\":\"bind\",\"namespace\":\"namespace-1578911660-5765\"},\"spec\":{\"selector\":{\"matchLabels\":{\"service\":\"bind\"}},\"template\":{\"metadata\":{\"labels\":{\"service\":\"bind\"}},\"spec\":{\"affinity\":{\"podAntiAffinity\":{\"requiredDuringSchedulingIgnoredDuringExecution\":[{\"labelSelector\":{\"matchExpressions\":[{\"key\":\"service\",\"operator\":\"In\",\"values\":[\"bind\"]}]},\"namespaces\":[],\"topologyKey\":\"kubernetes.io/hostname\"}]}},\"containers\":[{\"image\":\"k8s.gcr.io/pause:latest\",\"name\":\"kubernetes-pause\"},{\"image\":\"k8s.gcr.io/nginx:test-cmd\",\"name\":\"app\"}]}},\"updateStrategy\":{\"rollingUpdate\":{\"maxUnavailable\":\"10%\"},\"type\":\"RollingUpdate\"}}}\n", "kubernetes.io/change-cause":"kubectl apply --filename=hack/testdata/rollingupdate-daemonset-rv2.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true"}, OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v1.DaemonSetSpec{Selector:(*v1.LabelSelector)(0xc000d0ae80), Template:v1.PodTemplateSpec{ObjectMeta:v1.ObjectMeta{Name:"", GenerateName:"", Namespace:"", SelfLink:"", UID:"", ResourceVersion:"", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:0, loc:(*time.Location)(nil)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"service":"bind"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v1.PodSpec{Volumes:[]v1.Volume(nil), InitContainers:[]v1.Container(nil), Containers:[]v1.Container{v1.Container{Name:"kubernetes-pause", Image:"k8s.gcr.io/pause:latest", Command:[]string(nil), Args:[]string(nil), WorkingDir:"", Ports:[]v1.ContainerPort(nil), EnvFrom:[]v1.EnvFromSource(nil), Env:[]v1.EnvVar(nil), Resources:v1.ResourceRequirements{Limits:v1.ResourceList(nil), Requests:v1.ResourceList(nil)}, VolumeMounts:[]v1.VolumeMount(nil), VolumeDevices:[]v1.VolumeDevice(nil), LivenessProbe:(*v1.Probe)(nil), ReadinessProbe:(*v1.Probe)(nil), StartupProbe:(*v1.Probe)(nil), Lifecycle:(*v1.Lifecycle)(nil), TerminationMessagePath:"/dev/termination-log", TerminationMessagePolicy:"File", ImagePullPolicy:"IfNotPresent", SecurityContext:(*v1.SecurityContext)(nil), Stdin:false, StdinOnce:false, TTY:false}, v1.Container{Name:"app", 
Image:"k8s.gcr.io/nginx:test-cmd", Command:[]string(nil), Args:[]string(nil), WorkingDir:"", Ports:[]v1.ContainerPort(nil), EnvFrom:[]v1.EnvFromSource(nil), Env:[]v1.EnvVar(nil), Resources:v1.ResourceRequirements{Limits:v1.ResourceList(nil), Requests:v1.ResourceList(nil)}, VolumeMounts:[]v1.VolumeMount(nil), VolumeDevices:[]v1.VolumeDevice(nil), LivenessProbe:(*v1.Probe)(nil), ReadinessProbe:(*v1.Probe)(nil), StartupProbe:(*v1.Probe)(nil), Lifecycle:(*v1.Lifecycle)(nil), TerminationMessagePath:"/dev/termination-log", TerminationMessagePolicy:"File", ImagePullPolicy:"IfNotPresent", SecurityContext:(*v1.SecurityContext)(nil), Stdin:false, StdinOnce:false, TTY:false}}, EphemeralContainers:[]v1.EphemeralContainer(nil), RestartPolicy:"Always", TerminationGracePeriodSeconds:(*int64)(0xc002341948), ActiveDeadlineSeconds:(*int64)(nil), DNSPolicy:"ClusterFirst", NodeSelector:map[string]string(nil), ServiceAccountName:"", DeprecatedServiceAccount:"", AutomountServiceAccountToken:(*bool)(nil), NodeName:"", HostNetwork:false, HostPID:false, HostIPC:false, ShareProcessNamespace:(*bool)(nil), SecurityContext:(*v1.PodSecurityContext)(0xc0024d06c0), ImagePullSecrets:[]v1.LocalObjectReference(nil), Hostname:"", Subdomain:"", Affinity:(*v1.Affinity)(0xc000d0aea0), SchedulerName:"default-scheduler", Tolerations:[]v1.Toleration(nil), HostAliases:[]v1.HostAlias(nil), PriorityClassName:"", Priority:(*int32)(nil), DNSConfig:(*v1.PodDNSConfig)(nil), ReadinessGates:[]v1.PodReadinessGate(nil), RuntimeClassName:(*string)(nil), EnableServiceLinks:(*bool)(nil), PreemptionPolicy:(*v1.PreemptionPolicy)(nil), Overhead:v1.ResourceList(nil), TopologySpreadConstraints:[]v1.TopologySpreadConstraint(nil)}}, UpdateStrategy:v1.DaemonSetUpdateStrategy{Type:"RollingUpdate", RollingUpdate:(*v1.RollingUpdateDaemonSet)(0xc0000e2b50)}, MinReadySeconds:0, RevisionHistoryLimit:(*int32)(0xc00234199c)}, Status:v1.DaemonSetStatus{CurrentNumberScheduled:0, NumberMisscheduled:0, DesiredNumberScheduled:0, NumberReady:0, ObservedGeneration:3, UpdatedNumberScheduled:0, NumberAvailable:0, NumberUnavailable:0, CollisionCount:(*int32)(nil), Conditions:[]v1.DaemonSetCondition(nil)}}: Operation cannot be fulfilled on daemonsets.apps "bind": the object has been modified; please apply your changes to the latest version and try again
apps.sh:97: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:latest:
apps.sh:98: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
apps.sh:99: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 2
E0113 10:34:25.273558   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
daemonset.apps "bind" deleted
+++ exit code: 0
Recording: run_rc_tests
Running command: run_rc_tests
E0113 10:34:25.408158   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource

+++ Running case: test-cmd.run_rc_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_rc_tests
+++ [0113 10:34:25] Creating namespace namespace-1578911665-12565
namespace/namespace-1578911665-12565 created
E0113 10:34:25.532055   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Context "test" modified.
+++ [0113 10:34:25] Testing kubectl(v1:replicationcontrollers)
E0113 10:34:25.665092   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1052: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
replicationcontroller/frontend created
I0113 10:34:25.925137   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578911665-12565", Name:"frontend", UID:"6b75e8fd-3a1a-424d-8f11-f4dd3907f97f", APIVersion:"v1", ResourceVersion:"1679", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-5cmt6
I0113 10:34:25.927975   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578911665-12565", Name:"frontend", UID:"6b75e8fd-3a1a-424d-8f11-f4dd3907f97f", APIVersion:"v1", ResourceVersion:"1679", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-gqf85
I0113 10:34:25.929679   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578911665-12565", Name:"frontend", UID:"6b75e8fd-3a1a-424d-8f11-f4dd3907f97f", APIVersion:"v1", ResourceVersion:"1679", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-dx2cb
replicationcontroller "frontend" deleted
core.sh:1057: Successful get pods -l "name=frontend" {{range.items}}{{.metadata.name}}:{{end}}: 
core.sh:1061: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
E0113 10:34:26.274838   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:34:26.409401   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/frontend created
I0113 10:34:26.465984   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578911665-12565", Name:"frontend", UID:"b97239b2-16f9-49e0-91ca-1e60ef0443bc", APIVersion:"v1", ResourceVersion:"1695", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-q56kd
I0113 10:34:26.469955   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578911665-12565", Name:"frontend", UID:"b97239b2-16f9-49e0-91ca-1e60ef0443bc", APIVersion:"v1", ResourceVersion:"1695", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-zc5kq
I0113 10:34:26.470063   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578911665-12565", Name:"frontend", UID:"b97239b2-16f9-49e0-91ca-1e60ef0443bc", APIVersion:"v1", ResourceVersion:"1695", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-fgtxj
E0113 10:34:26.533475   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1065: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: frontend:
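Note: the "matched ..." lines and the describe blocks that follow are the harness checking kubectl describe output for the replication controller, presumably invoked as (namespace taken from the log, the rest an assumption):
  kubectl describe rc frontend --namespace=namespace-1578911665-12565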
E0113 10:34:26.666329   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
matched Name:
matched Pod Template:
matched Labels:
matched Selector:
matched Replicas:
matched Pods Status:
... skipping 4 lines ...
Namespace:    namespace-1578911665-12565
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 17 lines ...
Namespace:    namespace-1578911665-12565
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 18 lines ...
Namespace:    namespace-1578911665-12565
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 12 lines ...
Namespace:    namespace-1578911665-12565
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 12 lines ...
  Normal  SuccessfulCreate  1s    replication-controller  Created pod: frontend-q56kd
  Normal  SuccessfulCreate  1s    replication-controller  Created pod: frontend-zc5kq
  Normal  SuccessfulCreate  1s    replication-controller  Created pod: frontend-fgtxj
matched Name:
matched Name:
E0113 10:34:27.276310   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
matched Pod Template:
matched Labels:
matched Selector:
matched Replicas:
matched Pods Status:
matched Volumes:
... skipping 3 lines ...
Namespace:    namespace-1578911665-12565
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 9 lines ...
Events:
  Type    Reason            Age   From                    Message
  ----    ------            ----  ----                    -------
  Normal  SuccessfulCreate  1s    replication-controller  Created pod: frontend-q56kd
  Normal  SuccessfulCreate  1s    replication-controller  Created pod: frontend-zc5kq
  Normal  SuccessfulCreate  1s    replication-controller  Created pod: frontend-fgtxj
E0113 10:34:27.410808   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful describe
Name:         frontend
Namespace:    namespace-1578911665-12565
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 9 lines ...
Events:
  Type    Reason            Age   From                    Message
  ----    ------            ----  ----                    -------
  Normal  SuccessfulCreate  1s    replication-controller  Created pod: frontend-q56kd
  Normal  SuccessfulCreate  1s    replication-controller  Created pod: frontend-zc5kq
  Normal  SuccessfulCreate  1s    replication-controller  Created pod: frontend-fgtxj
E0113 10:34:27.536154   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful describe
Name:         frontend
Namespace:    namespace-1578911665-12565
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 3 lines ...
      cpu:     100m
      memory:  100Mi
    Environment:
      GET_HOSTS_FROM:  dns
    Mounts:            <none>
  Volumes:             <none>
E0113 10:34:27.667378   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful describe
Name:         frontend
Namespace:    namespace-1578911665-12565
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 15 lines ...
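For orientation: the describe blocks above plausibly come from invocations of the following shape (resource and namespace names are taken from the log; the exact flags live in core.sh and may differ).
  kubectl describe rc frontend --namespace=namespace-1578911665-12565                       # one controller, Events included
  kubectl describe rc frontend --show-events=false --namespace=namespace-1578911665-12565   # variant without the Events table
  kubectl describe rc --namespace=namespace-1578911665-12565                                # every replication controller in the namespace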
core.sh:1085: Successful get rc frontend {{.spec.replicas}}: 3
replicationcontroller/frontend scaled
E0113 10:34:27.908496   54599 replica_set.go:199] ReplicaSet has no controller: &ReplicaSet{ObjectMeta:{frontend  namespace-1578911665-12565 /api/v1/namespaces/namespace-1578911665-12565/replicationcontrollers/frontend b97239b2-16f9-49e0-91ca-1e60ef0443bc 1706 2 2020-01-13 10:34:26 +0000 UTC <nil> <nil> map[app:guestbook tier:frontend] map[] [] []  [{kubectl Update v1 2020-01-13 10:34:26 +0000 UTC FieldsV1 FieldsV1{Raw:*[123 34 102 58 109 101 116 97 100 97 116 97 34 58 123 34 102 58 108 97 98 101 108 115 34 58 123 34 102 58 97 112 112 34 58 123 125 44 34 102 58 116 105 101 114 34 58 123 125 44 34 46 34 58 123 125 125 125 44 34 102 58 115 112 101 99 34 58 123 34 102 58 114 101 112 108 105 99 97 115 34 58 123 125 44 34 102 58 115 101 108 101 99 116 111 114 34 58 123 34 102 58 97 112 112 34 58 123 125 44 34 102 58 116 105 101 114 34 58 123 125 44 34 46 34 58 123 125 125 44 34 102 58 116 101 109 112 108 97 116 101 34 58 123 34 102 58 109 101 116 97 100 97 116 97 34 58 123 34 102 58 99 114 101 97 116 105 111 110 84 105 109 101 115 116 97 109 112 34 58 123 125 44 34 102 58 108 97 98 101 108 115 34 58 123 34 102 58 97 112 112 34 58 123 125 44 34 102 58 116 105 101 114 34 58 123 125 44 34 46 34 58 123 125 125 44 34 46 34 58 123 125 125 44 34 102 58 115 112 101 99 34 58 123 34 102 58 99 111 110 116 97 105 110 101 114 115 34 58 123 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 112 104 112 45 114 101 100 105 115 92 34 125 34 58 123 34 102 58 101 110 118 34 58 123 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 71 69 84 95 72 79 83 84 83 95 70 82 79 77 92 34 125 34 58 123 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 34 58 123 125 44 34 46 34 58 123 125 125 44 34 46 34 58 123 125 125 44 34 102 58 105 109 97 103 101 34 58 123 125 44 34 102 58 105 109 97 103 101 80 117 108 108 80 111 108 105 99 121 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 112 111 114 116 115 34 58 123 34 107 58 123 92 34 99 111 110 116 97 105 110 101 114 80 111 114 116 92 34 58 56 48 44 92 34 112 114 111 116 111 99 111 108 92 34 58 92 34 84 67 80 92 34 125 34 58 123 34 102 58 99 111 110 116 97 105 110 101 114 80 111 114 116 34 58 123 125 44 34 102 58 112 114 111 116 111 99 111 108 34 58 123 125 44 34 46 34 58 123 125 125 44 34 46 34 58 123 125 125 44 34 102 58 114 101 115 111 117 114 99 101 115 34 58 123 34 102 58 114 101 113 117 101 115 116 115 34 58 123 34 102 58 99 112 117 34 58 123 125 44 34 102 58 109 101 109 111 114 121 34 58 123 125 44 34 46 34 58 123 125 125 44 34 46 34 58 123 125 125 44 34 102 58 116 101 114 109 105 110 97 116 105 111 110 77 101 115 115 97 103 101 80 97 116 104 34 58 123 125 44 34 102 58 116 101 114 109 105 110 97 116 105 111 110 77 101 115 115 97 103 101 80 111 108 105 99 121 34 58 123 125 44 34 46 34 58 123 125 125 44 34 46 34 58 123 125 125 44 34 102 58 100 110 115 80 111 108 105 99 121 34 58 123 125 44 34 102 58 114 101 115 116 97 114 116 80 111 108 105 99 121 34 58 123 125 44 34 102 58 115 99 104 101 100 117 108 101 114 78 97 109 101 34 58 123 125 44 34 102 58 115 101 99 117 114 105 116 121 67 111 110 116 101 120 116 34 58 123 125 44 34 102 58 116 101 114 109 105 110 97 116 105 111 110 71 114 97 99 101 80 101 114 105 111 100 83 101 99 111 110 100 115 34 58 123 125 44 34 46 34 58 123 125 125 44 34 46 34 58 123 125 125 125 125],}} {kube-controller-manager Update v1 2020-01-13 10:34:26 +0000 UTC FieldsV1 &FieldsV1{Raw:*[123 34 102 58 115 116 97 116 117 115 34 58 123 34 102 58 102 117 108 108 121 76 97 98 101 108 101 100 82 101 112 108 105 99 97 115 34 
58 123 125 44 34 102 58 111 98 115 101 114 118 101 100 71 101 110 101 114 97 116 105 111 110 34 58 123 125 44 34 102 58 114 101 112 108 105 99 97 115 34 58 123 125 125 125],}}]},Spec:ReplicaSetSpec{Replicas:*2,Selector:&v1.LabelSelector{MatchLabels:map[string]string{app: guestbook,tier: frontend,},MatchExpressions:[]LabelSelectorRequirement{},},Template:{{      0 0001-01-01 00:00:00 +0000 UTC <nil> <nil> map[app:guestbook tier:frontend] map[] [] []  []} {[] [] [{php-redis gcr.io/google_samples/gb-frontend:v4 [] []  [{ 0 80 TCP }] [] [{GET_HOSTS_FROM dns nil}] {map[] map[cpu:{{100 -3} {<nil>} 100m DecimalSI} memory:{{104857600 0} {<nil>} 100Mi BinarySI}]} [] [] nil nil nil nil /dev/termination-log File IfNotPresent nil false false false}] [] Always 0xc002fe95d8 <nil> ClusterFirst map[]   <nil>  false false false <nil> PodSecurityContext{SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,SupplementalGroups:[],FSGroup:nil,RunAsGroup:nil,Sysctls:[]Sysctl{},WindowsOptions:nil,} []   nil default-scheduler [] []  <nil> nil [] <nil> <nil> <nil> map[] []}},MinReadySeconds:0,},Status:ReplicaSetStatus{Replicas:3,FullyLabeledReplicas:3,ObservedGeneration:1,ReadyReplicas:0,AvailableReplicas:0,Conditions:[]ReplicaSetCondition{},},}
I0113 10:34:27.917405   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578911665-12565", Name:"frontend", UID:"b97239b2-16f9-49e0-91ca-1e60ef0443bc", APIVersion:"v1", ResourceVersion:"1706", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: frontend-q56kd
core.sh:1089: Successful get rc frontend {{.spec.replicas}}: 2
core.sh:1093: Successful get rc frontend {{.spec.replicas}}: 2
error: Expected replicas to be 3, was 2
E0113 10:34:28.277581   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1097: Successful get rc frontend {{.spec.replicas}}: 2
E0113 10:34:28.412247   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1101: Successful get rc frontend {{.spec.replicas}}: 2
E0113 10:34:28.537586   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/frontend scaled
I0113 10:34:28.590853   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578911665-12565", Name:"frontend", UID:"b97239b2-16f9-49e0-91ca-1e60ef0443bc", APIVersion:"v1", ResourceVersion:"1712", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-mkgcc
E0113 10:34:28.668527   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1105: Successful get rc frontend {{.spec.replicas}}: 3
core.sh:1109: Successful get rc frontend {{.spec.replicas}}: 3
(BE0113 10:34:28.924869   54599 replica_set.go:199] ReplicaSet has no controller: &ReplicaSet{ObjectMeta:{frontend  namespace-1578911665-12565 /api/v1/namespaces/namespace-1578911665-12565/replicationcontrollers/frontend b97239b2-16f9-49e0-91ca-1e60ef0443bc 1717 4 2020-01-13 10:34:26 +0000 UTC <nil> <nil> map[app:guestbook tier:frontend] map[] [] []  [{kubectl Update v1 2020-01-13 10:34:26 +0000 UTC FieldsV1 FieldsV1{Raw:*[123 34 102 58 109 101 116 97 100 97 116 97 34 58 123 34 102 58 108 97 98 101 108 115 34 58 123 34 102 58 97 112 112 34 58 123 125 44 34 102 58 116 105 101 114 34 58 123 125 44 34 46 34 58 123 125 125 125 44 34 102 58 115 112 101 99 34 58 123 34 102 58 114 101 112 108 105 99 97 115 34 58 123 125 44 34 102 58 115 101 108 101 99 116 111 114 34 58 123 34 102 58 97 112 112 34 58 123 125 44 34 102 58 116 105 101 114 34 58 123 125 44 34 46 34 58 123 125 125 44 34 102 58 116 101 109 112 108 97 116 101 34 58 123 34 102 58 109 101 116 97 100 97 116 97 34 58 123 34 102 58 99 114 101 97 116 105 111 110 84 105 109 101 115 116 97 109 112 34 58 123 125 44 34 102 58 108 97 98 101 108 115 34 58 123 34 102 58 97 112 112 34 58 123 125 44 34 102 58 116 105 101 114 34 58 123 125 44 34 46 34 58 123 125 125 44 34 46 34 58 123 125 125 44 34 102 58 115 112 101 99 34 58 123 34 102 58 99 111 110 116 97 105 110 101 114 115 34 58 123 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 112 104 112 45 114 101 100 105 115 92 34 125 34 58 123 34 102 58 101 110 118 34 58 123 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 71 69 84 95 72 79 83 84 83 95 70 82 79 77 92 34 125 34 58 123 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 34 58 123 125 44 34 46 34 58 123 125 125 44 34 46 34 58 123 125 125 44 34 102 58 105 109 97 103 101 34 58 123 125 44 34 102 58 105 109 97 103 101 80 117 108 108 80 111 108 105 99 121 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 112 111 114 116 115 34 58 123 34 107 58 123 92 34 99 111 110 116 97 105 110 101 114 80 111 114 116 92 34 58 56 48 44 92 34 112 114 111 116 111 99 111 108 92 34 58 92 34 84 67 80 92 34 125 34 58 123 34 102 58 99 111 110 116 97 105 110 101 114 80 111 114 116 34 58 123 125 44 34 102 58 112 114 111 116 111 99 111 108 34 58 123 125 44 34 46 34 58 123 125 125 44 34 46 34 58 123 125 125 44 34 102 58 114 101 115 111 117 114 99 101 115 34 58 123 34 102 58 114 101 113 117 101 115 116 115 34 58 123 34 102 58 99 112 117 34 58 123 125 44 34 102 58 109 101 109 111 114 121 34 58 123 125 44 34 46 34 58 123 125 125 44 34 46 34 58 123 125 125 44 34 102 58 116 101 114 109 105 110 97 116 105 111 110 77 101 115 115 97 103 101 80 97 116 104 34 58 123 125 44 34 102 58 116 101 114 109 105 110 97 116 105 111 110 77 101 115 115 97 103 101 80 111 108 105 99 121 34 58 123 125 44 34 46 34 58 123 125 125 44 34 46 34 58 123 125 125 44 34 102 58 100 110 115 80 111 108 105 99 121 34 58 123 125 44 34 102 58 114 101 115 116 97 114 116 80 111 108 105 99 121 34 58 123 125 44 34 102 58 115 99 104 101 100 117 108 101 114 78 97 109 101 34 58 123 125 44 34 102 58 115 101 99 117 114 105 116 121 67 111 110 116 101 120 116 34 58 123 125 44 34 102 58 116 101 114 109 105 110 97 116 105 111 110 71 114 97 99 101 80 101 114 105 111 100 83 101 99 111 110 100 115 34 58 123 125 44 34 46 34 58 123 125 125 44 34 46 34 58 123 125 125 125 125],}} {kube-controller-manager Update v1 2020-01-13 10:34:28 +0000 UTC FieldsV1 &FieldsV1{Raw:*[123 34 102 58 115 116 97 116 117 115 34 58 123 34 102 58 102 117 108 108 121 76 97 98 101 108 101 100 82 101 112 108 105 99 97 115 
34 58 123 125 44 34 102 58 111 98 115 101 114 118 101 100 71 101 110 101 114 97 116 105 111 110 34 58 123 125 44 34 102 58 114 101 112 108 105 99 97 115 34 58 123 125 125 125],}}]},Spec:ReplicaSetSpec{Replicas:*2,Selector:&v1.LabelSelector{MatchLabels:map[string]string{app: guestbook,tier: frontend,},MatchExpressions:[]LabelSelectorRequirement{},},Template:{{      0 0001-01-01 00:00:00 +0000 UTC <nil> <nil> map[app:guestbook tier:frontend] map[] [] []  []} {[] [] [{php-redis gcr.io/google_samples/gb-frontend:v4 [] []  [{ 0 80 TCP }] [] [{GET_HOSTS_FROM dns nil}] {map[] map[cpu:{{100 -3} {<nil>} 100m DecimalSI} memory:{{104857600 0} {<nil>} 100Mi BinarySI}]} [] [] nil nil nil nil /dev/termination-log File IfNotPresent nil false false false}] [] Always 0xc002361cb8 <nil> ClusterFirst map[]   <nil>  false false false <nil> PodSecurityContext{SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,SupplementalGroups:[],FSGroup:nil,RunAsGroup:nil,Sysctls:[]Sysctl{},WindowsOptions:nil,} []   nil default-scheduler [] []  <nil> nil [] <nil> <nil> <nil> map[] []}},MinReadySeconds:0,},Status:ReplicaSetStatus{Replicas:3,FullyLabeledReplicas:3,ObservedGeneration:3,ReadyReplicas:0,AvailableReplicas:0,Conditions:[]ReplicaSetCondition{},},}
replicationcontroller/frontend scaled
I0113 10:34:28.932662   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578911665-12565", Name:"frontend", UID:"b97239b2-16f9-49e0-91ca-1e60ef0443bc", APIVersion:"v1", ResourceVersion:"1717", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: frontend-mkgcc
core.sh:1113: Successful get rc frontend {{.spec.replicas}}: 2
replicationcontroller "frontend" deleted
E0113 10:34:29.278980   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/redis-master created
I0113 10:34:29.372080   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578911665-12565", Name:"redis-master", UID:"9d0f4234-af21-47e6-8c0e-c197ea4bd057", APIVersion:"v1", ResourceVersion:"1730", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-master-s9ztx
E0113 10:34:29.413515   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:34:29.538910   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/redis-slave created
I0113 10:34:29.582088   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578911665-12565", Name:"redis-slave", UID:"223aa345-6e13-49f4-a6f8-cc825a0e7960", APIVersion:"v1", ResourceVersion:"1735", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-slave-bwsp2
I0113 10:34:29.587640   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578911665-12565", Name:"redis-slave", UID:"223aa345-6e13-49f4-a6f8-cc825a0e7960", APIVersion:"v1", ResourceVersion:"1735", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-slave-zcpdl
E0113 10:34:29.669748   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/redis-master scaled
I0113 10:34:29.701378   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578911665-12565", Name:"redis-master", UID:"9d0f4234-af21-47e6-8c0e-c197ea4bd057", APIVersion:"v1", ResourceVersion:"1742", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-master-kqkjg
I0113 10:34:29.705874   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578911665-12565", Name:"redis-master", UID:"9d0f4234-af21-47e6-8c0e-c197ea4bd057", APIVersion:"v1", ResourceVersion:"1742", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-master-mp4wl
I0113 10:34:29.705946   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578911665-12565", Name:"redis-master", UID:"9d0f4234-af21-47e6-8c0e-c197ea4bd057", APIVersion:"v1", ResourceVersion:"1742", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-master-fvr2l
replicationcontroller/redis-slave scaled
I0113 10:34:29.711491   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578911665-12565", Name:"redis-slave", UID:"223aa345-6e13-49f4-a6f8-cc825a0e7960", APIVersion:"v1", ResourceVersion:"1744", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-slave-4ldtn
I0113 10:34:29.715294   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578911665-12565", Name:"redis-slave", UID:"223aa345-6e13-49f4-a6f8-cc825a0e7960", APIVersion:"v1", ResourceVersion:"1744", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-slave-h528n
core.sh:1123: Successful get rc redis-master {{.spec.replicas}}: 4
core.sh:1124: Successful get rc redis-slave {{.spec.replicas}}: 4
replicationcontroller "redis-master" deleted
replicationcontroller "redis-slave" deleted
E0113 10:34:30.280534   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-deployment created
I0113 10:34:30.335189   54599 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578911665-12565", Name:"nginx-deployment", UID:"6ab2d954-661c-433c-bf6d-67ef36153ebf", APIVersion:"apps/v1", ResourceVersion:"1777", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-6986c7bc94 to 3
I0113 10:34:30.341114   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911665-12565", Name:"nginx-deployment-6986c7bc94", UID:"09ad9e16-cb42-453f-9d0a-1ffabe2d6aa7", APIVersion:"apps/v1", ResourceVersion:"1778", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-n4k5s
I0113 10:34:30.345283   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911665-12565", Name:"nginx-deployment-6986c7bc94", UID:"09ad9e16-cb42-453f-9d0a-1ffabe2d6aa7", APIVersion:"apps/v1", ResourceVersion:"1778", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-hs6ls
I0113 10:34:30.348015   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911665-12565", Name:"nginx-deployment-6986c7bc94", UID:"09ad9e16-cb42-453f-9d0a-1ffabe2d6aa7", APIVersion:"apps/v1", ResourceVersion:"1778", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-v8rbg
E0113 10:34:30.414731   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-deployment scaled
I0113 10:34:30.476357   54599 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578911665-12565", Name:"nginx-deployment", UID:"6ab2d954-661c-433c-bf6d-67ef36153ebf", APIVersion:"apps/v1", ResourceVersion:"1791", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-6986c7bc94 to 1
I0113 10:34:30.490443   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911665-12565", Name:"nginx-deployment-6986c7bc94", UID:"09ad9e16-cb42-453f-9d0a-1ffabe2d6aa7", APIVersion:"apps/v1", ResourceVersion:"1792", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-6986c7bc94-hs6ls
I0113 10:34:30.495196   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911665-12565", Name:"nginx-deployment-6986c7bc94", UID:"09ad9e16-cb42-453f-9d0a-1ffabe2d6aa7", APIVersion:"apps/v1", ResourceVersion:"1792", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-6986c7bc94-n4k5s
E0113 10:34:30.540588   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1133: Successful get deployment nginx-deployment {{.spec.replicas}}: 1
E0113 10:34:30.671504   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps "nginx-deployment" deleted
Successful
message:service/expose-test-deployment exposed
has:service/expose-test-deployment exposed
service "expose-test-deployment" deleted
Successful
message:error: couldn't retrieve selectors via --selector flag or introspection: invalid deployment: no selectors, therefore cannot be exposed
See 'kubectl expose -h' for help and examples
has:invalid deployment: no selectors
E0113 10:34:31.282000   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-deployment created
I0113 10:34:31.326993   54599 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578911665-12565", Name:"nginx-deployment", UID:"82a5ad95-e73a-4251-a076-426009c384b4", APIVersion:"apps/v1", ResourceVersion:"1815", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-6986c7bc94 to 3
I0113 10:34:31.331286   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911665-12565", Name:"nginx-deployment-6986c7bc94", UID:"f25b976d-f45e-48b5-972b-8d322b2cf7eb", APIVersion:"apps/v1", ResourceVersion:"1816", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-hktgr
I0113 10:34:31.337829   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911665-12565", Name:"nginx-deployment-6986c7bc94", UID:"f25b976d-f45e-48b5-972b-8d322b2cf7eb", APIVersion:"apps/v1", ResourceVersion:"1816", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-4sd4w
I0113 10:34:31.338196   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911665-12565", Name:"nginx-deployment-6986c7bc94", UID:"f25b976d-f45e-48b5-972b-8d322b2cf7eb", APIVersion:"apps/v1", ResourceVersion:"1816", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-6tvnw
E0113 10:34:31.416482   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1152: Successful get deployment nginx-deployment {{.spec.replicas}}: 3
E0113 10:34:31.541808   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/nginx-deployment exposed
E0113 10:34:31.672789   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1156: Successful get service nginx-deployment {{(index .spec.ports 0).port}}: 80
deployment.apps "nginx-deployment" deleted
service "nginx-deployment" deleted
replicationcontroller/frontend created
I0113 10:34:32.083532   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578911665-12565", Name:"frontend", UID:"cd01bcfc-ceb0-4db3-b3ab-a491bd7e1502", APIVersion:"v1", ResourceVersion:"1845", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-ffhr2
I0113 10:34:32.086637   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578911665-12565", Name:"frontend", UID:"cd01bcfc-ceb0-4db3-b3ab-a491bd7e1502", APIVersion:"v1", ResourceVersion:"1845", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-9l685
I0113 10:34:32.093133   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578911665-12565", Name:"frontend", UID:"cd01bcfc-ceb0-4db3-b3ab-a491bd7e1502", APIVersion:"v1", ResourceVersion:"1845", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-bltmz
core.sh:1163: Successful get rc frontend {{.spec.replicas}}: 3
E0113 10:34:32.283419   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/frontend exposed
E0113 10:34:32.417850   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1167: Successful get service frontend {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: <no value> 80
E0113 10:34:32.543181   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/frontend-2 exposed
E0113 10:34:32.674092   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1171: Successful get service frontend-2 {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: <no value> 443
pod/valid-pod created
service/frontend-3 exposed
core.sh:1176: Successful get service frontend-3 {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: <no value> 444
E0113 10:34:33.284796   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/frontend-4 exposed
core.sh:1180: Successful get service frontend-4 {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: default 80
E0113 10:34:33.419176   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/frontend-5 exposed
E0113 10:34:33.544414   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1184: Successful get service frontend-5 {{(index .spec.ports 0).port}}: 80
E0113 10:34:33.675275   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
pod "valid-pod" deleted
service "frontend" deleted
service "frontend-2" deleted
service "frontend-3" deleted
service "frontend-4" deleted
service "frontend-5" deleted
Successful
message:error: cannot expose a Node
has:cannot expose
Successful
message:The Service "invalid-large-service-name-that-has-more-than-sixty-three-characters" is invalid: metadata.name: Invalid value: "invalid-large-service-name-that-has-more-than-sixty-three-characters": must be no more than 63 characters
has:metadata.name: Invalid value
E0113 10:34:34.286101   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:service/kubernetes-serve-hostname-testing-sixty-three-characters-in-len exposed
has:kubernetes-serve-hostname-testing-sixty-three-characters-in-len exposed
service "kubernetes-serve-hostname-testing-sixty-three-characters-in-len" deleted
E0113 10:34:34.420595   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:34:34.545742   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:service/etcd-server exposed
has:etcd-server exposed
core.sh:1214: Successful get service etcd-server {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: port-1 2380
E0113 10:34:34.676729   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1215: Successful get service etcd-server {{(index .spec.ports 1).name}} {{(index .spec.ports 1).port}}: port-2 2379
service "etcd-server" deleted
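The service checks above plausibly exercise kubectl expose against different source objects; the commands below are a sketch with names and ports read off the log (frontend-2/-4/-5 and the multi-port etcd-server case differ only in flags), and the "cannot expose a Node" and 63-character-name failures are the suite's negative cases.
  kubectl expose rc frontend --port=80                        # service/frontend
  kubectl expose pod valid-pod --port=444 --name=frontend-3   # service/frontend-3
  kubectl expose rc frontend --port=80 --name=frontend-5      # service/frontend-5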
core.sh:1221: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: frontend:
replicationcontroller "frontend" deleted
core.sh:1225: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
E0113 10:34:35.287297   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1229: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
E0113 10:34:35.422573   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/frontend created
E0113 10:34:35.546958   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0113 10:34:35.552925   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578911665-12565", Name:"frontend", UID:"121b6e9c-a9d2-4ffc-96db-23712fa3fac1", APIVersion:"v1", ResourceVersion:"1910", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-bpj2f
I0113 10:34:35.556847   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578911665-12565", Name:"frontend", UID:"121b6e9c-a9d2-4ffc-96db-23712fa3fac1", APIVersion:"v1", ResourceVersion:"1910", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-vtfvn
I0113 10:34:35.557003   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578911665-12565", Name:"frontend", UID:"121b6e9c-a9d2-4ffc-96db-23712fa3fac1", APIVersion:"v1", ResourceVersion:"1910", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-dkx6j
E0113 10:34:35.677996   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/redis-slave created
I0113 10:34:35.761781   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578911665-12565", Name:"redis-slave", UID:"3879e83e-d055-475d-8740-bba3083cee91", APIVersion:"v1", ResourceVersion:"1919", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-slave-gptj5
I0113 10:34:35.767154   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578911665-12565", Name:"redis-slave", UID:"3879e83e-d055-475d-8740-bba3083cee91", APIVersion:"v1", ResourceVersion:"1919", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-slave-drcf6
core.sh:1234: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: frontend:redis-slave:
core.sh:1238: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: frontend:redis-slave:
replicationcontroller "frontend" deleted
replicationcontroller "redis-slave" deleted
core.sh:1242: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
E0113 10:34:36.289037   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1246: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
E0113 10:34:36.424207   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/frontend created
I0113 10:34:36.547603   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578911665-12565", Name:"frontend", UID:"a58f244f-9e49-4e7f-880b-b8b93ab22350", APIVersion:"v1", ResourceVersion:"1938", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-6r897
E0113 10:34:36.548474   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0113 10:34:36.551364   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578911665-12565", Name:"frontend", UID:"a58f244f-9e49-4e7f-880b-b8b93ab22350", APIVersion:"v1", ResourceVersion:"1938", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-h4l76
I0113 10:34:36.552490   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578911665-12565", Name:"frontend", UID:"a58f244f-9e49-4e7f-880b-b8b93ab22350", APIVersion:"v1", ResourceVersion:"1938", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-894mz
core.sh:1249: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: frontend:
E0113 10:34:36.679435   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
horizontalpodautoscaler.autoscaling/frontend autoscaled
core.sh:1252: Successful get hpa frontend {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 1 2 70
horizontalpodautoscaler.autoscaling "frontend" deleted
horizontalpodautoscaler.autoscaling/frontend autoscaled
core.sh:1256: Successful get hpa frontend {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 2 3 80
E0113 10:34:37.290454   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
horizontalpodautoscaler.autoscaling "frontend" deleted
Error: required flag(s) "max" not set
E0113 10:34:37.425488   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller "frontend" deleted
E0113 10:34:37.549763   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1265: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
E0113 10:34:37.680751   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apiVersion: apps/v1
kind: Deployment
metadata:
  creationTimestamp: null
  labels:
    name: nginx-deployment-resources
... skipping 22 lines ...
          limits:
            cpu: 300m
          requests:
            cpu: 300m
      terminationGracePeriodSeconds: 0
status: {}
Error from server (NotFound): deployments.apps "nginx-deployment-resources" not found
deployment.apps/nginx-deployment-resources created
I0113 10:34:38.019103   54599 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578911665-12565", Name:"nginx-deployment-resources", UID:"c436f937-537f-4ad0-909d-f482ac44ac4d", APIVersion:"apps/v1", ResourceVersion:"1960", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-resources-67f8cfff5 to 3
I0113 10:34:38.023447   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911665-12565", Name:"nginx-deployment-resources-67f8cfff5", UID:"913258a5-5ed1-446d-a840-50f7db18e8c1", APIVersion:"apps/v1", ResourceVersion:"1961", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-67f8cfff5-kd9wv
I0113 10:34:38.032473   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911665-12565", Name:"nginx-deployment-resources-67f8cfff5", UID:"913258a5-5ed1-446d-a840-50f7db18e8c1", APIVersion:"apps/v1", ResourceVersion:"1961", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-67f8cfff5-2pc9l
I0113 10:34:38.032523   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911665-12565", Name:"nginx-deployment-resources-67f8cfff5", UID:"913258a5-5ed1-446d-a840-50f7db18e8c1", APIVersion:"apps/v1", ResourceVersion:"1961", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-67f8cfff5-bdrcx
core.sh:1271: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx-deployment-resources:
core.sh:1272: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
E0113 10:34:38.291873   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1273: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
E0113 10:34:38.427088   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-deployment-resources resource requirements updated
I0113 10:34:38.508607   54599 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578911665-12565", Name:"nginx-deployment-resources", UID:"c436f937-537f-4ad0-909d-f482ac44ac4d", APIVersion:"apps/v1", ResourceVersion:"1974", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-resources-55c547f795 to 1
I0113 10:34:38.512523   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911665-12565", Name:"nginx-deployment-resources-55c547f795", UID:"57e3260e-1eed-4e5a-b7af-198479d0eceb", APIVersion:"apps/v1", ResourceVersion:"1975", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-55c547f795-sdg9l
E0113 10:34:38.551247   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1276: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).resources.limits.cpu}}:{{end}}: 100m:
E0113 10:34:38.681968   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1277: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.limits.cpu}}:{{end}}: 100m:
error: unable to find container named redis
deployment.apps/nginx-deployment-resources resource requirements updated
I0113 10:34:38.977501   54599 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578911665-12565", Name:"nginx-deployment-resources", UID:"c436f937-537f-4ad0-909d-f482ac44ac4d", APIVersion:"apps/v1", ResourceVersion:"1984", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-resources-55c547f795 to 0
I0113 10:34:38.986157   54599 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578911665-12565", Name:"nginx-deployment-resources", UID:"c436f937-537f-4ad0-909d-f482ac44ac4d", APIVersion:"apps/v1", ResourceVersion:"1986", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-resources-6d86564b45 to 1
I0113 10:34:38.986624   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911665-12565", Name:"nginx-deployment-resources-55c547f795", UID:"57e3260e-1eed-4e5a-b7af-198479d0eceb", APIVersion:"apps/v1", ResourceVersion:"1988", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-resources-55c547f795-sdg9l
I0113 10:34:38.993681   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911665-12565", Name:"nginx-deployment-resources-6d86564b45", UID:"bb1150eb-1524-4a0b-a53f-b6243c99fe3e", APIVersion:"apps/v1", ResourceVersion:"1991", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-6d86564b45-9kd7b
core.sh:1282: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).resources.limits.cpu}}:{{end}}: 200m:
(Bcore.sh:1283: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.limits.cpu}}:{{end}}: 100m:
E0113 10:34:39.293213   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-deployment-resources resource requirements updated
I0113 10:34:39.332591   54599 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578911665-12565", Name:"nginx-deployment-resources", UID:"c436f937-537f-4ad0-909d-f482ac44ac4d", APIVersion:"apps/v1", ResourceVersion:"2004", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-resources-67f8cfff5 to 2
I0113 10:34:39.339242   54599 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578911665-12565", Name:"nginx-deployment-resources", UID:"c436f937-537f-4ad0-909d-f482ac44ac4d", APIVersion:"apps/v1", ResourceVersion:"2006", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-resources-6c478d4fdb to 1
I0113 10:34:39.343523   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911665-12565", Name:"nginx-deployment-resources-67f8cfff5", UID:"913258a5-5ed1-446d-a840-50f7db18e8c1", APIVersion:"apps/v1", ResourceVersion:"2008", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-resources-67f8cfff5-bdrcx
I0113 10:34:39.345647   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911665-12565", Name:"nginx-deployment-resources-6c478d4fdb", UID:"89ef1624-908a-41be-bf44-5c04b10e2eb1", APIVersion:"apps/v1", ResourceVersion:"2011", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-6c478d4fdb-smf4q
E0113 10:34:39.428835   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1286: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).resources.limits.cpu}}:{{end}}: 200m:
E0113 10:34:39.552521   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1287: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.limits.cpu}}:{{end}}: 300m:
E0113 10:34:39.683320   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1288: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.requests.cpu}}:{{end}}: 300m:
apiVersion: apps/v1
kind: Deployment
metadata:
  annotations:
    deployment.kubernetes.io/revision: "4"
... skipping 66 lines ...
    status: "True"
    type: Progressing
  observedGeneration: 4
  replicas: 4
  unavailableReplicas: 4
  updatedReplicas: 1
error: you must specify resources by --filename when --local is set.
Example resource specifications include:
   '-f rsrc.yaml'
   '--filename=rsrc.json'
core.sh:1292: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).resources.limits.cpu}}:{{end}}: 200m:
(Bcore.sh:1293: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.limits.cpu}}:{{end}}: 300m:
(Bcore.sh:1294: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.requests.cpu}}:{{end}}: 300m:
E0113 10:34:40.294750   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps "nginx-deployment-resources" deleted
+++ exit code: 0
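The resource-limit checks above (core.sh:1271 through core.sh:1294) plausibly exercise kubectl set resources; container names below are inferred from the nginx and perl images in the pod template, so treat this as a sketch rather than the exact core.sh invocations.
  kubectl set resources deployment nginx-deployment-resources --limits=cpu=200m                              # no -c: applies to every container in the template
  kubectl set resources deployment nginx-deployment-resources -c=perl --limits=cpu=300m --requests=cpu=300m  # update one container by name
  kubectl set resources deployment nginx-deployment-resources -c=redis --limits=cpu=100m                     # fails: unable to find container named redis
  kubectl set resources deployment nginx-deployment-resources --limits=cpu=200m --local -o yaml              # fails without -f: --local requires --filename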
E0113 10:34:40.430349   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Recording: run_deployment_tests
Running command: run_deployment_tests

+++ Running case: test-cmd.run_deployment_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_deployment_tests
+++ [0113 10:34:40] Creating namespace namespace-1578911680-2310
E0113 10:34:40.553896   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/namespace-1578911680-2310 created
E0113 10:34:40.685021   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Context "test" modified.
+++ [0113 10:34:40] Testing deployments
deployment.apps/test-nginx-extensions created
I0113 10:34:40.864107   54599 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578911680-2310", Name:"test-nginx-extensions", UID:"c6543827-8a2c-4155-b723-d6171dd73d36", APIVersion:"apps/v1", ResourceVersion:"2043", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set test-nginx-extensions-5559c76db7 to 1
I0113 10:34:40.873595   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911680-2310", Name:"test-nginx-extensions-5559c76db7", UID:"b7780290-ea3a-45b6-8192-ee118f357df9", APIVersion:"apps/v1", ResourceVersion:"2044", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test-nginx-extensions-5559c76db7-bpl56
apps.sh:185: Successful get deploy test-nginx-extensions {{(index .spec.template.spec.containers 0).name}}: nginx
Successful
message:10
has not:2
Successful
message:apps/v1
has:apps/v1
E0113 10:34:41.296155   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps "test-nginx-extensions" deleted
E0113 10:34:41.433280   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/test-nginx-apps created
I0113 10:34:41.441001   54599 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578911680-2310", Name:"test-nginx-apps", UID:"1157a243-eb11-4959-abc1-060bb95aae4b", APIVersion:"apps/v1", ResourceVersion:"2058", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set test-nginx-apps-79b9bd9585 to 1
I0113 10:34:41.448789   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911680-2310", Name:"test-nginx-apps-79b9bd9585", UID:"20beb82f-c9e5-42e5-bf24-bf04e13128c6", APIVersion:"apps/v1", ResourceVersion:"2060", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test-nginx-apps-79b9bd9585-cv4kn
E0113 10:34:41.555166   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:198: Successful get deploy test-nginx-apps {{(index .spec.template.spec.containers 0).name}}: nginx
E0113 10:34:41.686460   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:10
has:10
Successful
message:apps/v1
has:apps/v1
... skipping 13 lines ...
                pod-template-hash=79b9bd9585
Annotations:    deployment.kubernetes.io/desired-replicas: 1
                deployment.kubernetes.io/max-replicas: 2
                deployment.kubernetes.io/revision: 1
Controlled By:  Deployment/test-nginx-apps
Replicas:       1 current / 1 desired
Pods Status:    0 Running / 1 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=test-nginx-apps
           pod-template-hash=79b9bd9585
  Containers:
   nginx:
    Image:        k8s.gcr.io/nginx:test-cmd
... skipping 34 lines ...
Volumes:          <none>
QoS Class:        BestEffort
Node-Selectors:   <none>
Tolerations:      <none>
Events:           <none>
deployment.apps "test-nginx-apps" deleted
E0113 10:34:42.297515   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:214: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
E0113 10:34:42.434372   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-with-command created
I0113 10:34:42.451856   54599 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578911680-2310", Name:"nginx-with-command", UID:"1bea3be7-1ea1-4cbd-ab65-9df0d459b5b7", APIVersion:"apps/v1", ResourceVersion:"2073", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-with-command-757c6f58dd to 1
I0113 10:34:42.456871   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911680-2310", Name:"nginx-with-command-757c6f58dd", UID:"db5a12a7-da4e-4631-8219-0aa33ca54327", APIVersion:"apps/v1", ResourceVersion:"2074", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-with-command-757c6f58dd-q5hwc
E0113 10:34:42.557083   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:218: Successful get deploy nginx-with-command {{(index .spec.template.spec.containers 0).name}}: nginx
deployment.apps "nginx-with-command" deleted
E0113 10:34:42.687545   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:224: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
deployment.apps/deployment-with-unixuserid created
I0113 10:34:43.031509   54599 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578911680-2310", Name:"deployment-with-unixuserid", UID:"f5e4a34c-d3aa-4d62-9f54-93c4ada5888d", APIVersion:"apps/v1", ResourceVersion:"2087", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set deployment-with-unixuserid-8fcdfc94f to 1
I0113 10:34:43.036352   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911680-2310", Name:"deployment-with-unixuserid-8fcdfc94f", UID:"fea978fe-4126-46d4-a288-dcf06b3dd577", APIVersion:"apps/v1", ResourceVersion:"2088", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: deployment-with-unixuserid-8fcdfc94f-lmvkb
apps.sh:228: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: deployment-with-unixuserid:
deployment.apps "deployment-with-unixuserid" deleted
E0113 10:34:43.298805   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:235: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
E0113 10:34:43.435717   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:34:43.558347   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-deployment created
I0113 10:34:43.613482   54599 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578911680-2310", Name:"nginx-deployment", UID:"36955fb6-06e6-4fdc-a024-7ae32b6bc51e", APIVersion:"apps/v1", ResourceVersion:"2103", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-6986c7bc94 to 3
I0113 10:34:43.618695   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911680-2310", Name:"nginx-deployment-6986c7bc94", UID:"2176f4f4-3f02-4cff-b696-9d6acf75d8ac", APIVersion:"apps/v1", ResourceVersion:"2104", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-6zv94
I0113 10:34:43.622075   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911680-2310", Name:"nginx-deployment-6986c7bc94", UID:"2176f4f4-3f02-4cff-b696-9d6acf75d8ac", APIVersion:"apps/v1", ResourceVersion:"2104", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-6bsw7
I0113 10:34:43.622509   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911680-2310", Name:"nginx-deployment-6986c7bc94", UID:"2176f4f4-3f02-4cff-b696-9d6acf75d8ac", APIVersion:"apps/v1", ResourceVersion:"2104", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-8jm7d
E0113 10:34:43.688888   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:239: Successful get rs {{range.items}}{{.spec.replicas}}{{end}}: 3
deployment.apps "nginx-deployment" deleted
apps.sh:242: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
apps.sh:246: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
apps.sh:247: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
E0113 10:34:44.300286   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-deployment created
I0113 10:34:44.306962   54599 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578911680-2310", Name:"nginx-deployment", UID:"46295252-be2e-4332-a873-f2b2c3f8029c", APIVersion:"apps/v1", ResourceVersion:"2125", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-7f6fc565b9 to 1
I0113 10:34:44.310528   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911680-2310", Name:"nginx-deployment-7f6fc565b9", UID:"1024c124-53ab-47b8-99f3-7bcf268eeff7", APIVersion:"apps/v1", ResourceVersion:"2126", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-7f6fc565b9-bnw89
E0113 10:34:44.439414   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:251: Successful get rs {{range.items}}{{.spec.replicas}}{{end}}: 1
deployment.apps "nginx-deployment" deleted
E0113 10:34:44.560146   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:34:44.690288   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:256: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
apps.sh:257: Successful get rs {{range.items}}{{.spec.replicas}}{{end}}: 1
replicaset.apps "nginx-deployment-7f6fc565b9" deleted
apps.sh:265: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
E0113 10:34:45.301583   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-deployment created
I0113 10:34:45.331122   54599 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578911680-2310", Name:"nginx-deployment", UID:"7602651e-5e20-4e12-9416-7ba7bcd8f8c5", APIVersion:"apps/v1", ResourceVersion:"2143", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-6986c7bc94 to 3
I0113 10:34:45.334147   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911680-2310", Name:"nginx-deployment-6986c7bc94", UID:"622bb6be-8b5b-4007-a1c8-7d4519f0d250", APIVersion:"apps/v1", ResourceVersion:"2144", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-vsqjr
I0113 10:34:45.337782   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911680-2310", Name:"nginx-deployment-6986c7bc94", UID:"622bb6be-8b5b-4007-a1c8-7d4519f0d250", APIVersion:"apps/v1", ResourceVersion:"2144", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-xcm28
I0113 10:34:45.340295   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911680-2310", Name:"nginx-deployment-6986c7bc94", UID:"622bb6be-8b5b-4007-a1c8-7d4519f0d250", APIVersion:"apps/v1", ResourceVersion:"2144", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-bs4bs
E0113 10:34:45.440721   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:268: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx-deployment:
E0113 10:34:45.561469   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
horizontalpodautoscaler.autoscaling/nginx-deployment autoscaled
E0113 10:34:45.691582   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:271: Successful get hpa nginx-deployment {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 2 3 80
horizontalpodautoscaler.autoscaling "nginx-deployment" deleted
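
The autoscale step above corresponds, roughly, to the following kubectl invocations (a sketch for orientation; the exact flags live in apps.sh and are not reproduced in this log; the 2/3/80 values are the ones checked by the apps.sh:271 assertion):
  kubectl autoscale deployment nginx-deployment --min=2 --max=3 --cpu-percent=80
  # verify the fields the test asserts on
  kubectl get hpa nginx-deployment -o go-template='{{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}'
  kubectl delete hpa nginx-deployment
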
deployment.apps "nginx-deployment" deleted
apps.sh:279: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
deployment.apps/nginx created
I0113 10:34:46.252632   54599 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578911680-2310", Name:"nginx", UID:"add25575-3b11-4622-9e1e-eae21964dfb1", APIVersion:"apps/v1", ResourceVersion:"2169", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-f87d999f7 to 3
I0113 10:34:46.257812   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911680-2310", Name:"nginx-f87d999f7", UID:"59e8e9d7-7a5a-4c45-a7a1-167b5bd5d8ae", APIVersion:"apps/v1", ResourceVersion:"2170", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-f87d999f7-j85pn
I0113 10:34:46.261717   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911680-2310", Name:"nginx-f87d999f7", UID:"59e8e9d7-7a5a-4c45-a7a1-167b5bd5d8ae", APIVersion:"apps/v1", ResourceVersion:"2170", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-f87d999f7-vtqst
I0113 10:34:46.261809   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911680-2310", Name:"nginx-f87d999f7", UID:"59e8e9d7-7a5a-4c45-a7a1-167b5bd5d8ae", APIVersion:"apps/v1", ResourceVersion:"2170", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-f87d999f7-ztrsl
E0113 10:34:46.302893   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:283: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx:
E0113 10:34:46.442147   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:284: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
E0113 10:34:46.562784   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx skipped rollback (current template already matches revision 1)
E0113 10:34:46.693294   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:287: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
Warning: kubectl apply should be used on resource created by either kubectl create --save-config or kubectl apply
deployment.apps/nginx configured
I0113 10:34:46.936174   54599 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578911680-2310", Name:"nginx", UID:"add25575-3b11-4622-9e1e-eae21964dfb1", APIVersion:"apps/v1", ResourceVersion:"2183", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-78487f9fd7 to 1
I0113 10:34:46.957334   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911680-2310", Name:"nginx-78487f9fd7", UID:"831ad317-89ad-430b-80fe-2ae617b62bb6", APIVersion:"apps/v1", ResourceVersion:"2184", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-78487f9fd7-gmvmd
apps.sh:290: Successful get deployment.apps {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
    Image:	k8s.gcr.io/nginx:test-cmd
E0113 10:34:47.304492   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:293: Successful get deployment.apps {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
deployment.apps/nginx rolled back
E0113 10:34:47.443273   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:34:47.564424   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:34:47.694621   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:34:48.305812   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:34:48.444755   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:34:48.565634   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:297: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
E0113 10:34:48.695962   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
error: unable to find specified revision 1000000 in history
apps.sh:300: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
deployment.apps/nginx rolled back
E0113 10:34:49.307039   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:34:49.445975   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:34:49.567100   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:34:49.697266   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:304: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
deployment.apps/nginx paused
E0113 10:34:50.308545   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
error: you cannot rollback a paused deployment; resume it first with 'kubectl rollout resume deployment/nginx' and try again
E0113 10:34:50.449221   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
error: deployments.apps "nginx" can't restart paused deployment (run rollout resume first)
E0113 10:34:50.568569   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx resumed
E0113 10:34:50.698575   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx rolled back
    deployment.kubernetes.io/revision-history: 1,3
error: desired revision (3) is different from the running revision (5)
E0113 10:34:51.309835   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx restarted
I0113 10:34:51.346194   54599 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578911680-2310", Name:"nginx", UID:"add25575-3b11-4622-9e1e-eae21964dfb1", APIVersion:"apps/v1", ResourceVersion:"2216", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-78487f9fd7 to 0
I0113 10:34:51.355432   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911680-2310", Name:"nginx-78487f9fd7", UID:"831ad317-89ad-430b-80fe-2ae617b62bb6", APIVersion:"apps/v1", ResourceVersion:"2220", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-78487f9fd7-gmvmd
I0113 10:34:51.357057   54599 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578911680-2310", Name:"nginx", UID:"add25575-3b11-4622-9e1e-eae21964dfb1", APIVersion:"apps/v1", ResourceVersion:"2219", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-67bd477485 to 1
I0113 10:34:51.366250   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911680-2310", Name:"nginx-67bd477485", UID:"41530556-26a8-4f83-9c3f-4282890fdeea", APIVersion:"apps/v1", ResourceVersion:"2224", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-67bd477485-f4s99
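
The rollback/pause/resume/restart sequence above exercises the kubectl rollout family; a sketch of the command shapes involved (not the literal apps.sh invocations, which this log does not show):
  kubectl rollout undo deployment/nginx                          # "rolled back"
  kubectl rollout undo deployment/nginx --to-revision=1000000    # fails: revision not found in history
  kubectl rollout pause deployment/nginx
  kubectl rollout undo deployment/nginx       # rejected while paused
  kubectl rollout restart deployment/nginx    # rejected while paused
  kubectl rollout resume deployment/nginx
  kubectl rollout restart deployment/nginx    # produces the scale-down/scale-up events above
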
E0113 10:34:51.450638   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:34:51.570049   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:34:51.700026   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0113 10:34:51.771991   54599 horizontal.go:353] Horizontal Pod Autoscaler frontend has been deleted in namespace-1578911665-12565
E0113 10:34:52.311334   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:34:52.452041   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:34:52.571257   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:apiVersion: apps/v1
kind: ReplicaSet
metadata:
  annotations:
    deployment.kubernetes.io/desired-replicas: "3"
... skipping 48 lines ...
      terminationGracePeriodSeconds: 30
status:
  fullyLabeledReplicas: 1
  observedGeneration: 2
  replicas: 1
has:deployment.kubernetes.io/revision: "6"
E0113 10:34:52.701274   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx2 created
I0113 10:34:52.805869   54599 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578911680-2310", Name:"nginx2", UID:"77714b16-0dcd-496a-8b35-14274a88a4b6", APIVersion:"apps/v1", ResourceVersion:"2239", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx2-57b7865cd9 to 3
I0113 10:34:52.811985   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911680-2310", Name:"nginx2-57b7865cd9", UID:"c9989df4-278a-4857-9608-c2d47a06c13f", APIVersion:"apps/v1", ResourceVersion:"2240", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx2-57b7865cd9-6bwgf
I0113 10:34:52.818739   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911680-2310", Name:"nginx2-57b7865cd9", UID:"c9989df4-278a-4857-9608-c2d47a06c13f", APIVersion:"apps/v1", ResourceVersion:"2240", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx2-57b7865cd9-lnsms
I0113 10:34:52.818777   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911680-2310", Name:"nginx2-57b7865cd9", UID:"c9989df4-278a-4857-9608-c2d47a06c13f", APIVersion:"apps/v1", ResourceVersion:"2240", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx2-57b7865cd9-f9dkc
deployment.apps "nginx2" deleted
deployment.apps "nginx" deleted
apps.sh:334: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
E0113 10:34:53.312939   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-deployment created
I0113 10:34:53.382812   54599 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578911680-2310", Name:"nginx-deployment", UID:"010df211-20b5-4d15-bf60-845e42220b58", APIVersion:"apps/v1", ResourceVersion:"2273", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-598d4d68b4 to 3
I0113 10:34:53.386510   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911680-2310", Name:"nginx-deployment-598d4d68b4", UID:"d8f31bf0-56c8-49f8-a5dd-fd7d90a147c6", APIVersion:"apps/v1", ResourceVersion:"2274", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-598d4d68b4-hqpns
I0113 10:34:53.393865   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911680-2310", Name:"nginx-deployment-598d4d68b4", UID:"d8f31bf0-56c8-49f8-a5dd-fd7d90a147c6", APIVersion:"apps/v1", ResourceVersion:"2274", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-598d4d68b4-pd4mn
I0113 10:34:53.393917   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911680-2310", Name:"nginx-deployment-598d4d68b4", UID:"d8f31bf0-56c8-49f8-a5dd-fd7d90a147c6", APIVersion:"apps/v1", ResourceVersion:"2274", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-598d4d68b4-9lxg6
E0113 10:34:53.453521   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:337: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx-deployment:
E0113 10:34:53.572782   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:338: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
E0113 10:34:53.702682   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:339: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
deployment.apps/nginx-deployment image updated
I0113 10:34:53.886722   54599 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578911680-2310", Name:"nginx-deployment", UID:"010df211-20b5-4d15-bf60-845e42220b58", APIVersion:"apps/v1", ResourceVersion:"2289", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-59df9b5f5b to 1
I0113 10:34:53.892358   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911680-2310", Name:"nginx-deployment-59df9b5f5b", UID:"2e41ffdd-816f-4cdb-bbc6-316418c2417b", APIVersion:"apps/v1", ResourceVersion:"2290", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-59df9b5f5b-jnzkn
apps.sh:342: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
apps.sh:343: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
error: unable to find container named "redis"
E0113 10:34:54.314234   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-deployment image updated
E0113 10:34:54.455906   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:348: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
E0113 10:34:54.574024   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:349: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
deployment.apps/nginx-deployment image updated
E0113 10:34:54.703555   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:352: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
apps.sh:353: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
apps.sh:356: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
apps.sh:357: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
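
The "image updated" lines above come from kubectl set image against the deployment; a sketch of their shape (container and image names are taken from the assertions above, and <some-image> is a placeholder because the image passed for the rejected "redis" container is not visible in this log):
  kubectl set image deployment/nginx-deployment nginx=k8s.gcr.io/nginx:1.7.9
  kubectl set image deployment/nginx-deployment redis=<some-image>              # rejected: no container named "redis"
  kubectl set image deployment/nginx-deployment '*'=k8s.gcr.io/nginx:test-cmd   # '*' targets every container
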
E0113 10:34:55.315459   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-deployment image updated
I0113 10:34:55.406767   54599 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578911680-2310", Name:"nginx-deployment", UID:"010df211-20b5-4d15-bf60-845e42220b58", APIVersion:"apps/v1", ResourceVersion:"2307", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-598d4d68b4 to 2
I0113 10:34:55.414431   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911680-2310", Name:"nginx-deployment-598d4d68b4", UID:"d8f31bf0-56c8-49f8-a5dd-fd7d90a147c6", APIVersion:"apps/v1", ResourceVersion:"2311", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-598d4d68b4-hqpns
I0113 10:34:55.420453   54599 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578911680-2310", Name:"nginx-deployment", UID:"010df211-20b5-4d15-bf60-845e42220b58", APIVersion:"apps/v1", ResourceVersion:"2309", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-7d758dbc54 to 1
I0113 10:34:55.440987   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911680-2310", Name:"nginx-deployment-7d758dbc54", UID:"f324468a-91df-4b91-9d54-4051c6d24aee", APIVersion:"apps/v1", ResourceVersion:"2317", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-7d758dbc54-vvtlr
E0113 10:34:55.456876   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:360: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
E0113 10:34:55.575211   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:361: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
E0113 10:34:55.705175   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:364: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
apps.sh:365: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
deployment.apps "nginx-deployment" deleted
apps.sh:371: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
E0113 10:34:56.316919   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-deployment created
I0113 10:34:56.396764   54599 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578911680-2310", Name:"nginx-deployment", UID:"00fa2458-c60e-4850-aef0-7632751bfa0c", APIVersion:"apps/v1", ResourceVersion:"2342", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-598d4d68b4 to 3
I0113 10:34:56.399899   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911680-2310", Name:"nginx-deployment-598d4d68b4", UID:"543fde2e-3b70-484c-8cad-80eae1724756", APIVersion:"apps/v1", ResourceVersion:"2343", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-598d4d68b4-k7flh
I0113 10:34:56.404632   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911680-2310", Name:"nginx-deployment-598d4d68b4", UID:"543fde2e-3b70-484c-8cad-80eae1724756", APIVersion:"apps/v1", ResourceVersion:"2343", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-598d4d68b4-r87sb
I0113 10:34:56.406391   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911680-2310", Name:"nginx-deployment-598d4d68b4", UID:"543fde2e-3b70-484c-8cad-80eae1724756", APIVersion:"apps/v1", ResourceVersion:"2343", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-598d4d68b4-cdmv2
E0113 10:34:56.458117   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:34:56.576621   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
configmap/test-set-env-config created
E0113 10:34:56.706570   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
secret/test-set-env-secret created
apps.sh:376: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx-deployment:
apps.sh:378: Successful get configmaps/test-set-env-config {{.metadata.name}}: test-set-env-config
apps.sh:379: Successful get secret {{range.items}}{{.metadata.name}}:{{end}}: test-set-env-secret:
deployment.apps/nginx-deployment env updated
I0113 10:34:57.309662   54599 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578911680-2310", Name:"nginx-deployment", UID:"00fa2458-c60e-4850-aef0-7632751bfa0c", APIVersion:"apps/v1", ResourceVersion:"2358", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-6b9f7756b4 to 1
I0113 10:34:57.318111   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911680-2310", Name:"nginx-deployment-6b9f7756b4", UID:"0a78ccf9-8208-4208-973a-0df9afda227b", APIVersion:"apps/v1", ResourceVersion:"2359", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6b9f7756b4-pbkdf
E0113 10:34:57.318431   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:383: Successful get deploy nginx-deployment {{ (index (index .spec.template.spec.containers 0).env 0).name}}: KEY_2
E0113 10:34:57.459426   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:385: Successful get deploy nginx-deployment {{ len (index .spec.template.spec.containers 0).env }}: 1
E0113 10:34:57.577901   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-deployment env updated
I0113 10:34:57.683904   54599 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578911680-2310", Name:"nginx-deployment", UID:"00fa2458-c60e-4850-aef0-7632751bfa0c", APIVersion:"apps/v1", ResourceVersion:"2370", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-598d4d68b4 to 2
I0113 10:34:57.692034   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911680-2310", Name:"nginx-deployment-598d4d68b4", UID:"543fde2e-3b70-484c-8cad-80eae1724756", APIVersion:"apps/v1", ResourceVersion:"2374", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-598d4d68b4-k7flh
I0113 10:34:57.700409   54599 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578911680-2310", Name:"nginx-deployment", UID:"00fa2458-c60e-4850-aef0-7632751bfa0c", APIVersion:"apps/v1", ResourceVersion:"2373", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-754bf964c8 to 1
I0113 10:34:57.705768   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911680-2310", Name:"nginx-deployment-754bf964c8", UID:"5f20822f-314c-4bab-abdd-e91a29a98749", APIVersion:"apps/v1", ResourceVersion:"2380", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-754bf964c8-t9jjt
E0113 10:34:57.707390   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:389: Successful get deploy nginx-deployment {{ len (index .spec.template.spec.containers 0).env }}: 2
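
The "env updated" lines come from kubectl set env; a sketch of the likely shapes (only KEY_2 and the configmap/secret names are visible in this log, so <value> is a placeholder):
  kubectl set env deployment/nginx-deployment KEY_2=<value>
  kubectl set env deployment/nginx-deployment --from=configmap/test-set-env-config
  kubectl set env deployment/nginx-deployment --from=secret/test-set-env-secret
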
deployment.apps/nginx-deployment env updated
I0113 10:34:57.934385   54599 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578911680-2310", Name:"nginx-deployment", UID:"00fa2458-c60e-4850-aef0-7632751bfa0c", APIVersion:"apps/v1", ResourceVersion:"2390", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-598d4d68b4 to 1
I0113 10:34:57.942780   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911680-2310", Name:"nginx-deployment-598d4d68b4", UID:"543fde2e-3b70-484c-8cad-80eae1724756", APIVersion:"apps/v1", ResourceVersion:"2394", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-598d4d68b4-r87sb
I0113 10:34:57.945469   54599 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578911680-2310", Name:"nginx-deployment", UID:"00fa2458-c60e-4850-aef0-7632751bfa0c", APIVersion:"apps/v1", ResourceVersion:"2393", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-c6d5c5c7b to 1
I0113 10:34:57.952366   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911680-2310", Name:"nginx-deployment-c6d5c5c7b", UID:"43e963c0-6035-4943-8d78-1dc8f5a1de03", APIVersion:"apps/v1", ResourceVersion:"2398", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-c6d5c5c7b-bpplm
... skipping 3 lines ...
I0113 10:34:58.077773   54599 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578911680-2310", Name:"nginx-deployment", UID:"00fa2458-c60e-4850-aef0-7632751bfa0c", APIVersion:"apps/v1", ResourceVersion:"2412", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-5958f7687 to 1
I0113 10:34:58.085003   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911680-2310", Name:"nginx-deployment-5958f7687", UID:"cc91abf7-3525-4485-bdd8-23c2db28847f", APIVersion:"apps/v1", ResourceVersion:"2418", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-5958f7687-m6k4j
deployment.apps/nginx-deployment env updated
I0113 10:34:58.228329   54599 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578911680-2310", Name:"nginx-deployment", UID:"00fa2458-c60e-4850-aef0-7632751bfa0c", APIVersion:"apps/v1", ResourceVersion:"2431", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-6b9f7756b4 to 0
I0113 10:34:58.236297   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911680-2310", Name:"nginx-deployment-6b9f7756b4", UID:"0a78ccf9-8208-4208-973a-0df9afda227b", APIVersion:"apps/v1", ResourceVersion:"2435", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-6b9f7756b4-pbkdf
I0113 10:34:58.240681   54599 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578911680-2310", Name:"nginx-deployment", UID:"00fa2458-c60e-4850-aef0-7632751bfa0c", APIVersion:"apps/v1", ResourceVersion:"2434", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-98b7fd455 to 1
E0113 10:34:58.324096   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-deployment env updated
I0113 10:34:58.375850   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911680-2310", Name:"nginx-deployment-98b7fd455", UID:"98f261cd-7b3c-4b27-b1f3-259241e0090d", APIVersion:"apps/v1", ResourceVersion:"2439", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-98b7fd455-hlz9w
E0113 10:34:58.460771   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-deployment env updated
I0113 10:34:58.509940   54599 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578911680-2310", Name:"nginx-deployment", UID:"00fa2458-c60e-4850-aef0-7632751bfa0c", APIVersion:"apps/v1", ResourceVersion:"2444", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-754bf964c8 to 0
E0113 10:34:58.579037   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps "nginx-deployment" deleted
I0113 10:34:58.660299   54599 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578911680-2310", Name:"nginx-deployment", UID:"00fa2458-c60e-4850-aef0-7632751bfa0c", APIVersion:"apps/v1", ResourceVersion:"2449", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-868b664cb5 to 1
I0113 10:34:58.674003   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911680-2310", Name:"nginx-deployment-754bf964c8", UID:"5f20822f-314c-4bab-abdd-e91a29a98749", APIVersion:"apps/v1", ResourceVersion:"2451", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-754bf964c8-t9jjt
configmap "test-set-env-config" deleted
E0113 10:34:58.708559   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:34:58.774038   54599 replica_set.go:534] sync "namespace-1578911680-2310/nginx-deployment-6b9f7756b4" failed with replicasets.apps "nginx-deployment-6b9f7756b4" not found
secret "test-set-env-secret" deleted
I0113 10:34:58.826087   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911680-2310", Name:"nginx-deployment-868b664cb5", UID:"6341044a-edd7-45b5-9cb0-0b77f9bec12d", APIVersion:"apps/v1", ResourceVersion:"2471", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-868b664cb5-trlqb
+++ exit code: 0
Recording: run_rs_tests
Running command: run_rs_tests

+++ Running case: test-cmd.run_rs_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_rs_tests
+++ [0113 10:34:58] Creating namespace namespace-1578911698-31004
E0113 10:34:58.973956   54599 replica_set.go:534] sync "namespace-1578911680-2310/nginx-deployment-98b7fd455" failed with replicasets.apps "nginx-deployment-98b7fd455" not found
namespace/namespace-1578911698-31004 created
Context "test" modified.
E0113 10:34:59.124185   54599 replica_set.go:534] sync "namespace-1578911680-2310/nginx-deployment-754bf964c8" failed with replicasets.apps "nginx-deployment-754bf964c8" not found
+++ [0113 10:34:59] Testing kubectl(v1:replicasets)
E0113 10:34:59.173923   54599 replica_set.go:534] sync "namespace-1578911680-2310/nginx-deployment-868b664cb5" failed with replicasets.apps "nginx-deployment-868b664cb5" not found
apps.sh:511: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
E0113 10:34:59.325667   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicaset.apps/frontend created
I0113 10:34:59.446069   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911698-31004", Name:"frontend", UID:"4080c9c3-2ab0-47a5-9eb7-75e0493ca7a1", APIVersion:"apps/v1", ResourceVersion:"2483", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-9vhvp
I0113 10:34:59.450711   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911698-31004", Name:"frontend", UID:"4080c9c3-2ab0-47a5-9eb7-75e0493ca7a1", APIVersion:"apps/v1", ResourceVersion:"2483", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-t96xg
I0113 10:34:59.450743   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911698-31004", Name:"frontend", UID:"4080c9c3-2ab0-47a5-9eb7-75e0493ca7a1", APIVersion:"apps/v1", ResourceVersion:"2483", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-hwvc4
+++ [0113 10:34:59] Deleting rs
E0113 10:34:59.461905   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicaset.apps "frontend" deleted
E0113 10:34:59.580443   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:34:59.627787   54599 replica_set.go:534] sync "namespace-1578911698-31004/frontend" failed with replicasets.apps "frontend" not found
apps.sh:517: Successful get pods -l "tier=frontend" {{range.items}}{{.metadata.name}}:{{end}}: 
E0113 10:34:59.709780   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:521: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
replicaset.apps/frontend created
I0113 10:35:00.045531   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911698-31004", Name:"frontend", UID:"cd666f46-4c9d-4b35-8d7e-02ffe91aadac", APIVersion:"apps/v1", ResourceVersion:"2501", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-6gh2c
I0113 10:35:00.049400   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911698-31004", Name:"frontend", UID:"cd666f46-4c9d-4b35-8d7e-02ffe91aadac", APIVersion:"apps/v1", ResourceVersion:"2501", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-c4sgd
I0113 10:35:00.058409   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911698-31004", Name:"frontend", UID:"cd666f46-4c9d-4b35-8d7e-02ffe91aadac", APIVersion:"apps/v1", ResourceVersion:"2501", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-bmtv5
apps.sh:525: Successful get pods -l "tier=frontend" {{range.items}}{{(index .spec.containers 0).name}}:{{end}}: php-redis:php-redis:php-redis:
+++ [0113 10:35:00] Deleting rs
replicaset.apps "frontend" deleted
E0113 10:35:00.327014   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:35:00.463405   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:529: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
I0113 10:35:00.576639   54599 horizontal.go:353] Horizontal Pod Autoscaler nginx-deployment has been deleted in namespace-1578911680-2310
E0113 10:35:00.581853   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:531: Successful get pods -l "tier=frontend" {{range.items}}{{(index .spec.containers 0).name}}:{{end}}: php-redis:php-redis:php-redis:
E0113 10:35:00.711076   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
pod "frontend-6gh2c" deleted
pod "frontend-bmtv5" deleted
pod "frontend-c4sgd" deleted
apps.sh:534: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
apps.sh:538: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
replicaset.apps/frontend created
I0113 10:35:01.247280   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911698-31004", Name:"frontend", UID:"bef66dbe-bb18-490b-ac9c-ae03494767ed", APIVersion:"apps/v1", ResourceVersion:"2524", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-4wvnc
I0113 10:35:01.251415   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911698-31004", Name:"frontend", UID:"bef66dbe-bb18-490b-ac9c-ae03494767ed", APIVersion:"apps/v1", ResourceVersion:"2524", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-2hnv4
I0113 10:35:01.255094   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911698-31004", Name:"frontend", UID:"bef66dbe-bb18-490b-ac9c-ae03494767ed", APIVersion:"apps/v1", ResourceVersion:"2524", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-mjlrk
E0113 10:35:01.328467   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:542: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: frontend:
E0113 10:35:01.464814   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
matched Name:
matched Pod Template:
matched Labels:
matched Selector:
matched Replicas:
matched Pods Status:
... skipping 3 lines ...
Namespace:    namespace-1578911698-31004
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 9 lines ...
Events:
  Type    Reason            Age   From                   Message
  ----    ------            ----  ----                   -------
  Normal  SuccessfulCreate  0s    replicaset-controller  Created pod: frontend-4wvnc
  Normal  SuccessfulCreate  0s    replicaset-controller  Created pod: frontend-2hnv4
  Normal  SuccessfulCreate  0s    replicaset-controller  Created pod: frontend-mjlrk
E0113 10:35:01.583108   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:35:01.712177   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:546: Successful describe
Name:         frontend
Namespace:    namespace-1578911698-31004
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 18 lines ...
Namespace:    namespace-1578911698-31004
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 12 lines ...
Namespace:    namespace-1578911698-31004
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 25 lines ...
Namespace:    namespace-1578911698-31004
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 9 lines ...
Events:
  Type    Reason            Age   From                   Message
  ----    ------            ----  ----                   -------
  Normal  SuccessfulCreate  1s    replicaset-controller  Created pod: frontend-4wvnc
  Normal  SuccessfulCreate  1s    replicaset-controller  Created pod: frontend-2hnv4
  Normal  SuccessfulCreate  1s    replicaset-controller  Created pod: frontend-mjlrk
E0113 10:35:02.329952   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful describe
Name:         frontend
Namespace:    namespace-1578911698-31004
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 9 lines ...
Events:
  Type    Reason            Age   From                   Message
  ----    ------            ----  ----                   -------
  Normal  SuccessfulCreate  1s    replicaset-controller  Created pod: frontend-4wvnc
  Normal  SuccessfulCreate  1s    replicaset-controller  Created pod: frontend-2hnv4
  Normal  SuccessfulCreate  1s    replicaset-controller  Created pod: frontend-mjlrk
E0113 10:35:02.466476   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful describe
Name:         frontend
Namespace:    namespace-1578911698-31004
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 3 lines ...
      cpu:     100m
      memory:  100Mi
    Environment:
      GET_HOSTS_FROM:  dns
    Mounts:            <none>
  Volumes:             <none>
E0113 10:35:02.585034   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful describe
Name:         frontend
Namespace:    namespace-1578911698-31004
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 9 lines ...
Events:
  Type    Reason            Age   From                   Message
  ----    ------            ----  ----                   -------
  Normal  SuccessfulCreate  1s    replicaset-controller  Created pod: frontend-4wvnc
  Normal  SuccessfulCreate  1s    replicaset-controller  Created pod: frontend-2hnv4
  Normal  SuccessfulCreate  1s    replicaset-controller  Created pod: frontend-mjlrk
E0113 10:35:02.714126   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
matched Name:
matched Image:
matched Node:
matched Labels:
matched Status:
matched Controlled By
... skipping 85 lines ...
Events:                <none>
apps.sh:564: Successful get rs frontend {{.spec.replicas}}: 3
replicaset.apps/frontend scaled
E0113 10:35:03.043375   54599 replica_set.go:199] ReplicaSet has no controller: &ReplicaSet{ObjectMeta:{frontend  namespace-1578911698-31004 /apis/apps/v1/namespaces/namespace-1578911698-31004/replicasets/frontend bef66dbe-bb18-490b-ac9c-ae03494767ed 2535 2 2020-01-13 10:35:01 +0000 UTC <nil> <nil> map[app:guestbook tier:frontend] map[] [] []  []},Spec:ReplicaSetSpec{Replicas:*2,Selector:&v1.LabelSelector{MatchLabels:map[string]string{app: guestbook,tier: frontend,},MatchExpressions:[]LabelSelectorRequirement{},},Template:{{      0 0001-01-01 00:00:00 +0000 UTC <nil> <nil> map[app:guestbook tier:frontend] map[] [] []  []} {[] [] [{php-redis gcr.io/google_samples/gb-frontend:v3 [] []  [{ 0 80 TCP }] [] [{GET_HOSTS_FROM dns nil}] {map[] map[cpu:{{100 -3} {<nil>} 100m DecimalSI} memory:{{104857600 0} {<nil>} 100Mi BinarySI}]} [] [] nil nil nil nil /dev/termination-log File IfNotPresent nil false false false}] [] Always 0xc00368b7e8 <nil> ClusterFirst map[]   <nil>  false false false <nil> PodSecurityContext{SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,SupplementalGroups:[],FSGroup:nil,RunAsGroup:nil,Sysctls:[]Sysctl{},WindowsOptions:nil,} []   nil default-scheduler [] []  <nil> nil [] <nil> <nil> <nil> map[] []}},MinReadySeconds:0,},Status:ReplicaSetStatus{Replicas:3,FullyLabeledReplicas:3,ObservedGeneration:1,ReadyReplicas:0,AvailableReplicas:0,Conditions:[]ReplicaSetCondition{},},}
I0113 10:35:03.050974   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911698-31004", Name:"frontend", UID:"bef66dbe-bb18-490b-ac9c-ae03494767ed", APIVersion:"apps/v1", ResourceVersion:"2535", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: frontend-mjlrk
apps.sh:568: Successful get rs frontend {{.spec.replicas}}: 2
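
The scale step is a plain kubectl scale call; a sketch matching the 3 to 2 transition asserted above (the scale-1/scale-2/scale-3 deployments created next are scaled the same way, including several resources in one invocation):
  kubectl scale rs/frontend --replicas=2
  kubectl scale deployment/scale-1 deployment/scale-2 --replicas=2
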
E0113 10:35:03.331337   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/scale-1 created
I0113 10:35:03.378709   54599 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578911698-31004", Name:"scale-1", UID:"5466d876-59d3-447e-ae7d-116fdb7a188a", APIVersion:"apps/v1", ResourceVersion:"2541", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set scale-1-5c5565bcd9 to 1
I0113 10:35:03.384081   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911698-31004", Name:"scale-1-5c5565bcd9", UID:"19292bc8-b6fe-4b9e-9783-910334ee4d0e", APIVersion:"apps/v1", ResourceVersion:"2542", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-1-5c5565bcd9-59wqx
E0113 10:35:03.468106   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:35:03.586373   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/scale-2 created
I0113 10:35:03.599972   54599 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578911698-31004", Name:"scale-2", UID:"14cda335-8c7b-47ce-9dab-3c8fc2d4d256", APIVersion:"apps/v1", ResourceVersion:"2551", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set scale-2-5c5565bcd9 to 1
I0113 10:35:03.605041   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911698-31004", Name:"scale-2-5c5565bcd9", UID:"c0f5d162-b8d1-4483-8bd9-cc7dfae72171", APIVersion:"apps/v1", ResourceVersion:"2552", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-2-5c5565bcd9-9rdn5
E0113 10:35:03.715169   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/scale-3 created
I0113 10:35:03.826271   54599 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578911698-31004", Name:"scale-3", UID:"a3562c43-ba2e-429f-a314-74d47d0447bc", APIVersion:"apps/v1", ResourceVersion:"2563", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set scale-3-5c5565bcd9 to 1
I0113 10:35:03.829301   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911698-31004", Name:"scale-3-5c5565bcd9", UID:"7c1ca94a-9178-496c-a284-dfee7cba74d6", APIVersion:"apps/v1", ResourceVersion:"2564", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-3-5c5565bcd9-cnvc7
apps.sh:574: Successful get deploy scale-1 {{.spec.replicas}}: 1
apps.sh:575: Successful get deploy scale-2 {{.spec.replicas}}: 1
apps.sh:576: Successful get deploy scale-3 {{.spec.replicas}}: 1
deployment.apps/scale-1 scaled
I0113 10:35:04.311954   54599 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578911698-31004", Name:"scale-1", UID:"5466d876-59d3-447e-ae7d-116fdb7a188a", APIVersion:"apps/v1", ResourceVersion:"2573", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set scale-1-5c5565bcd9 to 2
deployment.apps/scale-2 scaled
I0113 10:35:04.319217   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911698-31004", Name:"scale-1-5c5565bcd9", UID:"19292bc8-b6fe-4b9e-9783-910334ee4d0e", APIVersion:"apps/v1", ResourceVersion:"2574", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-1-5c5565bcd9-56ph2
I0113 10:35:04.322757   54599 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578911698-31004", Name:"scale-2", UID:"14cda335-8c7b-47ce-9dab-3c8fc2d4d256", APIVersion:"apps/v1", ResourceVersion:"2575", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set scale-2-5c5565bcd9 to 2
I0113 10:35:04.325872   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911698-31004", Name:"scale-2-5c5565bcd9", UID:"c0f5d162-b8d1-4483-8bd9-cc7dfae72171", APIVersion:"apps/v1", ResourceVersion:"2579", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-2-5c5565bcd9-qcjxb
E0113 10:35:04.332428   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:579: Successful get deploy scale-1 {{.spec.replicas}}: 2
E0113 10:35:04.469372   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:580: Successful get deploy scale-2 {{.spec.replicas}}: 2
E0113 10:35:04.587837   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:581: Successful get deploy scale-3 {{.spec.replicas}}: 1
E0113 10:35:04.716318   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/scale-1 scaled
I0113 10:35:04.767066   54599 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578911698-31004", Name:"scale-1", UID:"5466d876-59d3-447e-ae7d-116fdb7a188a", APIVersion:"apps/v1", ResourceVersion:"2592", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set scale-1-5c5565bcd9 to 3
deployment.apps/scale-2 scaled
I0113 10:35:04.770023   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911698-31004", Name:"scale-1-5c5565bcd9", UID:"19292bc8-b6fe-4b9e-9783-910334ee4d0e", APIVersion:"apps/v1", ResourceVersion:"2593", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-1-5c5565bcd9-p26pv
deployment.apps/scale-3 scaled
I0113 10:35:04.773766   54599 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578911698-31004", Name:"scale-2", UID:"14cda335-8c7b-47ce-9dab-3c8fc2d4d256", APIVersion:"apps/v1", ResourceVersion:"2594", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set scale-2-5c5565bcd9 to 3
... skipping 2 lines ...
I0113 10:35:04.787681   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911698-31004", Name:"scale-3-5c5565bcd9", UID:"7c1ca94a-9178-496c-a284-dfee7cba74d6", APIVersion:"apps/v1", ResourceVersion:"2603", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-3-5c5565bcd9-4qhfr
I0113 10:35:04.794891   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911698-31004", Name:"scale-3-5c5565bcd9", UID:"7c1ca94a-9178-496c-a284-dfee7cba74d6", APIVersion:"apps/v1", ResourceVersion:"2603", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-3-5c5565bcd9-xslq7
apps.sh:584: Successful get deploy scale-1 {{.spec.replicas}}: 3
apps.sh:585: Successful get deploy scale-2 {{.spec.replicas}}: 3
apps.sh:586: Successful get deploy scale-3 {{.spec.replicas}}: 3
replicaset.apps "frontend" deleted
E0113 10:35:05.333735   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps "scale-1" deleted
deployment.apps "scale-2" deleted
deployment.apps "scale-3" deleted
E0113 10:35:05.470607   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicaset.apps/frontend created
I0113 10:35:05.580682   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911698-31004", Name:"frontend", UID:"66e2902e-6b01-4197-929b-34f35258bb47", APIVersion:"apps/v1", ResourceVersion:"2652", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-w84c5
I0113 10:35:05.584153   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911698-31004", Name:"frontend", UID:"66e2902e-6b01-4197-929b-34f35258bb47", APIVersion:"apps/v1", ResourceVersion:"2652", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-zv59x
I0113 10:35:05.585050   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911698-31004", Name:"frontend", UID:"66e2902e-6b01-4197-929b-34f35258bb47", APIVersion:"apps/v1", ResourceVersion:"2652", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-nm2bg
E0113 10:35:05.588998   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:594: Successful get rs frontend {{.spec.replicas}}: 3
E0113 10:35:05.717464   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/frontend exposed
apps.sh:598: Successful get service frontend {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: <no value> 80
service/frontend-2 exposed
apps.sh:602: Successful get service frontend-2 {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: default 80
service "frontend" deleted
service "frontend-2" deleted
E0113 10:35:06.335209   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:608: Successful get rs frontend {{.metadata.generation}}: 1
E0113 10:35:06.472129   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicaset.apps/frontend image updated
E0113 10:35:06.590386   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:610: Successful get rs frontend {{.metadata.generation}}: 2
E0113 10:35:06.718939   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicaset.apps/frontend env updated
apps.sh:612: Successful get rs frontend {{.metadata.generation}}: 3
replicaset.apps/frontend resource requirements updated
apps.sh:614: Successful get rs frontend {{.metadata.generation}}: 4
apps.sh:618: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: frontend:
replicaset.apps "frontend" deleted
E0113 10:35:07.336964   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:622: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
E0113 10:35:07.473827   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:626: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
E0113 10:35:07.591667   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:35:07.720517   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicaset.apps/frontend created
I0113 10:35:07.755206   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911698-31004", Name:"frontend", UID:"e0034161-75e6-4e6b-9752-d4e8cd9dd09a", APIVersion:"apps/v1", ResourceVersion:"2691", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-rbw9d
I0113 10:35:07.758498   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911698-31004", Name:"frontend", UID:"e0034161-75e6-4e6b-9752-d4e8cd9dd09a", APIVersion:"apps/v1", ResourceVersion:"2691", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-p8k4z
I0113 10:35:07.761801   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911698-31004", Name:"frontend", UID:"e0034161-75e6-4e6b-9752-d4e8cd9dd09a", APIVersion:"apps/v1", ResourceVersion:"2691", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-sxc4q
replicaset.apps/redis-slave created
I0113 10:35:07.973443   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911698-31004", Name:"redis-slave", UID:"23cc13df-d85a-4bcc-8985-e89692960d15", APIVersion:"apps/v1", ResourceVersion:"2700", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-slave-88456
I0113 10:35:07.979876   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911698-31004", Name:"redis-slave", UID:"23cc13df-d85a-4bcc-8985-e89692960d15", APIVersion:"apps/v1", ResourceVersion:"2700", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-slave-trv8l
apps.sh:631: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: frontend:redis-slave:
apps.sh:635: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: frontend:redis-slave:
replicaset.apps "frontend" deleted
replicaset.apps "redis-slave" deleted
E0113 10:35:08.338282   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:639: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
E0113 10:35:08.475258   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:644: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
E0113 10:35:08.592910   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:35:08.721801   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicaset.apps/frontend created
I0113 10:35:08.779001   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911698-31004", Name:"frontend", UID:"3ea68e7d-811f-42da-943d-b184338e9c16", APIVersion:"apps/v1", ResourceVersion:"2719", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-xr8pr
I0113 10:35:08.784839   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911698-31004", Name:"frontend", UID:"3ea68e7d-811f-42da-943d-b184338e9c16", APIVersion:"apps/v1", ResourceVersion:"2719", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-dk77w
I0113 10:35:08.785320   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911698-31004", Name:"frontend", UID:"3ea68e7d-811f-42da-943d-b184338e9c16", APIVersion:"apps/v1", ResourceVersion:"2719", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-kb7fr
apps.sh:647: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: frontend:
horizontalpodautoscaler.autoscaling/frontend autoscaled
apps.sh:650: Successful get hpa frontend {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 1 2 70
horizontalpodautoscaler.autoscaling "frontend" deleted
horizontalpodautoscaler.autoscaling/frontend autoscaled
E0113 10:35:09.339529   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:654: Successful get hpa frontend {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 2 3 80
E0113 10:35:09.476895   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
horizontalpodautoscaler.autoscaling "frontend" deleted
E0113 10:35:09.594239   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Error: required flag(s) "max" not set
E0113 10:35:09.723124   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicaset.apps "frontend" deleted
+++ exit code: 0
Recording: run_stateful_set_tests
Running command: run_stateful_set_tests

+++ Running case: test-cmd.run_stateful_set_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_stateful_set_tests
+++ [0113 10:35:09] Creating namespace namespace-1578911709-18496
namespace/namespace-1578911709-18496 created
Context "test" modified.
+++ [0113 10:35:10] Testing kubectl(v1:statefulsets)
apps.sh:470: Successful get statefulset {{range.items}}{{.metadata.name}}:{{end}}: 
E0113 10:35:10.340909   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0113 10:35:10.456032   51163 controller.go:606] quota admission added evaluator for: statefulsets.apps
statefulset.apps/nginx created
E0113 10:35:10.478089   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:35:10.595684   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:476: Successful get statefulset nginx {{.spec.replicas}}: 0
E0113 10:35:10.724419   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:477: Successful get statefulset nginx {{.status.observedGeneration}}: 1
statefulset.apps/nginx scaled
I0113 10:35:10.865108   54599 event.go:278] Event(v1.ObjectReference{Kind:"StatefulSet", Namespace:"namespace-1578911709-18496", Name:"nginx", UID:"fa59a371-4bfb-4fe7-b406-d3e5553e207d", APIVersion:"apps/v1", ResourceVersion:"2747", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' create Pod nginx-0 in StatefulSet nginx successful
apps.sh:481: Successful get statefulset nginx {{.spec.replicas}}: 1
apps.sh:482: Successful get statefulset nginx {{.status.observedGeneration}}: 2
E0113 10:35:11.342256   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
statefulset.apps/nginx restarted
E0113 10:35:11.479378   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:490: Successful get statefulset nginx {{.status.observedGeneration}}: 3
E0113 10:35:11.597109   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
statefulset.apps "nginx" deleted
I0113 10:35:11.604093   54599 stateful_set.go:420] StatefulSet has been deleted namespace-1578911709-18496/nginx
E0113 10:35:11.725614   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ exit code: 0
Recording: run_statefulset_history_tests
Running command: run_statefulset_history_tests

+++ Running case: test-cmd.run_statefulset_history_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_statefulset_history_tests
+++ [0113 10:35:11] Creating namespace namespace-1578911711-12715
namespace/namespace-1578911711-12715 created
Context "test" modified.
+++ [0113 10:35:12] Testing kubectl(v1:statefulsets, v1:controllerrevisions)
apps.sh:418: Successful get statefulset {{range.items}}{{.metadata.name}}:{{end}}: 
E0113 10:35:12.343676   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
statefulset.apps/nginx created
E0113 10:35:12.480862   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:422: Successful get controllerrevisions {{range.items}}{{.metadata.annotations}}:{{end}}: map[kubectl.kubernetes.io/last-applied-configuration:{"apiVersion":"apps/v1","kind":"StatefulSet","metadata":{"annotations":{"kubernetes.io/change-cause":"kubectl apply --filename=hack/testdata/rollingupdate-statefulset.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true"},"labels":{"app":"nginx-statefulset"},"name":"nginx","namespace":"namespace-1578911711-12715"},"spec":{"replicas":0,"selector":{"matchLabels":{"app":"nginx-statefulset"}},"serviceName":"nginx","template":{"metadata":{"labels":{"app":"nginx-statefulset"}},"spec":{"containers":[{"command":["sh","-c","while true; do sleep 1; done"],"image":"k8s.gcr.io/nginx-slim:0.7","name":"nginx","ports":[{"containerPort":80,"name":"web"}]}],"terminationGracePeriodSeconds":5}},"updateStrategy":{"type":"RollingUpdate"}}}
 kubernetes.io/change-cause:kubectl apply --filename=hack/testdata/rollingupdate-statefulset.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true]:
E0113 10:35:12.598449   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
statefulset.apps/nginx skipped rollback (current template already matches revision 1)
E0113 10:35:12.727043   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:425: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.7:
apps.sh:426: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
statefulset.apps/nginx configured
apps.sh:429: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.8:
E0113 10:35:13.345014   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:430: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/pause:2.0:
E0113 10:35:13.482094   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:431: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 2
E0113 10:35:13.599917   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:432: Successful get controllerrevisions {{range.items}}{{.metadata.annotations}}:{{end}}: map[kubectl.kubernetes.io/last-applied-configuration:{"apiVersion":"apps/v1","kind":"StatefulSet","metadata":{"annotations":{"kubernetes.io/change-cause":"kubectl apply --filename=hack/testdata/rollingupdate-statefulset.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true"},"labels":{"app":"nginx-statefulset"},"name":"nginx","namespace":"namespace-1578911711-12715"},"spec":{"replicas":0,"selector":{"matchLabels":{"app":"nginx-statefulset"}},"serviceName":"nginx","template":{"metadata":{"labels":{"app":"nginx-statefulset"}},"spec":{"containers":[{"command":["sh","-c","while true; do sleep 1; done"],"image":"k8s.gcr.io/nginx-slim:0.7","name":"nginx","ports":[{"containerPort":80,"name":"web"}]}],"terminationGracePeriodSeconds":5}},"updateStrategy":{"type":"RollingUpdate"}}}
 kubernetes.io/change-cause:kubectl apply --filename=hack/testdata/rollingupdate-statefulset.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true]:map[kubectl.kubernetes.io/last-applied-configuration:{"apiVersion":"apps/v1","kind":"StatefulSet","metadata":{"annotations":{"kubernetes.io/change-cause":"kubectl apply --filename=hack/testdata/rollingupdate-statefulset-rv2.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true"},"labels":{"app":"nginx-statefulset"},"name":"nginx","namespace":"namespace-1578911711-12715"},"spec":{"replicas":0,"selector":{"matchLabels":{"app":"nginx-statefulset"}},"serviceName":"nginx","template":{"metadata":{"labels":{"app":"nginx-statefulset"}},"spec":{"containers":[{"command":["sh","-c","while true; do sleep 1; done"],"image":"k8s.gcr.io/nginx-slim:0.8","name":"nginx","ports":[{"containerPort":80,"name":"web"}]},{"image":"k8s.gcr.io/pause:2.0","name":"pause","ports":[{"containerPort":81,"name":"web-2"}]}],"terminationGracePeriodSeconds":5}},"updateStrategy":{"type":"RollingUpdate"}}}
 kubernetes.io/change-cause:kubectl apply --filename=hack/testdata/rollingupdate-statefulset-rv2.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true]:
E0113 10:35:13.728498   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
statefulset.apps/nginx will roll back to Pod Template:
  Labels:	app=nginx-statefulset
  Containers:
   nginx:
    Image:	k8s.gcr.io/nginx-slim:0.7
    Port:	80/TCP
... skipping 7 lines ...
  Volumes:	<none>
 (dry run)
apps.sh:435: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.8:
apps.sh:436: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/pause:2.0:
apps.sh:437: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 2
statefulset.apps/nginx rolled back
E0113 10:35:14.346493   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:440: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.7:
apps.sh:441: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
E0113 10:35:14.483425   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:error: unable to find specified revision 1000000 in history
has:unable to find specified revision
E0113 10:35:14.601293   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:445: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.7:
E0113 10:35:14.735587   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:446: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
statefulset.apps/nginx rolled back
apps.sh:449: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.8:
apps.sh:450: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/pause:2.0:
apps.sh:451: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 2
E0113 10:35:15.347921   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
statefulset.apps "nginx" deleted
I0113 10:35:15.396392   54599 stateful_set.go:420] StatefulSet has been deleted namespace-1578911711-12715/nginx
E0113 10:35:15.484834   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ exit code: 0
Recording: run_lists_tests
Running command: run_lists_tests
E0113 10:35:15.602585   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource

+++ Running case: test-cmd.run_lists_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_lists_tests
+++ [0113 10:35:15] Creating namespace namespace-1578911715-2201
E0113 10:35:15.736979   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/namespace-1578911715-2201 created
Context "test" modified.
+++ [0113 10:35:15] Testing kubectl(v1:lists)
service/list-service-test created
deployment.apps/list-deployment-test created
I0113 10:35:16.030622   54599 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578911715-2201", Name:"list-deployment-test", UID:"a6142791-159f-4222-a8fd-8d97648a48f0", APIVersion:"apps/v1", ResourceVersion:"2786", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set list-deployment-test-7cd8c5ff6d to 1
... skipping 5 lines ...
Running command: run_multi_resources_tests

+++ Running case: test-cmd.run_multi_resources_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_multi_resources_tests
+++ [0113 10:35:16] Creating namespace namespace-1578911716-12383
E0113 10:35:16.349202   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/namespace-1578911716-12383 created
Context "test" modified.
+++ [0113 10:35:16] Testing kubectl(v1:multiple resources)
Testing with file hack/testdata/multi-resource-yaml.yaml and replace with file hack/testdata/multi-resource-yaml-modify.yaml
E0113 10:35:16.486034   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:63: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: 
E0113 10:35:16.604072   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:64: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
E0113 10:35:16.738192   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/mock created
replicationcontroller/mock created
I0113 10:35:16.905207   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578911716-12383", Name:"mock", UID:"db1cbf08-03ee-4046-8e50-324dd7fdfd14", APIVersion:"v1", ResourceVersion:"2809", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-zgl2g
generic-resources.sh:72: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: mock:
generic-resources.sh:80: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: mock:
NAME           TYPE        CLUSTER-IP   EXTERNAL-IP   PORT(S)   AGE
service/mock   ClusterIP   10.0.0.140   <none>        99/TCP    1s

NAME                         DESIRED   CURRENT   READY   AGE
replicationcontroller/mock   1         1         0       1s
E0113 10:35:17.350400   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Name:              mock
Namespace:         namespace-1578911716-12383
Labels:            app=mock
Annotations:       <none>
Selector:          app=mock
Type:              ClusterIP
... skipping 8 lines ...
Name:         mock
Namespace:    namespace-1578911716-12383
Selector:     app=mock
Labels:       app=mock
Annotations:  <none>
Replicas:     1 current / 1 desired
Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=mock
  Containers:
   mock-container:
    Image:        k8s.gcr.io/pause:2.0
    Port:         9949/TCP
... skipping 2 lines ...
    Mounts:       <none>
  Volumes:        <none>
Events:
  Type    Reason            Age   From                    Message
  ----    ------            ----  ----                    -------
  Normal  SuccessfulCreate  1s    replication-controller  Created pod: mock-zgl2g
E0113 10:35:17.487262   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:35:17.605350   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service "mock" deleted
replicationcontroller "mock" deleted
service/mock replaced
replicationcontroller/mock replaced
I0113 10:35:17.691509   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578911716-12383", Name:"mock", UID:"eb33178c-43e7-4368-9d71-feea039ec003", APIVersion:"v1", ResourceVersion:"2821", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-kcwcj
E0113 10:35:17.739553   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:96: Successful get services mock {{.metadata.labels.status}}: replaced
generic-resources.sh:102: Successful get rc mock {{.metadata.labels.status}}: replaced
service/mock edited
replicationcontroller/mock edited
generic-resources.sh:114: Successful get services mock {{.metadata.labels.status}}: edited
E0113 10:35:18.351642   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:120: Successful get rc mock {{.metadata.labels.status}}: edited
E0113 10:35:18.488561   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/mock labeled
replicationcontroller/mock labeled
E0113 10:35:18.606489   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:134: Successful get services mock {{.metadata.labels.labeled}}: true
E0113 10:35:18.741000   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:140: Successful get rc mock {{.metadata.labels.labeled}}: true
service/mock annotated
replicationcontroller/mock annotated
generic-resources.sh:153: Successful get services mock {{.metadata.annotations.annotated}}: true
generic-resources.sh:159: Successful get rc mock {{.metadata.annotations.annotated}}: true
service "mock" deleted
replicationcontroller "mock" deleted
Testing with file hack/testdata/multi-resource-list.json and replace with file hack/testdata/multi-resource-list-modify.json
generic-resources.sh:63: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: 
E0113 10:35:19.352973   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:64: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
E0113 10:35:19.489574   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:35:19.607675   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/mock created
replicationcontroller/mock created
I0113 10:35:19.617803   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578911716-12383", Name:"mock", UID:"cdb1f0f2-51f3-497a-8ac7-ebf4ef5cf767", APIVersion:"v1", ResourceVersion:"2848", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-md5sf
E0113 10:35:19.742394   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:72: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: mock:
generic-resources.sh:80: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: mock:
NAME           TYPE        CLUSTER-IP   EXTERNAL-IP   PORT(S)   AGE
service/mock   ClusterIP   10.0.0.28    <none>        99/TCP    0s

NAME                         DESIRED   CURRENT   READY   AGE
... skipping 15 lines ...
Name:         mock
Namespace:    namespace-1578911716-12383
Selector:     app=mock
Labels:       app=mock
Annotations:  <none>
Replicas:     1 current / 1 desired
Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=mock
  Containers:
   mock-container:
    Image:        k8s.gcr.io/pause:2.0
    Port:         9949/TCP
... skipping 2 lines ...
    Mounts:       <none>
  Volumes:        <none>
Events:
  Type    Reason            Age   From                    Message
  ----    ------            ----  ----                    -------
  Normal  SuccessfulCreate  1s    replication-controller  Created pod: mock-md5sf
E0113 10:35:20.354321   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service "mock" deleted
replicationcontroller "mock" deleted
service/mock replaced
replicationcontroller/mock replaced
I0113 10:35:20.460342   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578911716-12383", Name:"mock", UID:"cb8fadef-6d57-4147-8839-29b209be9a75", APIVersion:"v1", ResourceVersion:"2865", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-x9ncb
E0113 10:35:20.490686   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:96: Successful get services mock {{.metadata.labels.status}}: replaced
E0113 10:35:20.609000   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:102: Successful get rc mock {{.metadata.labels.status}}: replaced
E0113 10:35:20.743676   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/mock edited
replicationcontroller/mock edited
generic-resources.sh:114: Successful get services mock {{.metadata.labels.status}}: edited
generic-resources.sh:120: Successful get rc mock {{.metadata.labels.status}}: edited
service/mock labeled
replicationcontroller/mock labeled
E0113 10:35:21.355608   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:134: Successful get services mock {{.metadata.labels.labeled}}: true
E0113 10:35:21.491961   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:140: Successful get rc mock {{.metadata.labels.labeled}}: true
E0113 10:35:21.610298   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/mock annotated
replicationcontroller/mock annotated
E0113 10:35:21.746991   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:153: Successful get services mock {{.metadata.annotations.annotated}}: true
generic-resources.sh:159: Successful get rc mock {{.metadata.annotations.annotated}}: true
service "mock" deleted
replicationcontroller "mock" deleted
Testing with file hack/testdata/multi-resource-json.json and replace with file hack/testdata/multi-resource-json-modify.json
generic-resources.sh:63: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: 
generic-resources.sh:64: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
E0113 10:35:22.357229   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/mock created
replicationcontroller/mock created
I0113 10:35:22.478361   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578911716-12383", Name:"mock", UID:"77add741-8584-468f-90af-bd7f2f0b2b6f", APIVersion:"v1", ResourceVersion:"2889", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-z2nbx
E0113 10:35:22.493122   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:72: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: mock:
E0113 10:35:22.611567   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:80: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: mock:
E0113 10:35:22.748352   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
NAME           TYPE        CLUSTER-IP   EXTERNAL-IP   PORT(S)   AGE
service/mock   ClusterIP   10.0.0.44    <none>        99/TCP    0s

NAME                         DESIRED   CURRENT   READY   AGE
replicationcontroller/mock   1         1         0       0s
Name:              mock
... skipping 13 lines ...
Name:         mock
Namespace:    namespace-1578911716-12383
Selector:     app=mock
Labels:       app=mock
Annotations:  <none>
Replicas:     1 current / 1 desired
Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=mock
  Containers:
   mock-container:
    Image:        k8s.gcr.io/pause:2.0
    Port:         9949/TCP
... skipping 7 lines ...
  Normal  SuccessfulCreate  1s    replication-controller  Created pod: mock-z2nbx
service "mock" deleted
replicationcontroller "mock" deleted
service/mock replaced
replicationcontroller/mock replaced
I0113 10:35:23.272631   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578911716-12383", Name:"mock", UID:"ba064eb8-f7a0-441c-a054-ee508eaff989", APIVersion:"v1", ResourceVersion:"2903", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-xjvnw
E0113 10:35:23.358425   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:96: Successful get services mock {{.metadata.labels.status}}: replaced
generic-resources.sh:102: Successful get rc mock {{.metadata.labels.status}}: replaced
E0113 10:35:23.494247   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:35:23.612969   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/mock edited
replicationcontroller/mock edited
E0113 10:35:23.749447   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:114: Successful get services mock {{.metadata.labels.status}}: edited
generic-resources.sh:120: Successful get rc mock {{.metadata.labels.status}}: edited
I0113 10:35:23.989427   54599 horizontal.go:353] Horizontal Pod Autoscaler frontend has been deleted in namespace-1578911698-31004
service/mock labeled
replicationcontroller/mock labeled
generic-resources.sh:134: Successful get services mock {{.metadata.labels.labeled}}: true
generic-resources.sh:140: Successful get rc mock {{.metadata.labels.labeled}}: true
E0113 10:35:24.359705   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/mock annotated
replicationcontroller/mock annotated
E0113 10:35:24.495452   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:153: Successful get services mock {{.metadata.annotations.annotated}}: true
E0113 10:35:24.614281   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:159: Successful get rc mock {{.metadata.annotations.annotated}}: true
service "mock" deleted
replicationcontroller "mock" deleted
Testing with file hack/testdata/multi-resource-rclist.json and replace with file hack/testdata/multi-resource-rclist-modify.json
E0113 10:35:24.750651   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:63: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: 
generic-resources.sh:64: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
replicationcontroller/mock created
replicationcontroller/mock2 created
I0113 10:35:25.166063   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578911716-12383", Name:"mock", UID:"bd99bf67-caa9-401e-b90c-e8b68d9e648e", APIVersion:"v1", ResourceVersion:"2924", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-ttkgv
I0113 10:35:25.173002   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578911716-12383", Name:"mock2", UID:"4da810bb-5650-4eca-8029-bc6e63e29901", APIVersion:"v1", ResourceVersion:"2925", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock2-lxtmd
generic-resources.sh:78: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: mock:mock2:
E0113 10:35:25.361141   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
NAME    DESIRED   CURRENT   READY   AGE
mock    1         1         0       0s
mock2   1         1         0       0s
E0113 10:35:25.496820   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Name:         mock
Namespace:    namespace-1578911716-12383
Selector:     app=mock
Labels:       app=mock
              status=replaced
Annotations:  <none>
Replicas:     1 current / 1 desired
Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=mock
  Containers:
   mock-container:
    Image:        k8s.gcr.io/pause:2.0
    Port:         9949/TCP
... skipping 11 lines ...
Namespace:    namespace-1578911716-12383
Selector:     app=mock2
Labels:       app=mock2
              status=replaced
Annotations:  <none>
Replicas:     1 current / 1 desired
Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=mock2
  Containers:
   mock-container:
    Image:        k8s.gcr.io/pause:2.0
    Port:         9949/TCP
... skipping 2 lines ...
    Mounts:       <none>
  Volumes:        <none>
Events:
  Type    Reason            Age   From                    Message
  ----    ------            ----  ----                    -------
  Normal  SuccessfulCreate  0s    replication-controller  Created pod: mock2-lxtmd
E0113 10:35:25.615709   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:35:25.752103   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller "mock" deleted
replicationcontroller "mock2" deleted
replicationcontroller/mock replaced
I0113 10:35:25.840393   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578911716-12383", Name:"mock", UID:"cbe91eb6-aeb4-42de-822d-9320e42889de", APIVersion:"v1", ResourceVersion:"2942", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-bznx4
replicationcontroller/mock2 replaced
I0113 10:35:25.844954   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578911716-12383", Name:"mock2", UID:"5c7229c7-7cb6-4ae0-80db-12d5ba56ab03", APIVersion:"v1", ResourceVersion:"2944", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock2-v2vwc
generic-resources.sh:102: Successful get rc mock {{.metadata.labels.status}}: replaced
generic-resources.sh:104: Successful get rc mock2 {{.metadata.labels.status}}: replaced
replicationcontroller/mock edited
replicationcontroller/mock2 edited
E0113 10:35:26.362454   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:120: Successful get rc mock {{.metadata.labels.status}}: edited
E0113 10:35:26.498222   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:122: Successful get rc mock2 {{.metadata.labels.status}}: edited
E0113 10:35:26.616951   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/mock labeled
replicationcontroller/mock2 labeled
E0113 10:35:26.753110   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:140: Successful get rc mock {{.metadata.labels.labeled}}: true
generic-resources.sh:142: Successful get rc mock2 {{.metadata.labels.labeled}}: true
replicationcontroller/mock annotated
replicationcontroller/mock2 annotated
generic-resources.sh:159: Successful get rc mock {{.metadata.annotations.annotated}}: true
generic-resources.sh:161: Successful get rc mock2 {{.metadata.annotations.annotated}}: true
replicationcontroller "mock" deleted
replicationcontroller "mock2" deleted
Testing with file hack/testdata/multi-resource-svclist.json and replace with file hack/testdata/multi-resource-svclist-modify.json
E0113 10:35:27.363957   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:63: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: 
E0113 10:35:27.499651   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:64: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
E0113 10:35:27.618271   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/mock created
service/mock2 created
E0113 10:35:27.754496   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:70: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: mock:mock2:
NAME    TYPE        CLUSTER-IP   EXTERNAL-IP   PORT(S)   AGE
mock    ClusterIP   10.0.0.36    <none>        99/TCP    0s
mock2   ClusterIP   10.0.0.133   <none>        99/TCP    0s
Name:              mock
Namespace:         namespace-1578911716-12383
... skipping 20 lines ...
TargetPort:        9949/TCP
Endpoints:         <none>
Session Affinity:  None
Events:            <none>
service "mock" deleted
service "mock2" deleted
E0113 10:35:28.365119   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/mock replaced
service/mock2 replaced
generic-resources.sh:96: Successful get services mock {{.metadata.labels.status}}: replaced
E0113 10:35:28.500877   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:35:28.619646   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:98: Successful get services mock2 {{.metadata.labels.status}}: replaced
E0113 10:35:28.755670   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/mock edited
service/mock2 edited
generic-resources.sh:114: Successful get services mock {{.metadata.labels.status}}: edited
generic-resources.sh:116: Successful get services mock2 {{.metadata.labels.status}}: edited
service/mock labeled
service/mock2 labeled
generic-resources.sh:134: Successful get services mock {{.metadata.labels.labeled}}: true
E0113 10:35:29.366553   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:136: Successful get services mock2 {{.metadata.labels.labeled}}: true
E0113 10:35:29.502129   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/mock annotated
service/mock2 annotated
E0113 10:35:29.620960   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:153: Successful get services mock {{.metadata.annotations.annotated}}: true
E0113 10:35:29.757041   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:155: Successful get services mock2 {{.metadata.annotations.annotated}}: true
(Bservice "mock" deleted
service "mock2" deleted
generic-resources.sh:173: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: 
generic-resources.sh:174: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
E0113 10:35:30.368023   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/mock created
replicationcontroller/mock created
I0113 10:35:30.496672   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578911716-12383", Name:"mock", UID:"0073e86b-9251-4cdf-bb16-f13e7f657b53", APIVersion:"v1", ResourceVersion:"3006", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-4pmvk
E0113 10:35:30.503127   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:35:30.622290   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:180: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: mock:
E0113 10:35:30.758356   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:181: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: mock:
(Bservice "mock" deleted
replicationcontroller "mock" deleted
generic-resources.sh:187: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: 
generic-resources.sh:188: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
+++ exit code: 0
Recording: run_persistent_volumes_tests
Running command: run_persistent_volumes_tests

+++ Running case: test-cmd.run_persistent_volumes_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_persistent_volumes_tests
E0113 10:35:31.369555   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ [0113 10:35:31] Creating namespace namespace-1578911731-17778
namespace/namespace-1578911731-17778 created
E0113 10:35:31.504580   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Context "test" modified.
+++ [0113 10:35:31] Testing persistent volumes
E0113 10:35:31.623680   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
storage.sh:30: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: 
E0113 10:35:31.759636   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
persistentvolume/pv0001 created
storage.sh:33: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: pv0001:
(Bpersistentvolume "pv0001" deleted
E0113 10:35:32.370840   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
persistentvolume/pv0002 created
E0113 10:35:32.505900   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
storage.sh:36: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: pv0002:
E0113 10:35:32.625071   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
persistentvolume "pv0002" deleted
E0113 10:35:32.761055   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
persistentvolume/pv0003 created
storage.sh:39: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: pv0003:
(Bpersistentvolume "pv0003" deleted
storage.sh:42: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: 
E0113 10:35:33.372231   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
persistentvolume/pv0001 created
E0113 10:35:33.426839   54599 pv_protection_controller.go:116] PV pv0001 failed with : Operation cannot be fulfilled on persistentvolumes "pv0001": the object has been modified; please apply your changes to the latest version and try again
E0113 10:35:33.507200   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
storage.sh:45: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: pv0001:
E0113 10:35:33.627514   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:warning: deleting cluster-scoped resources, not scoped to the provided namespace
persistentvolume "pv0001" deleted
has:warning: deleting cluster-scoped resources
Successful
message:warning: deleting cluster-scoped resources, not scoped to the provided namespace
persistentvolume "pv0001" deleted
has:persistentvolume "pv0001" deleted
E0113 10:35:33.762228   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
storage.sh:49: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: 
+++ exit code: 0
Recording: run_persistent_volume_claims_tests
Running command: run_persistent_volume_claims_tests

+++ Running case: test-cmd.run_persistent_volume_claims_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_persistent_volume_claims_tests
+++ [0113 10:35:33] Creating namespace namespace-1578911733-12346
namespace/namespace-1578911733-12346 created
Context "test" modified.
+++ [0113 10:35:34] Testing persistent volumes claims
storage.sh:64: Successful get pvc {{range.items}}{{.metadata.name}}:{{end}}: 
E0113 10:35:34.373533   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
persistentvolumeclaim/myclaim-1 created
I0113 10:35:34.405402   54599 event.go:278] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1578911733-12346", Name:"myclaim-1", UID:"54b82545-6b31-4403-8701-8f36c913faa0", APIVersion:"v1", ResourceVersion:"3045", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
I0113 10:35:34.409530   54599 event.go:278] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1578911733-12346", Name:"myclaim-1", UID:"54b82545-6b31-4403-8701-8f36c913faa0", APIVersion:"v1", ResourceVersion:"3047", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
E0113 10:35:34.508447   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
storage.sh:67: Successful get pvc {{range.items}}{{.metadata.name}}:{{end}}: myclaim-1:
(Bpersistentvolumeclaim "myclaim-1" deleted
I0113 10:35:34.624315   54599 event.go:278] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1578911733-12346", Name:"myclaim-1", UID:"54b82545-6b31-4403-8701-8f36c913faa0", APIVersion:"v1", ResourceVersion:"3049", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
E0113 10:35:34.630087   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:35:34.763512   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
persistentvolumeclaim/myclaim-2 created
I0113 10:35:34.840718   54599 event.go:278] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1578911733-12346", Name:"myclaim-2", UID:"275441db-908e-4c3a-a189-76a683682318", APIVersion:"v1", ResourceVersion:"3052", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
I0113 10:35:34.844785   54599 event.go:278] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1578911733-12346", Name:"myclaim-2", UID:"275441db-908e-4c3a-a189-76a683682318", APIVersion:"v1", ResourceVersion:"3054", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
storage.sh:71: Successful get pvc {{range.items}}{{.metadata.name}}:{{end}}: myclaim-2:
(Bpersistentvolumeclaim "myclaim-2" deleted
I0113 10:35:35.063097   54599 event.go:278] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1578911733-12346", Name:"myclaim-2", UID:"275441db-908e-4c3a-a189-76a683682318", APIVersion:"v1", ResourceVersion:"3056", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
persistentvolumeclaim/myclaim-3 created
I0113 10:35:35.273500   54599 event.go:278] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1578911733-12346", Name:"myclaim-3", UID:"5571cef7-93e6-4fd0-a6a8-4dda7d26b7c4", APIVersion:"v1", ResourceVersion:"3059", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
I0113 10:35:35.276788   54599 event.go:278] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1578911733-12346", Name:"myclaim-3", UID:"5571cef7-93e6-4fd0-a6a8-4dda7d26b7c4", APIVersion:"v1", ResourceVersion:"3060", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
E0113 10:35:35.374844   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
storage.sh:75: Successful get pvc {{range.items}}{{.metadata.name}}:{{end}}: myclaim-3:
(Bpersistentvolumeclaim "myclaim-3" deleted
I0113 10:35:35.489356   54599 event.go:278] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1578911733-12346", Name:"myclaim-3", UID:"5571cef7-93e6-4fd0-a6a8-4dda7d26b7c4", APIVersion:"v1", ResourceVersion:"3063", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
E0113 10:35:35.509649   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
storage.sh:78: Successful get pvc {{range.items}}{{.metadata.name}}:{{end}}: 
+++ exit code: 0
E0113 10:35:35.631403   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Recording: run_storage_class_tests
Running command: run_storage_class_tests

+++ Running case: test-cmd.run_storage_class_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_storage_class_tests
+++ [0113 10:35:35] Testing storage class
E0113 10:35:35.764921   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
storage.sh:92: Successful get storageclass {{range.items}}{{.metadata.name}}:{{end}}: 
storageclass.storage.k8s.io/storage-class-name created
storage.sh:108: Successful get storageclass {{range.items}}{{.metadata.name}}:{{end}}: storage-class-name:
storage.sh:109: Successful get sc {{range.items}}{{.metadata.name}}:{{end}}: storage-class-name:
storageclass.storage.k8s.io "storage-class-name" deleted
E0113 10:35:36.376139   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
storage.sh:112: Successful get storageclass {{range.items}}{{.metadata.name}}:{{end}}: 
+++ exit code: 0
E0113 10:35:36.511023   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Recording: run_nodes_tests
Running command: run_nodes_tests

+++ Running case: test-cmd.run_nodes_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_nodes_tests
+++ [0113 10:35:36] Testing kubectl(v1:nodes)
E0113 10:35:36.632803   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1375: Successful get nodes {{range.items}}{{.metadata.name}}:{{end}}: 127.0.0.1:
E0113 10:35:36.766234   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
matched Name:
matched Labels:
matched CreationTimestamp:
matched Conditions:
matched Addresses:
matched Capacity:
... skipping 182 lines ...
  cpu                0 (0%)    0 (0%)
  memory             0 (0%)    0 (0%)
  ephemeral-storage  0 (0%)    0 (0%)
Events:              <none>
matched Name:
E0113 10:35:37.377513   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
matched Labels:
matched CreationTimestamp:
matched Conditions:
matched Addresses:
matched Capacity:
matched Pods:
... skipping 40 lines ...
  Resource           Requests  Limits
  --------           --------  ------
  cpu                0 (0%)    0 (0%)
  memory             0 (0%)    0 (0%)
  ephemeral-storage  0 (0%)    0 (0%)
Events:              <none>
E0113 10:35:37.512359   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful describe
Name:               127.0.0.1
Roles:              <none>
Labels:             <none>
Annotations:        node.alpha.kubernetes.io/ttl: 0
CreationTimestamp:  Mon, 13 Jan 2020 10:30:47 +0000
... skipping 34 lines ...
  Resource           Requests  Limits
  --------           --------  ------
  cpu                0 (0%)    0 (0%)
  memory             0 (0%)    0 (0%)
  ephemeral-storage  0 (0%)    0 (0%)
Events:              <none>
E0113 10:35:37.633949   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful describe
Name:               127.0.0.1
Roles:              <none>
Labels:             <none>
Annotations:        node.alpha.kubernetes.io/ttl: 0
CreationTimestamp:  Mon, 13 Jan 2020 10:30:47 +0000
... skipping 33 lines ...
  (Total limits may be over 100 percent, i.e., overcommitted.)
  Resource           Requests  Limits
  --------           --------  ------
  cpu                0 (0%)    0 (0%)
  memory             0 (0%)    0 (0%)
  ephemeral-storage  0 (0%)    0 (0%)
E0113 10:35:37.767471   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful describe
Name:               127.0.0.1
Roles:              <none>
Labels:             <none>
Annotations:        node.alpha.kubernetes.io/ttl: 0
CreationTimestamp:  Mon, 13 Jan 2020 10:30:47 +0000
... skipping 39 lines ...
Events:              <none>
core.sh:1395: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
node/127.0.0.1 patched
core.sh:1398: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: true
node/127.0.0.1 patched
core.sh:1401: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
E0113 10:35:38.378749   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
tokenreview.authentication.k8s.io/<unknown> created
tokenreview.authentication.k8s.io/<unknown> created
E0113 10:35:38.513514   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ exit code: 0
Recording: run_authorization_tests
Running command: run_authorization_tests

+++ Running case: test-cmd.run_authorization_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_authorization_tests
+++ [0113 10:35:38] Testing authorization
E0113 10:35:38.635193   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
subjectaccessreview.authorization.k8s.io/<unknown> created
E0113 10:35:38.768866   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
subjectaccessreview.authorization.k8s.io/<unknown> created
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed

  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100  1206  100   904  100   302   441k   147k --:--:-- --:--:-- --:--:--  588k
+++ [0113 10:35:38] "authorization.k8s.io/subjectaccessreviews" returns as expected: {
  "kind": "SubjectAccessReview",
... skipping 58 lines ...
Successful
message:yes
has:yes
Successful
message:yes
has:yes
E0113 10:35:39.380151   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:Warning: the server doesn't have a resource type 'invalid_resource'
yes
has:the server doesn't have a resource type
Successful
message:yes
has:yes
E0113 10:35:39.514801   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:error: --subresource can not be used with NonResourceURL
has:subresource can not be used with NonResourceURL
E0113 10:35:39.636633   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
E0113 10:35:39.770290   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:yes
0
has:0
Successful
message:0
... skipping 6 lines ...
yes
has:Warning: the server doesn't have a resource type 'foo'
Successful
message:Warning: the server doesn't have a resource type 'foo'
yes
has not:Warning: resource 'foo' is not namespace scoped
E0113 10:35:40.381511   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:yes
has not:Warning
Successful
message:Warning: resource 'nodes' is not namespace scoped
yes
has:Warning: resource 'nodes' is not namespace scoped
E0113 10:35:40.516217   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:yes
has not:Warning: resource 'nodes' is not namespace scoped
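The yes/no answers and scope warnings in this authorization block come from kubectl auth can-i probes; the exact commands are not echoed in the log, so the lines below are only an assumed sketch that reproduces the message shapes seen above:

  # assumed examples, not the literal test invocations
  kubectl auth can-i get pods                       # prints "yes" or "no"
  kubectl auth can-i get foo                        # Warning: the server doesn't have a resource type 'foo'
  kubectl auth can-i get nodes --namespace=default  # Warning: resource 'nodes' is not namespace scoped
  kubectl auth can-i get /logs/ --subresource=log   # error: --subresource can not be used with NonResourceURL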
E0113 10:35:40.637888   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
clusterrole.rbac.authorization.k8s.io/testing-CR reconciled
	reconciliation required create
	missing rules added:
		{Verbs:[create delete deletecollection get list patch update watch] APIGroups:[] Resources:[pods] ResourceNames:[] NonResourceURLs:[]}
clusterrolebinding.rbac.authorization.k8s.io/testing-CRB reconciled
	reconciliation required create
... skipping 4 lines ...
	missing subjects added:
		{Kind:Group APIGroup:rbac.authorization.k8s.io Name:system:masters Namespace:}
role.rbac.authorization.k8s.io/testing-R reconciled
	reconciliation required create
	missing rules added:
		{Verbs:[get list watch] APIGroups:[] Resources:[configmaps] ResourceNames:[] NonResourceURLs:[]}
E0113 10:35:40.771563   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
legacy-script.sh:821: Successful get rolebindings -n some-other-random -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}: testing-RB:
legacy-script.sh:822: Successful get roles -n some-other-random -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}: testing-R:
legacy-script.sh:823: Successful get clusterrolebindings -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}: testing-CRB:
legacy-script.sh:824: Successful get clusterroles -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}: testing-CR:
Successful
message:error: only rbac.authorization.k8s.io/v1 is supported: not *v1beta1.ClusterRole
has:only rbac.authorization.k8s.io/v1 is supported
E0113 10:35:41.382972   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
rolebinding.rbac.authorization.k8s.io "testing-RB" deleted
role.rbac.authorization.k8s.io "testing-R" deleted
warning: deleting cluster-scoped resources, not scoped to the provided namespace
clusterrole.rbac.authorization.k8s.io "testing-CR" deleted
E0113 10:35:41.517370   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
clusterrolebinding.rbac.authorization.k8s.io "testing-CRB" deleted
Recording: run_retrieve_multiple_tests
Running command: run_retrieve_multiple_tests

+++ Running case: test-cmd.run_retrieve_multiple_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_retrieve_multiple_tests
E0113 10:35:41.639186   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Context "test" modified.
+++ [0113 10:35:41] Testing kubectl(v1:multiget)
E0113 10:35:41.773023   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
get.sh:242: Successful get nodes/127.0.0.1 service/kubernetes {{range.items}}{{.metadata.name}}:{{end}}: 127.0.0.1:kubernetes:
+++ exit code: 0
Recording: run_resource_aliasing_tests
Running command: run_resource_aliasing_tests

+++ Running case: test-cmd.run_resource_aliasing_tests 
... skipping 3 lines ...
namespace/namespace-1578911741-513 created
Context "test" modified.
+++ [0113 10:35:42] Testing resource aliasing
replicationcontroller/cassandra created
I0113 10:35:42.331486   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578911741-513", Name:"cassandra", UID:"31f2d6a5-2a6a-4844-8aa5-d05f7e0b6eee", APIVersion:"v1", ResourceVersion:"3093", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: cassandra-prl4l
I0113 10:35:42.336540   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578911741-513", Name:"cassandra", UID:"31f2d6a5-2a6a-4844-8aa5-d05f7e0b6eee", APIVersion:"v1", ResourceVersion:"3093", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: cassandra-z6ghn
E0113 10:35:42.384295   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:35:42.518567   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/cassandra created
E0113 10:35:42.640701   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:35:42.774332   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Waiting for Get all -l'app=cassandra' {{range.items}}{{range .metadata.labels}}{{.}}:{{end}}{{end}} : expected: cassandra:cassandra:cassandra:cassandra::, got: cassandra:cassandra:cassandra:cassandra:

discovery.sh:91: FAIL!
Get all -l'app=cassandra' {{range.items}}{{range .metadata.labels}}{{.}}:{{end}}{{end}}
  Expected: cassandra:cassandra:cassandra:cassandra::
  Got:      cassandra:cassandra:cassandra:cassandra:
55 /home/prow/go/src/k8s.io/kubernetes/hack/lib/test.sh
discovery.sh:92: Successful get all -l'app=cassandra' {{range.items}}{{range .metadata.labels}}{{.}}:{{end}}{{end}}: cassandra:cassandra:cassandra:cassandra:
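For context, the failed check at discovery.sh:91 renders every label value of every object matched by the selector through a go-template and compares the joined string; the follow-up check at discovery.sh:92 then passes. The helper's literal invocation is not echoed in this log, so the command below is only an assumed reconstruction of what it evaluates:

  # assumed reconstruction of the discovery.sh:91/92 check (not the literal test code)
  kubectl get all -l'app=cassandra' \
    -o go-template='{{range.items}}{{range .metadata.labels}}{{.}}:{{end}}{{end}}'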
(Bpod "cassandra-prl4l" deleted
I0113 10:35:43.037429   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578911741-513", Name:"cassandra", UID:"31f2d6a5-2a6a-4844-8aa5-d05f7e0b6eee", APIVersion:"v1", ResourceVersion:"3099", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: cassandra-rxbc4
pod "cassandra-z6ghn" deleted
I0113 10:35:43.047426   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578911741-513", Name:"cassandra", UID:"31f2d6a5-2a6a-4844-8aa5-d05f7e0b6eee", APIVersion:"v1", ResourceVersion:"3107", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: cassandra-kps4h
replicationcontroller "cassandra" deleted
E0113 10:35:43.055446   54599 replica_set.go:534] sync "namespace-1578911741-513/cassandra" failed with replicationcontrollers "cassandra" not found
service "cassandra" deleted
+++ exit code: 0
Recording: run_kubectl_explain_tests
Running command: run_kubectl_explain_tests

+++ Running case: test-cmd.run_kubectl_explain_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_kubectl_explain_tests
+++ [0113 10:35:43] Testing kubectl(v1:explain)
E0113 10:35:43.385568   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
KIND:     Pod
VERSION:  v1

DESCRIPTION:
     Pod is a collection of containers that can run on a host. This resource is
     created by clients and scheduled onto hosts.
... skipping 21 lines ...

   status	<Object>
     Most recently observed status of the pod. This data may not be up to date.
     Populated by the system. Read-only. More info:
     https://git.k8s.io/community/contributors/devel/sig-architecture/api-conventions.md#spec-and-status

E0113 10:35:43.520111   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
KIND:     Pod
VERSION:  v1

DESCRIPTION:
     Pod is a collection of containers that can run on a host. This resource is
     created by clients and scheduled onto hosts.
... skipping 21 lines ...

   status	<Object>
     Most recently observed status of the pod. This data may not be up to date.
     Populated by the system. Read-only. More info:
     https://git.k8s.io/community/contributors/devel/sig-architecture/api-conventions.md#spec-and-status

E0113 10:35:43.641917   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:35:43.775896   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
KIND:     Pod
VERSION:  v1

FIELD:    message <string>

DESCRIPTION:
... skipping 44 lines ...
Running command: run_kubectl_sort_by_tests

+++ Running case: test-cmd.run_kubectl_sort_by_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_kubectl_sort_by_tests
+++ [0113 10:35:44] Testing kubectl --sort-by
E0113 10:35:44.386903   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
get.sh:256: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
E0113 10:35:44.521435   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
No resources found in namespace-1578911741-513 namespace.
E0113 10:35:44.643377   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
No resources found in namespace-1578911741-513 namespace.
get.sh:264: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
E0113 10:35:44.777329   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
pod/valid-pod created
get.sh:268: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
Successful
message:NAME        READY   STATUS    RESTARTS   AGE
valid-pod   0/1     Pending   0          1s
has:valid-pod
... skipping 25 lines ...
I0113 10:35:45.265175   86503 round_trippers.go:452]     Date: Mon, 13 Jan 2020 10:35:45 GMT
I0113 10:35:45.265292   86503 request.go:1022] Response Body: {"kind":"Table","apiVersion":"meta.k8s.io/v1","metadata":{"selfLink":"/api/v1/namespaces/namespace-1578911741-513/pods","resourceVersion":"3123"},"columnDefinitions":[{"name":"Name","type":"string","format":"name","description":"Name must be unique within a namespace. Is required when creating resources, although some resources may allow a client to request the generation of an appropriate name automatically. Name is primarily intended for creation idempotence and configuration definition. Cannot be updated. More info: http://kubernetes.io/docs/user-guide/identifiers#names","priority":0},{"name":"Ready","type":"string","format":"","description":"The aggregate readiness state of this pod for accepting traffic.","priority":0},{"name":"Status","type":"string","format":"","description":"The aggregate status of the containers in this pod.","priority":0},{"name":"Restarts","type":"integer","format":"","description":"The number of times the containers in this pod have been restarted.","priority":0},{"name":"Age","ty [truncated 3563 chars]
NAME        READY   STATUS    RESTARTS   AGE
valid-pod   0/1     Pending   0          1s
has:includeObject=Object
get.sh:279: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
E0113 10:35:45.388433   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
pod "valid-pod" force deleted
E0113 10:35:45.522659   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
get.sh:283: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
E0113 10:35:45.644735   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
get.sh:288: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
E0113 10:35:45.778692   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
pod/sorted-pod1 created
get.sh:292: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: sorted-pod1:
pod/sorted-pod2 created
get.sh:296: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: sorted-pod1:sorted-pod2:
E0113 10:35:46.389785   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:35:46.524066   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
pod/sorted-pod3 created
E0113 10:35:46.646072   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
get.sh:300: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: sorted-pod1:sorted-pod2:sorted-pod3:
Successful
message:sorted-pod1:sorted-pod2:sorted-pod3:
has:sorted-pod1:sorted-pod2:sorted-pod3:
E0113 10:35:46.780127   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:sorted-pod3:sorted-pod2:sorted-pod1:
has:sorted-pod3:sorted-pod2:sorted-pod1:
Successful
message:sorted-pod2:sorted-pod1:sorted-pod3:
has:sorted-pod2:sorted-pod1:sorted-pod3:
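The three orderings above exercise kubectl get --sort-by with different JSONPath keys; the keys themselves are not echoed in the log, so the commands below are only an assumed illustration of the flag being tested:

  # assumed examples of the --sort-by flag (sort keys not shown in this log)
  kubectl get pods --sort-by='{.metadata.name}'
  kubectl get pods --sort-by='{.metadata.creationTimestamp}'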
... skipping 18 lines ...
NAME          AGE
sorted-pod2   1s
sorted-pod1   2s
sorted-pod3   1s
has not:Table
get.sh:325: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: sorted-pod1:sorted-pod2:sorted-pod3:
E0113 10:35:47.391031   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
pod "sorted-pod1" force deleted
pod "sorted-pod2" force deleted
pod "sorted-pod3" force deleted
E0113 10:35:47.526238   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
get.sh:329: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
+++ exit code: 0
Recording: run_kubectl_all_namespace_tests
Running command: run_kubectl_all_namespace_tests

+++ Running case: test-cmd.run_kubectl_all_namespace_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
E0113 10:35:47.647352   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ command: run_kubectl_all_namespace_tests
+++ [0113 10:35:47] Testing kubectl --all-namespace
get.sh:342: Successful get namespaces {{range.items}}{{if eq .metadata.name \"default\"}}{{.metadata.name}}:{{end}}{{end}}: default:
E0113 10:35:47.781388   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
get.sh:346: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
pod/valid-pod created
get.sh:350: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
NAMESPACE                  NAME        READY   STATUS    RESTARTS   AGE
namespace-1578911741-513   valid-pod   0/1     Pending   0          0s
E0113 10:35:48.392571   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/all-ns-test-1 created
serviceaccount/test created
E0113 10:35:48.527481   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/all-ns-test-2 created
E0113 10:35:48.648838   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
serviceaccount/test created
E0113 10:35:48.782655   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:NAMESPACE                    NAME      SECRETS   AGE
all-ns-test-1                default   0         0s
all-ns-test-1                test      0         0s
all-ns-test-2                default   0         0s
all-ns-test-2                test      0         0s
... skipping 113 lines ...
namespace-1578911731-17778   default   0         17s
namespace-1578911733-12346   default   0         15s
namespace-1578911741-513     default   0         6s
some-other-random            default   0         8s
has:all-ns-test-2
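The table above comes from a kubectl get across namespaces; a minimal assumed equivalent of the command whose output is being matched:

  # assumed example; lists the "test" ServiceAccount created in all-ns-test-1 and all-ns-test-2
  kubectl get serviceaccounts --all-namespaces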
namespace "all-ns-test-1" deleted
E0113 10:35:49.393867   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:35:49.529046   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:35:49.650096   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:35:49.784274   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:35:50.395283   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:35:50.530194   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:35:50.651296   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:35:50.785682   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:35:51.396730   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:35:51.531384   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:35:51.652652   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:35:51.786989   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:35:52.397971   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:35:52.532761   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:35:52.653842   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:35:52.788546   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:35:53.399199   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:35:53.533933   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:35:53.655122   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:35:53.789950   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace "all-ns-test-2" deleted
E0113 10:35:54.400521   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:35:54.535149   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:35:54.656576   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:35:54.791324   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:35:55.401708   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:35:55.536537   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:35:55.657840   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:35:55.792808   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:35:56.402916   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:35:56.538093   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:35:56.659130   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:35:56.794269   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:35:57.404351   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:35:57.539392   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:35:57.660573   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:35:57.795520   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:35:58.405511   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:35:58.540840   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:35:58.661827   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:35:58.796746   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0113 10:35:59.135108   54599 namespace_controller.go:185] Namespace has been deleted all-ns-test-1
E0113 10:35:59.406810   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
get.sh:376: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
E0113 10:35:59.542108   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
pod "valid-pod" force deleted
E0113 10:35:59.662994   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
get.sh:380: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
E0113 10:35:59.798029   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
get.sh:384: Successful get nodes {{range.items}}{{.metadata.name}}:{{end}}: 127.0.0.1:
Successful
message:NAME        STATUS     ROLES    AGE     VERSION
127.0.0.1   NotReady   <none>   5m12s   
has not:NAMESPACE
+++ exit code: 0
... skipping 5 lines ...
+++ command: run_template_output_tests
+++ [0113 10:36:00] Testing --template support on commands
+++ [0113 10:36:00] Creating namespace namespace-1578911760-22044
namespace/namespace-1578911760-22044 created
Context "test" modified.
template-output.sh:29: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
E0113 10:36:00.408219   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:36:00.543323   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
pod/valid-pod created
E0113 10:36:00.664424   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
{
    "apiVersion": "v1",
    "items": [
        {
            "apiVersion": "v1",
            "kind": "Pod",
... skipping 94 lines ...
    "kind": "List",
    "metadata": {
        "resourceVersion": "",
        "selfLink": ""
    }
}
E0113 10:36:00.799279   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
template-output.sh:35: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
Successful
message:valid-pod:
has:valid-pod:
Successful
message:valid-pod:
has:valid-pod:
Successful
message:valid-pod:
has:valid-pod:
Successful
message:valid-pod:
has:valid-pod:
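These checks exercise template output on kubectl get; a minimal assumed pair of commands that would print the valid-pod: string being matched (the literal test invocations are not echoed here):

  # assumed examples of template output against the valid-pod object
  kubectl get pod valid-pod -o go-template='{{.metadata.name}}:'
  kubectl get pod valid-pod -o jsonpath='{.metadata.name}:'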
E0113 10:36:01.409533   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:36:01.560132   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:valid-pod:
has:valid-pod:
E0113 10:36:01.665628   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:scale-1:
has:scale-1:
E0113 10:36:01.800797   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:redis-slave:
has:redis-slave:
kubectl convert is DEPRECATED and will be removed in a future version.
In order to convert, kubectl apply the object to the cluster, then kubectl get at the desired version.
Successful
... skipping 4 lines ...
message:pi:
has:pi:
Successful
message:127.0.0.1:
has:127.0.0.1:
node/127.0.0.1 untainted
E0113 10:36:02.410743   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/cassandra created
I0113 10:36:02.505272   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578911760-22044", Name:"cassandra", UID:"87f83ec0-327e-4fea-9346-f41e234e0e02", APIVersion:"v1", ResourceVersion:"3179", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: cassandra-ct5x5
I0113 10:36:02.508639   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578911760-22044", Name:"cassandra", UID:"87f83ec0-327e-4fea-9346-f41e234e0e02", APIVersion:"v1", ResourceVersion:"3179", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: cassandra-9rcmr
E0113 10:36:02.561417   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:cassandra:
has:cassandra:
E0113 10:36:02.666940   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
	reconciliation required create
	missing rules added:
		{Verbs:[create delete deletecollection get list patch update watch] APIGroups:[] Resources:[pods] ResourceNames:[] NonResourceURLs:[]}
	reconciliation required create
	missing subjects added:
		{Kind:Group APIGroup:rbac.authorization.k8s.io Name:system:masters Namespace:}
... skipping 3 lines ...
	reconciliation required create
	missing rules added:
		{Verbs:[get list watch] APIGroups:[] Resources:[configmaps] ResourceNames:[] NonResourceURLs:[]}
Successful
message:testing-CR:testing-CRB:testing-RB:testing-R:
has:testing-CR:testing-CRB:testing-RB:testing-R:
E0113 10:36:02.802092   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:myclusterrole:
has:myclusterrole:
Successful
message:foo:
has:foo:
... skipping 3 lines ...
I0113 10:36:03.154913   54599 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578911760-22044", Name:"deploy", UID:"dd23310a-ac86-438a-aab9-8c99040dd2d2", APIVersion:"apps/v1", ResourceVersion:"3188", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set deploy-74bcc58696 to 1
I0113 10:36:03.158964   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911760-22044", Name:"deploy-74bcc58696", UID:"cf3e1b34-4107-4ca8-bdaa-5e44b6241cfb", APIVersion:"apps/v1", ResourceVersion:"3189", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: deploy-74bcc58696-74lxd
Successful
message:deploy:
has:deploy:
cronjob.batch/pi created
E0113 10:36:03.412291   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:foo:
has:foo:
Successful
message:bar:
has:bar:
E0113 10:36:03.562737   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:foo:
has:foo:
E0113 10:36:03.668302   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:myrole:
has:myrole:
E0113 10:36:03.803451   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:foo:
has:foo:
Successful
message:foo:
has:foo:
... skipping 7 lines ...
message:valid-pod:
has:valid-pod:
Successful
message:valid-pod:
has:valid-pod:
I0113 10:36:04.357745   54599 namespace_controller.go:185] Namespace has been deleted all-ns-test-2
E0113 10:36:04.413575   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:valid-pod:
has:valid-pod:
Successful
message:kubernetes:
has:kubernetes:
E0113 10:36:04.563975   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:valid-pod:
has:valid-pod:
E0113 10:36:04.669588   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:foo:
has:foo:
E0113 10:36:04.804716   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:foo:
has:foo:
Successful
message:foo:
has:foo:
... skipping 9 lines ...
Successful
message:foo:
has:foo:
Successful
message:foo:
has:foo:
E0113 10:36:05.414802   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:apiVersion: v1
clusters:
- cluster:
    certificate-authority-data: DATA+OMITTED
    server: https://does-not-work
... skipping 6 lines ...
  name: test
current-context: test
kind: Config
preferences: {}
users: null
has:kind: Config
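
The redacted kubeconfig above is the normal output shape of kubectl config view, which replaces certificate data with DATA+OMITTED unless raw output is requested; for example:

  kubectl config view                         # certificate-authority-data shown as DATA+OMITTED
  kubectl config view --raw                   # include the full certificate data
  kubectl config view -o jsonpath='{.kind}'   # prints: Config
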
E0113 10:36:05.565329   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:deploy:
has:deploy:
E0113 10:36:05.670821   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:deploy:
has:deploy:
E0113 10:36:05.805953   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:deploy:
has:deploy:
Successful
message:deploy:
has:deploy:
... skipping 12 lines ...
I0113 10:36:06.307905   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578911760-22044", Name:"cassandra", UID:"87f83ec0-327e-4fea-9346-f41e234e0e02", APIVersion:"v1", ResourceVersion:"3185", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: cassandra-sfn6c
pod "cassandra-ct5x5" deleted
I0113 10:36:06.316851   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578911760-22044", Name:"cassandra", UID:"87f83ec0-327e-4fea-9346-f41e234e0e02", APIVersion:"v1", ResourceVersion:"3214", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: cassandra-wrmjk
I0113 10:36:06.323061   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911760-22044", Name:"deploy-74bcc58696", UID:"cf3e1b34-4107-4ca8-bdaa-5e44b6241cfb", APIVersion:"apps/v1", ResourceVersion:"3195", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: deploy-74bcc58696-k62st
pod "deploy-74bcc58696-74lxd" deleted
pod "valid-pod" deleted
E0113 10:36:06.416374   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller "cassandra" deleted
clusterrole.rbac.authorization.k8s.io "myclusterrole" deleted
E0113 10:36:06.566542   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
clusterrolebinding.rbac.authorization.k8s.io "foo" deleted
E0113 10:36:06.672165   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps "deploy" deleted
+++ exit code: 0
E0113 10:36:06.807299   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Recording: run_certificates_tests
Running command: run_certificates_tests

+++ Running case: test-cmd.run_certificates_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_certificates_tests
... skipping 41 lines ...
    "kind": "List",
    "metadata": {
        "resourceVersion": "",
        "selfLink": ""
    }
}
E0113 10:36:07.417599   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificate.sh:32: Successful get csr/foo {{range.status.conditions}}{{.type}}{{end}}: Approved
E0113 10:36:07.568068   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificatesigningrequest.certificates.k8s.io "foo" deleted
E0113 10:36:07.673458   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificate.sh:34: Successful get csr {{range.items}}{{.metadata.name}}{{end}}: 
E0113 10:36:07.808930   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificatesigningrequest.certificates.k8s.io/foo created
certificate.sh:37: Successful get csr/foo {{range.status.conditions}}{{.type}}{{end}}: 
certificatesigningrequest.certificates.k8s.io/foo approved
{
    "apiVersion": "v1",
    "items": [
... skipping 53 lines ...
    "metadata": {
        "resourceVersion": "",
        "selfLink": ""
    }
}
certificate.sh:40: Successful get csr/foo {{range.status.conditions}}{{.type}}{{end}}: Approved
E0113 10:36:08.422741   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificatesigningrequest.certificates.k8s.io "foo" deleted
E0113 10:36:08.569141   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificate.sh:42: Successful get csr {{range.items}}{{.metadata.name}}{{end}}: 
E0113 10:36:08.674538   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificatesigningrequest.certificates.k8s.io/foo created
E0113 10:36:08.810220   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificate.sh:46: Successful get csr/foo {{range.status.conditions}}{{.type}}{{end}}: 
certificatesigningrequest.certificates.k8s.io/foo denied
{
    "apiVersion": "v1",
    "items": [
        {
... skipping 54 lines ...
        "selfLink": ""
    }
}
certificate.sh:49: Successful get csr/foo {{range.status.conditions}}{{.type}}{{end}}: Denied
certificatesigningrequest.certificates.k8s.io "foo" deleted
certificate.sh:51: Successful get csr {{range.items}}{{.metadata.name}}{{end}}: 
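
The approve/deny cycle above maps onto the kubectl certificate subcommands; roughly (the CSR name and the go-template are taken from the certificate.sh assertions above):

  # after creating a CSR named foo:
  kubectl certificate approve foo
  kubectl get csr foo -o go-template='{{range .status.conditions}}{{.type}}{{end}}'   # Approved
  kubectl delete csr foo
  # recreate the CSR, then:
  kubectl certificate deny foo
  kubectl get csr foo -o go-template='{{range .status.conditions}}{{.type}}{{end}}'   # Denied
  kubectl delete csr foo
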
E0113 10:36:09.423954   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:36:09.570410   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificatesigningrequest.certificates.k8s.io/foo created
E0113 10:36:09.675716   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificate.sh:54: Successful get csr/foo {{range.status.conditions}}{{.type}}{{end}}: 
E0113 10:36:09.811201   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificatesigningrequest.certificates.k8s.io/foo denied
{
    "apiVersion": "v1",
    "items": [
        {
            "apiVersion": "certificates.k8s.io/v1beta1",
... skipping 61 lines ...
Running command: run_cluster_management_tests

+++ Running case: test-cmd.run_cluster_management_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_cluster_management_tests
+++ [0113 10:36:10] Testing cluster-management commands
E0113 10:36:10.425277   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node-management.sh:27: Successful get nodes {{range.items}}{{.metadata.name}}:{{end}}: 127.0.0.1:
E0113 10:36:10.571681   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:36:10.677430   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
pod/test-pod-1 created
E0113 10:36:10.812509   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
pod/test-pod-2 created
node-management.sh:76: Successful get nodes 127.0.0.1 {{range .spec.taints}}{{if eq .key \"dedicated\"}}{{.key}}={{.value}}:{{.effect}}{{end}}{{end}}: 
node/127.0.0.1 tainted
node-management.sh:79: Successful get nodes 127.0.0.1 {{range .spec.taints}}{{if eq .key \"dedicated\"}}{{.key}}={{.value}}:{{.effect}}{{end}}{{end}}: dedicated=foo:PreferNoSchedule
node/127.0.0.1 untainted
E0113 10:36:11.426476   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node-management.sh:83: Successful get nodes 127.0.0.1 {{range .spec.taints}}{{if eq .key \"dedicated\"}}{{.key}}={{.value}}:{{.effect}}{{end}}{{end}}: 
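
The tainted/untainted pair above corresponds to adding and removing a node taint, with the key, value and effect shown in the node-management.sh checks:

  kubectl taint nodes 127.0.0.1 dedicated=foo:PreferNoSchedule   # node/127.0.0.1 tainted
  kubectl taint nodes 127.0.0.1 dedicated-                       # remove all taints with key "dedicated"
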
E0113 10:36:11.573022   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node-management.sh:87: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
E0113 10:36:11.678770   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node/127.0.0.1 cordoned (dry run)
E0113 10:36:11.813826   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node-management.sh:89: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
node-management.sh:93: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
node/127.0.0.1 cordoned (dry run)
node/127.0.0.1 drained (dry run)
node-management.sh:96: Successful get nodes {{range.items}}{{.metadata.name}}:{{end}}: 127.0.0.1:
node-management.sh:97: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
E0113 10:36:12.431051   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node-management.sh:101: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
E0113 10:36:12.574182   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node-management.sh:103: Successful get pods {{range .items}}{{.metadata.name}},{{end}}: test-pod-1,test-pod-2,
node/127.0.0.1 cordoned
E0113 10:36:12.680092   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node/127.0.0.1 drained
node-management.sh:106: Successful get pods/test-pod-2 {{.metadata.name}}: test-pod-2
E0113 10:36:12.815028   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
pod "test-pod-2" deleted
node/127.0.0.1 uncordoned
node-management.sh:111: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
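
The cordon/drain/uncordon sequence above is the standard node-management flow; an illustrative form (the drain flags here are typical choices, not necessarily the exact ones node-management.sh passes):

  kubectl cordon 127.0.0.1                               # sets .spec.unschedulable=true
  kubectl drain 127.0.0.1 --ignore-daemonsets --force    # evicts/deletes pods such as test-pod-2
  kubectl uncordon 127.0.0.1                             # clears .spec.unschedulable again
  kubectl get node 127.0.0.1 -o jsonpath='{.spec.unschedulable}'
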
node-management.sh:115: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
Successful
message:node/127.0.0.1 already uncordoned (dry run)
has:already uncordoned
node-management.sh:119: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
E0113 10:36:13.432348   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node/127.0.0.1 labeled
E0113 10:36:13.575315   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node-management.sh:124: Successful get nodes 127.0.0.1 {{.metadata.labels.test}}: label
E0113 10:36:13.681398   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:error: cannot specify both a node name and a --selector option
See 'kubectl drain -h' for help and examples
has:cannot specify both a node name
Successful
message:error: USAGE: cordon NODE [flags]
See 'kubectl cordon -h' for help and examples
has:error\: USAGE\: cordon NODE
E0113 10:36:13.816418   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node/127.0.0.1 already uncordoned
Successful
message:error: You must provide one or more resources by argument or filename.
Example resource specifications include:
   '-f rsrc.yaml'
   '--filename=rsrc.json'
   '<resource> <name>'
   '<resource>'
has:must provide one or more resources
... skipping 9 lines ...
Running command: run_plugins_tests

+++ Running case: test-cmd.run_plugins_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_plugins_tests
+++ [0113 10:36:14] Testing kubectl plugins
E0113 10:36:14.433739   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:The following compatible plugins are available:

test/fixtures/pkg/kubectl/plugins/version/kubectl-version
  - warning: kubectl-version overwrites existing command: "kubectl version"

error: one plugin warning was found
has:kubectl-version overwrites existing command: "kubectl version"
Successful
message:The following compatible plugins are available:

test/fixtures/pkg/kubectl/plugins/kubectl-foo
test/fixtures/pkg/kubectl/plugins/foo/kubectl-foo
  - warning: test/fixtures/pkg/kubectl/plugins/foo/kubectl-foo is overshadowed by a similarly named plugin: test/fixtures/pkg/kubectl/plugins/kubectl-foo

error: one plugin warning was found
has:test/fixtures/pkg/kubectl/plugins/foo/kubectl-foo is overshadowed by a similarly named plugin
E0113 10:36:14.576750   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:The following compatible plugins are available:

test/fixtures/pkg/kubectl/plugins/kubectl-foo
has:plugins are available
E0113 10:36:14.682808   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:Unable read directory "test/fixtures/pkg/kubectl/plugins/empty" from your PATH: open test/fixtures/pkg/kubectl/plugins/empty: no such file or directory. Skipping...
error: unable to find any kubectl plugins in your PATH
has:unable to find any kubectl plugins in your PATH
Successful
message:I am plugin foo
has:plugin foo
E0113 10:36:14.817798   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:I am plugin bar called with args test/fixtures/pkg/kubectl/plugins/bar/kubectl-bar arg1
has:test/fixtures/pkg/kubectl/plugins/bar/kubectl-bar arg1
Successful
message:Client Version: version.Info{Major:"1", Minor:"18+", GitVersion:"v1.18.0-alpha.1.634+36e40fb8502930", GitCommit:"36e40fb850293076b415ae3d376f5f81dc897105", GitTreeState:"clean", BuildDate:"2020-01-12T23:07:37Z", GoVersion:"go1.13.5", Compiler:"gc", Platform:"linux/amd64"}
has:Client Version
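
kubectl discovers plugins as executables named kubectl-<name> anywhere on PATH, which is what the warnings above about overwriting built-in commands and shadowed plugins are exercising; a minimal sketch (the install directory is arbitrary and may need elevated permissions):

  cat <<'EOF' > /usr/local/bin/kubectl-foo
  #!/bin/sh
  echo "I am plugin foo"
  EOF
  chmod +x /usr/local/bin/kubectl-foo
  kubectl plugin list    # lists compatible plugins, warning about overwrites and shadowing
  kubectl foo            # dispatches to kubectl-foo
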
... skipping 6 lines ...

+++ Running case: test-cmd.run_impersonation_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_impersonation_tests
+++ [0113 10:36:15] Testing impersonation
Successful
message:error: requesting groups or user-extra for  without impersonating a user
has:without impersonating a user
certificatesigningrequest.certificates.k8s.io/foo created
E0113 10:36:15.434923   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
authorization.sh:68: Successful get csr/foo {{.spec.username}}: user1
E0113 10:36:15.577961   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
authorization.sh:69: Successful get csr/foo {{range .spec.groups}}{{.}}{{end}}: system:authenticated
E0113 10:36:15.684262   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificatesigningrequest.certificates.k8s.io "foo" deleted
E0113 10:36:15.818945   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificatesigningrequest.certificates.k8s.io/foo created
authorization.sh:74: Successful get csr/foo {{len .spec.groups}}: 3
authorization.sh:75: Successful get csr/foo {{range .spec.groups}}{{.}} {{end}}: group2 group1 ,,,chameleon 
(Bcertificatesigningrequest.certificates.k8s.io "foo" deleted
+++ exit code: 0
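
The impersonation checks above rely on the client-side --as and --as-group flags; schematically (csr.yaml stands in for whatever manifest authorization.sh applies, and the group names are the ones visible in the output above):

  kubectl create -f csr.yaml --as=user1
  kubectl get csr foo -o jsonpath='{.spec.username}'                        # user1
  kubectl get csr foo -o go-template='{{range .spec.groups}}{{.}}{{end}}'   # system:authenticated
  kubectl delete csr foo
  kubectl create -f csr.yaml --as=user1 --as-group=group1 --as-group=group2 --as-group=,,,chameleon
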
Recording: run_wait_tests
Running command: run_wait_tests

+++ Running case: test-cmd.run_wait_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_wait_tests
+++ [0113 10:36:16] Testing kubectl wait
+++ [0113 10:36:16] Creating namespace namespace-1578911776-15052
E0113 10:36:16.436365   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/namespace-1578911776-15052 created
Context "test" modified.
E0113 10:36:16.579196   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/test-1 created
I0113 10:36:16.656421   54599 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578911776-15052", Name:"test-1", UID:"71355f45-b537-412e-b905-0c7c25f9dca7", APIVersion:"apps/v1", ResourceVersion:"3283", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set test-1-6d98955cc9 to 1
I0113 10:36:16.671638   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911776-15052", Name:"test-1-6d98955cc9", UID:"3071109d-71dd-4bb9-b7f7-a0cd7c10b4c1", APIVersion:"apps/v1", ResourceVersion:"3284", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test-1-6d98955cc9-kqnfk
E0113 10:36:16.685257   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/test-2 created
I0113 10:36:16.755557   54599 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578911776-15052", Name:"test-2", UID:"d84f801b-ddbf-4939-97c7-41246bcca71b", APIVersion:"apps/v1", ResourceVersion:"3293", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set test-2-65897ff84d to 1
I0113 10:36:16.760043   54599 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578911776-15052", Name:"test-2-65897ff84d", UID:"2ff6afa2-7754-44fe-9409-decf5f141a90", APIVersion:"apps/v1", ResourceVersion:"3294", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test-2-65897ff84d-dxx2d
E0113 10:36:16.820255   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
wait.sh:36: Successful get deployments {{range .items}}{{.metadata.name}},{{end}}: test-1,test-2,
E0113 10:36:17.437692   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:36:17.580477   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:36:17.686621   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:36:17.821632   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:36:18.438993   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:36:18.581676   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:36:18.689293   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 10:36:18.823041   54599 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps "test-1" deleted
deployment.apps "test-2" deleted
Successful
message:deployment.apps/test-1 condition met
deployment.apps/test-2 condition met
has:test-1 condition met
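
The "condition met" messages come from kubectl wait, which blocks until the requested condition holds on each matched object; for example (the condition name and timeout here are illustrative, not read from wait.sh):

  kubectl wait deployment/test-1 --for=condition=Available --timeout=60s
  kubectl wait deployment/test-2 --for=condition=Available --timeout=60s
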
... skipping 25 lines ...
I0113 10:36:19.282119   51163 crdregistration_controller.go:142] Shutting down crd-autoregister controller
I0113 10:36:19.282135   51163 available_controller.go:398] Shutting down AvailableConditionController
I0113 10:36:19.282162   51163 autoregister_controller.go:164] Shutting down autoregister controller
... skipping 126 lines of repeated clientconn.go blockingPicker / grpc connection-refused messages for 127.0.0.1:2379 ...
junit report dir: /logs/artifacts
+++ [0113 10:36:19] Clean up complete
+ make test-integration
... skipping 46 lines of repeated grpc connection-refused messages for 127.0.0.1:2379 ...
W0113 10:36:20.286481   51163 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0113 10:36:20.286510   51163 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0113 10:36:20.286513   51163 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0113 10:36:20.286526   51163 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0113 10:36:20.286561   51163 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0113 10:36:20.286566   51163 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0113 10:36:20.286566   51163 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0113 10:36:20.286570   51163 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0113 10:36:20.286605   51163 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0113 10:36:20.286610   51163 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0113 10:36:20.286621   51163 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0113 10:36:20.286631   51163 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0113 10:36:20.286654   51163 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0113 10:36:20.286868   51163 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0113 10:36:20.286883   51163 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0113 10:36:20.286945   51163 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0113 10:36:20.286950   51163 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0113 10:36:20.286952   51163 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0113 10:36:20.286976   51163 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0113 10:36:20.287155   51163 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0113 10:36:21.568434   51163 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0113 10:36:21.573304   51163 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0113 10:36:21.588183   51163 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0113 10:36:21.588183   51163 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0113 10:36:21.588413   51163 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0113 10:36:21.592186   51163 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0113 10:36:21.605531   51163 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0113 10:36:21.606136   51163 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0113 10:36:21.607833   51163 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0113 10:36:21.611710   51163 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0113 10:36:21.621984   51163 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0113 10:36:21.651502   51163 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0113 10:36:21.651504   51163 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0113 10:36:21.659525   51163 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0113 10:36:21.675020   51163 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0113 10:36:21.676223   51163 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0113 10:36:21.730011   51163 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0113 10:36:21.731148   51163 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...