Result: FAILURE
Tests: 1 failed / 2610 succeeded
Started: 2020-01-13 19:30
Elapsed: 28m35s
Revision: master
resultstore: https://source.cloud.google.com/results/invocations/d3b37898-7835-4243-8b05-50ce45a20a74/targets/test

Test Failures


k8s.io/kubernetes/test/integration/client TestDynamicClient 8.33s

go test -v k8s.io/kubernetes/test/integration/client -run TestDynamicClient$
=== RUN   TestDynamicClient
I0113 19:50:03.088236  106636 controller.go:123] Shutting down OpenAPI controller
I0113 19:50:03.088257  106636 crdregistration_controller.go:142] Shutting down crd-autoregister controller
I0113 19:50:03.088285  106636 cluster_authentication_trust_controller.go:463] Shutting down cluster_authentication_trust_controller controller
I0113 19:50:03.088301  106636 autoregister_controller.go:164] Shutting down autoregister controller
I0113 19:50:03.088317  106636 naming_controller.go:300] Shutting down NamingConditionController
I0113 19:50:03.088330  106636 nonstructuralschema_controller.go:197] Shutting down NonStructuralSchemaConditionController
I0113 19:50:03.088345  106636 customresource_discovery_controller.go:220] Shutting down DiscoveryController
I0113 19:50:03.088367  106636 apiapproval_controller.go:196] Shutting down KubernetesAPIApprovalPolicyConformantConditionController
I0113 19:50:03.088381  106636 establishing_controller.go:85] Shutting down EstablishingController
I0113 19:50:03.088394  106636 apiservice_controller.go:106] Shutting down APIServiceRegistrationController
I0113 19:50:03.088685  106636 controller.go:87] Shutting down OpenAPI AggregationController
I0113 19:50:03.088860  106636 dynamic_cafile_content.go:181] Shutting down request-header::/tmp/kubernetes-kube-apiserver874119105/proxy-ca.crt
I0113 19:50:03.088901  106636 dynamic_cafile_content.go:181] Shutting down client-ca-bundle::/tmp/kubernetes-kube-apiserver874119105/client-ca.crt
I0113 19:50:03.088994  106636 secure_serving.go:222] Stopped listening on 127.0.0.1:37887
I0113 19:50:03.089012  106636 tlsconfig.go:256] Shutting down DynamicServingCertificateController
I0113 19:50:03.089033  106636 dynamic_serving_content.go:144] Shutting down serving-cert::/tmp/kubernetes-kube-apiserver874119105/apiserver.crt::/tmp/kubernetes-kube-apiserver874119105/apiserver.key
I0113 19:50:03.089064  106636 dynamic_cafile_content.go:181] Shutting down client-ca-bundle::/tmp/kubernetes-kube-apiserver874119105/client-ca.crt
I0113 19:50:03.088410  106636 available_controller.go:398] Shutting down AvailableConditionController
I0113 19:50:03.088417  106636 crd_finalizer.go:276] Shutting down CRDFinalizer
E0113 19:50:03.963295  106636 controller.go:183] an error on the server ("") has prevented the request from succeeding (get endpoints kubernetes)
I0113 19:50:04.602664  106636 serving.go:307] Generated self-signed cert (/tmp/kubernetes-kube-apiserver931190316/apiserver.crt, /tmp/kubernetes-kube-apiserver931190316/apiserver.key)
I0113 19:50:04.602709  106636 server.go:596] external host was not specified, using 127.0.0.1
W0113 19:50:04.602720  106636 authentication.go:439] AnonymousAuth is not allowed with the AlwaysAllow authorizer. Resetting AnonymousAuth to false. You should use a different authorizer
W0113 19:50:05.100880  106636 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0113 19:50:05.100915  106636 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0113 19:50:05.100929  106636 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0113 19:50:05.101125  106636 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0113 19:50:05.102314  106636 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0113 19:50:05.102446  106636 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0113 19:50:05.102521  106636 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0113 19:50:05.102622  106636 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0113 19:50:05.102963  106636 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0113 19:50:05.103267  106636 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0113 19:50:05.103409  106636 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0113 19:50:05.103568  106636 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
I0113 19:50:05.103652  106636 plugins.go:158] Loaded 9 mutating admission controller(s) successfully in the following order: NamespaceLifecycle,LimitRanger,TaintNodesByCondition,Priority,DefaultTolerationSeconds,DefaultStorageClass,StorageObjectInUseProtection,MutatingAdmissionWebhook,RuntimeClass.
I0113 19:50:05.103727  106636 plugins.go:161] Loaded 6 validating admission controller(s) successfully in the following order: LimitRanger,Priority,PersistentVolumeClaimResize,ValidatingAdmissionWebhook,RuntimeClass,ResourceQuota.
I0113 19:50:05.105109  106636 plugins.go:158] Loaded 9 mutating admission controller(s) successfully in the following order: NamespaceLifecycle,LimitRanger,TaintNodesByCondition,Priority,DefaultTolerationSeconds,DefaultStorageClass,StorageObjectInUseProtection,MutatingAdmissionWebhook,RuntimeClass.
I0113 19:50:05.105230  106636 plugins.go:161] Loaded 6 validating admission controller(s) successfully in the following order: LimitRanger,Priority,PersistentVolumeClaimResize,ValidatingAdmissionWebhook,RuntimeClass,ResourceQuota.
I0113 19:50:05.107057  106636 client.go:361] parsed scheme: "endpoint"
I0113 19:50:05.107211  106636 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 19:50:05.108676  106636 client.go:361] parsed scheme: "endpoint"
I0113 19:50:05.108710  106636 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
W0113 19:50:05.146416  106636 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
I0113 19:50:05.147727  106636 master.go:264] Using reconciler: lease
I0113 19:50:05.148007  106636 client.go:361] parsed scheme: "endpoint"
I0113 19:50:05.148045  106636 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 19:50:05.150507  106636 client.go:361] parsed scheme: "endpoint"
I0113 19:50:05.150548  106636 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 19:50:05.151954  106636 client.go:361] parsed scheme: "endpoint"
I0113 19:50:05.152163  106636 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 19:50:05.153308  106636 client.go:361] parsed scheme: "endpoint"
I0113 19:50:05.153348  106636 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 19:50:05.155520  106636 client.go:361] parsed scheme: "endpoint"
I0113 19:50:05.155555  106636 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 19:50:05.157191  106636 client.go:361] parsed scheme: "endpoint"
I0113 19:50:05.157223  106636 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 19:50:05.159854  106636 client.go:361] parsed scheme: "endpoint"
I0113 19:50:05.159899  106636 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 19:50:05.161868  106636 client.go:361] parsed scheme: "endpoint"
I0113 19:50:05.161910  106636 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 19:50:05.163667  106636 client.go:361] parsed scheme: "endpoint"
I0113 19:50:05.163743  106636 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 19:50:05.165748  106636 client.go:361] parsed scheme: "endpoint"
I0113 19:50:05.165788  106636 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 19:50:05.166741  106636 client.go:361] parsed scheme: "endpoint"
I0113 19:50:05.166773  106636 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 19:50:05.170379  106636 client.go:361] parsed scheme: "endpoint"
I0113 19:50:05.170552  106636 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 19:50:05.172372  106636 client.go:361] parsed scheme: "endpoint"
I0113 19:50:05.172411  106636 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 19:50:05.173556  106636 client.go:361] parsed scheme: "endpoint"
I0113 19:50:05.173593  106636 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 19:50:05.177976  106636 client.go:361] parsed scheme: "endpoint"
I0113 19:50:05.178014  106636 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 19:50:05.179200  106636 client.go:361] parsed scheme: "endpoint"
I0113 19:50:05.179242  106636 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 19:50:05.182000  106636 client.go:361] parsed scheme: "endpoint"
I0113 19:50:05.182233  106636 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 19:50:05.183466  106636 client.go:361] parsed scheme: "endpoint"
I0113 19:50:05.183618  106636 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 19:50:05.184824  106636 rest.go:113] the default service ipfamily for this cluster is: IPv4
I0113 19:50:05.413191  106636 client.go:361] parsed scheme: "endpoint"
I0113 19:50:05.413246  106636 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 19:50:05.414851  106636 client.go:361] parsed scheme: "endpoint"
I0113 19:50:05.414894  106636 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 19:50:05.416433  106636 client.go:361] parsed scheme: "endpoint"
I0113 19:50:05.416463  106636 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 19:50:05.417758  106636 client.go:361] parsed scheme: "endpoint"
I0113 19:50:05.417807  106636 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 19:50:05.419462  106636 client.go:361] parsed scheme: "endpoint"
I0113 19:50:05.419493  106636 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 19:50:05.421430  106636 client.go:361] parsed scheme: "endpoint"
I0113 19:50:05.421471  106636 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 19:50:05.424401  106636 client.go:361] parsed scheme: "endpoint"
I0113 19:50:05.424439  106636 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 19:50:05.425728  106636 client.go:361] parsed scheme: "endpoint"
I0113 19:50:05.425762  106636 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 19:50:05.428573  106636 client.go:361] parsed scheme: "endpoint"
I0113 19:50:05.428610  106636 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 19:50:05.431304  106636 client.go:361] parsed scheme: "endpoint"
I0113 19:50:05.431347  106636 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 19:50:05.440008  106636 client.go:361] parsed scheme: "endpoint"
I0113 19:50:05.440066  106636 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 19:50:05.441681  106636 client.go:361] parsed scheme: "endpoint"
I0113 19:50:05.441722  106636 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 19:50:05.444632  106636 client.go:361] parsed scheme: "endpoint"
I0113 19:50:05.444673  106636 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 19:50:05.445911  106636 client.go:361] parsed scheme: "endpoint"
I0113 19:50:05.445946  106636 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 19:50:05.449497  106636 client.go:361] parsed scheme: "endpoint"
I0113 19:50:05.449540  106636 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 19:50:05.450871  106636 client.go:361] parsed scheme: "endpoint"
I0113 19:50:05.451090  106636 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 19:50:05.452105  106636 client.go:361] parsed scheme: "endpoint"
I0113 19:50:05.452131  106636 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 19:50:05.454731  106636 client.go:361] parsed scheme: "endpoint"
I0113 19:50:05.454768  106636 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 19:50:05.457479  106636 client.go:361] parsed scheme: "endpoint"
I0113 19:50:05.457515  106636 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 19:50:05.459730  106636 client.go:361] parsed scheme: "endpoint"
I0113 19:50:05.459770  106636 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 19:50:05.461824  106636 client.go:361] parsed scheme: "endpoint"
I0113 19:50:05.462028  106636 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 19:50:05.463424  106636 client.go:361] parsed scheme: "endpoint"
I0113 19:50:05.463558  106636 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 19:50:05.464856  106636 client.go:361] parsed scheme: "endpoint"
I0113 19:50:05.464887  106636 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 19:50:05.466922  106636 client.go:361] parsed scheme: "endpoint"
I0113 19:50:05.467057  106636 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 19:50:05.468018  106636 client.go:361] parsed scheme: "endpoint"
I0113 19:50:05.468148  106636 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 19:50:05.472154  106636 client.go:361] parsed scheme: "endpoint"
I0113 19:50:05.472196  106636 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 19:50:05.474071  106636 client.go:361] parsed scheme: "endpoint"
I0113 19:50:05.474112  106636 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 19:50:05.477319  106636 client.go:361] parsed scheme: "endpoint"
I0113 19:50:05.477461  106636 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 19:50:05.479023  106636 client.go:361] parsed scheme: "endpoint"
I0113 19:50:05.479425  106636 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 19:50:05.481120  106636 client.go:361] parsed scheme: "endpoint"
I0113 19:50:05.481260  106636 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 19:50:05.498544  106636 client.go:361] parsed scheme: "endpoint"
I0113 19:50:05.498704  106636 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 19:50:05.500611  106636 client.go:361] parsed scheme: "endpoint"
I0113 19:50:05.500781  106636 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 19:50:05.502885  106636 client.go:361] parsed scheme: "endpoint"
I0113 19:50:05.503182  106636 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 19:50:05.507124  106636 client.go:361] parsed scheme: "endpoint"
I0113 19:50:05.507397  106636 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 19:50:05.508772  106636 client.go:361] parsed scheme: "endpoint"
I0113 19:50:05.508909  106636 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 19:50:05.510906  106636 client.go:361] parsed scheme: "endpoint"
I0113 19:50:05.511049  106636 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 19:50:05.513709  106636 client.go:361] parsed scheme: "endpoint"
I0113 19:50:05.514520  106636 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 19:50:05.517958  106636 client.go:361] parsed scheme: "endpoint"
I0113 19:50:05.518159  106636 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 19:50:05.520789  106636 client.go:361] parsed scheme: "endpoint"
I0113 19:50:05.520927  106636 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 19:50:05.522197  106636 client.go:361] parsed scheme: "endpoint"
I0113 19:50:05.522309  106636 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 19:50:05.523858  106636 client.go:361] parsed scheme: "endpoint"
I0113 19:50:05.523894  106636 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 19:50:05.527587  106636 client.go:361] parsed scheme: "endpoint"
I0113 19:50:05.527785  106636 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 19:50:05.529295  106636 client.go:361] parsed scheme: "endpoint"
I0113 19:50:05.529333  106636 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 19:50:05.531798  106636 client.go:361] parsed scheme: "endpoint"
I0113 19:50:05.531836  106636 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 19:50:05.534468  106636 client.go:361] parsed scheme: "endpoint"
I0113 19:50:05.534506  106636 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 19:50:05.537380  106636 client.go:361] parsed scheme: "endpoint"
I0113 19:50:05.537406  106636 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 19:50:05.538893  106636 client.go:361] parsed scheme: "endpoint"
I0113 19:50:05.539030  106636 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 19:50:05.540328  106636 client.go:361] parsed scheme: "endpoint"
I0113 19:50:05.540355  106636 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 19:50:05.541320  106636 client.go:361] parsed scheme: "endpoint"
I0113 19:50:05.541351  106636 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 19:50:05.542627  106636 client.go:361] parsed scheme: "endpoint"
I0113 19:50:05.542667  106636 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 19:50:05.545386  106636 client.go:361] parsed scheme: "endpoint"
I0113 19:50:05.545421  106636 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 19:50:05.546668  106636 client.go:361] parsed scheme: "endpoint"
I0113 19:50:05.546706  106636 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 19:50:05.550530  106636 client.go:361] parsed scheme: "endpoint"
I0113 19:50:05.550742  106636 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 19:50:05.555931  106636 client.go:361] parsed scheme: "endpoint"
I0113 19:50:05.555972  106636 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
W0113 19:50:05.988058  106636 genericapiserver.go:404] Skipping API discovery.k8s.io/v1alpha1 because it has no resources.
I0113 19:50:06.099350  106636 client.go:361] parsed scheme: "endpoint"
I0113 19:50:06.099436  106636 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
W0113 19:50:06.336777  106636 genericapiserver.go:404] Skipping API apps/v1beta2 because it has no resources.
W0113 19:50:06.336821  106636 genericapiserver.go:404] Skipping API apps/v1beta1 because it has no resources.
I0113 19:50:06.372462  106636 plugins.go:158] Loaded 9 mutating admission controller(s) successfully in the following order: NamespaceLifecycle,LimitRanger,TaintNodesByCondition,Priority,DefaultTolerationSeconds,DefaultStorageClass,StorageObjectInUseProtection,MutatingAdmissionWebhook,RuntimeClass.
I0113 19:50:06.372501  106636 plugins.go:161] Loaded 6 validating admission controller(s) successfully in the following order: LimitRanger,Priority,PersistentVolumeClaimResize,ValidatingAdmissionWebhook,RuntimeClass,ResourceQuota.
W0113 19:50:06.374362  106636 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
I0113 19:50:06.374578  106636 client.go:361] parsed scheme: "endpoint"
I0113 19:50:06.374610  106636 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 19:50:06.375758  106636 client.go:361] parsed scheme: "endpoint"
I0113 19:50:06.375797  106636 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
W0113 19:50:06.379804  106636 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
I0113 19:50:06.382136  106636 aggregator.go:182] Skipping APIService creation for flowcontrol.apiserver.k8s.io/v1alpha1
W0113 19:50:07.090422  106636 reflector.go:340] k8s.io/client-go/informers/factory.go:135: watch of *v1.Endpoints ended with: very short watch: k8s.io/client-go/informers/factory.go:135: Unexpected watch close - watch lasted less than a second and no items received
W0113 19:50:07.090495  106636 reflector.go:340] k8s.io/apiextensions-apiserver/pkg/client/informers/externalversions/factory.go:117: watch of *v1.CustomResourceDefinition ended with: very short watch: k8s.io/apiextensions-apiserver/pkg/client/informers/externalversions/factory.go:117: Unexpected watch close - watch lasted less than a second and no items received
W0113 19:50:07.090548  106636 reflector.go:340] k8s.io/client-go/informers/factory.go:135: watch of *v1.StorageClass ended with: very short watch: k8s.io/client-go/informers/factory.go:135: Unexpected watch close - watch lasted less than a second and no items received
W0113 19:50:07.090620  106636 reflector.go:340] k8s.io/client-go/informers/factory.go:135: watch of *v1.Secret ended with: very short watch: k8s.io/client-go/informers/factory.go:135: Unexpected watch close - watch lasted less than a second and no items received
W0113 19:50:07.090656  106636 reflector.go:340] k8s.io/client-go/informers/factory.go:135: watch of *v1.Service ended with: very short watch: k8s.io/client-go/informers/factory.go:135: Unexpected watch close - watch lasted less than a second and no items received
W0113 19:50:07.090757  106636 reflector.go:340] k8s.io/client-go/informers/factory.go:135: watch of *v1beta1.RuntimeClass ended with: very short watch: k8s.io/client-go/informers/factory.go:135: Unexpected watch close - watch lasted less than a second and no items received
I0113 19:50:10.236957  106636 dynamic_cafile_content.go:166] Starting request-header::/tmp/kubernetes-kube-apiserver931190316/proxy-ca.crt
I0113 19:50:10.237018  106636 dynamic_cafile_content.go:166] Starting client-ca-bundle::/tmp/kubernetes-kube-apiserver931190316/client-ca.crt
I0113 19:50:10.237296  106636 dynamic_serving_content.go:129] Starting serving-cert::/tmp/kubernetes-kube-apiserver931190316/apiserver.crt::/tmp/kubernetes-kube-apiserver931190316/apiserver.key
I0113 19:50:10.237651  106636 secure_serving.go:178] Serving securely on 127.0.0.1:43829
I0113 19:50:10.237778  106636 crd_finalizer.go:264] Starting CRDFinalizer
I0113 19:50:10.237808  106636 tlsconfig.go:241] Starting DynamicServingCertificateController
I0113 19:50:10.237897  106636 available_controller.go:386] Starting AvailableConditionController
I0113 19:50:10.237911  106636 cache.go:32] Waiting for caches to sync for AvailableConditionController controller
I0113 19:50:10.237938  106636 apiservice_controller.go:94] Starting APIServiceRegistrationController
I0113 19:50:10.237944  106636 cache.go:32] Waiting for caches to sync for APIServiceRegistrationController controller
I0113 19:50:10.237971  106636 autoregister_controller.go:140] Starting autoregister controller
I0113 19:50:10.237986  106636 cache.go:32] Waiting for caches to sync for autoregister controller
W0113 19:50:10.238956  106636 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
I0113 19:50:10.239101  106636 cluster_authentication_trust_controller.go:440] Starting cluster_authentication_trust_controller controller
I0113 19:50:10.239110  106636 shared_informer.go:206] Waiting for caches to sync for cluster_authentication_trust_controller
I0113 19:50:10.239140  106636 crdregistration_controller.go:111] Starting crd-autoregister controller
I0113 19:50:10.239154  106636 dynamic_cafile_content.go:166] Starting client-ca-bundle::/tmp/kubernetes-kube-apiserver931190316/client-ca.crt
I0113 19:50:10.239157  106636 shared_informer.go:206] Waiting for caches to sync for crd-autoregister
I0113 19:50:10.239200  106636 dynamic_cafile_content.go:166] Starting request-header::/tmp/kubernetes-kube-apiserver931190316/proxy-ca.crt
I0113 19:50:10.240085  106636 customresource_discovery_controller.go:209] Starting DiscoveryController
I0113 19:50:10.240642  106636 controller.go:81] Starting OpenAPI AggregationController
I0113 19:50:10.241102  106636 controller.go:86] Starting OpenAPI controller
I0113 19:50:10.242015  106636 naming_controller.go:289] Starting NamingConditionController
I0113 19:50:10.242051  106636 establishing_controller.go:74] Starting EstablishingController
I0113 19:50:10.242074  106636 nonstructuralschema_controller.go:185] Starting NonStructuralSchemaConditionController
I0113 19:50:10.242098  106636 apiapproval_controller.go:184] Starting KubernetesAPIApprovalPolicyConformantConditionController
E0113 19:50:10.251437  106636 controller.go:151] Unable to remove old endpoints from kubernetes service: StorageError: key not found, Code: 1, Key: /d8e1f1b4-2fb9-4b9c-b5f5-cff0a0ecb6cf/registry/masterleases/127.0.0.1, ResourceVersion: 0, AdditionalErrorMsg: 
E0113 19:50:10.261945  106636 structuredmerge.go:102] [SHOULD NOT HAPPEN] failed to create typed new object: .spec.rules: element 0: associative list without keys has an element that's a map type
E0113 19:50:10.272255  106636 structuredmerge.go:102] [SHOULD NOT HAPPEN] failed to create typed new object: .spec.rules: element 0: associative list without keys has an element that's a map type
E0113 19:50:10.283335  106636 structuredmerge.go:102] [SHOULD NOT HAPPEN] failed to create typed new object: .spec.rules: element 0: associative list without keys has an element that's a map type
E0113 19:50:10.286734  106636 structuredmerge.go:102] [SHOULD NOT HAPPEN] failed to create typed new object: .spec.rules: element 0: associative list without keys has an element that's a map type
E0113 19:50:10.307695  106636 structuredmerge.go:102] [SHOULD NOT HAPPEN] failed to create typed new object: .spec.rules: element 0: associative list without keys has an element that's a map type
I0113 19:50:10.340834  106636 cache.go:39] Caches are synced for AvailableConditionController controller
I0113 19:50:10.340999  106636 cache.go:39] Caches are synced for APIServiceRegistrationController controller
I0113 19:50:10.341097  106636 cache.go:39] Caches are synced for autoregister controller
I0113 19:50:10.342021  106636 shared_informer.go:213] Caches are synced for cluster_authentication_trust_controller 
I0113 19:50:10.343244  106636 shared_informer.go:213] Caches are synced for crd-autoregister 
I0113 19:50:11.236684  106636 controller.go:107] OpenAPI AggregationController: Processing item 
I0113 19:50:11.236733  106636 controller.go:130] OpenAPI AggregationController: action for item : Nothing (removed from the queue).
I0113 19:50:11.236752  106636 controller.go:130] OpenAPI AggregationController: action for item k8s_internal_local_delegation_chain_0000000000: Nothing (removed from the queue).
I0113 19:50:11.243120  106636 storage_scheduling.go:133] created PriorityClass system-node-critical with value 2000001000
I0113 19:50:11.247634  106636 storage_scheduling.go:133] created PriorityClass system-cluster-critical with value 2000000000
I0113 19:50:11.247660  106636 storage_scheduling.go:142] all system priority classes are created successfully or already exist.
W0113 19:50:11.306291  106636 lease.go:224] Resetting endpoints for master service "kubernetes" to [127.0.0.1]
E0113 19:50:11.308416  106636 controller.go:222] unable to sync kubernetes service: Endpoints "kubernetes" is invalid: subsets[0].addresses[0].ip: Invalid value: "127.0.0.1": may not be in the loopback range (127.0.0.0/8)
W0113 19:50:11.400657  106636 cacher.go:162] Terminating all watchers from cacher *apiextensions.CustomResourceDefinition
W0113 19:50:11.401092  106636 cacher.go:162] Terminating all watchers from cacher *core.LimitRange
W0113 19:50:11.401305  106636 cacher.go:162] Terminating all watchers from cacher *core.ResourceQuota
W0113 19:50:11.401506  106636 cacher.go:162] Terminating all watchers from cacher *core.Secret
W0113 19:50:11.402038  106636 cacher.go:162] Terminating all watchers from cacher *core.ConfigMap
W0113 19:50:11.402249  106636 cacher.go:162] Terminating all watchers from cacher *core.Namespace
W0113 19:50:11.402523  106636 cacher.go:162] Terminating all watchers from cacher *core.Endpoints
W0113 19:50:11.403101  106636 cacher.go:162] Terminating all watchers from cacher *core.Pod
W0113 19:50:11.403362  106636 cacher.go:162] Terminating all watchers from cacher *core.ServiceAccount
W0113 19:50:11.403838  106636 cacher.go:162] Terminating all watchers from cacher *core.Service
W0113 19:50:11.408855  106636 cacher.go:162] Terminating all watchers from cacher *node.RuntimeClass
W0113 19:50:11.410927  106636 cacher.go:162] Terminating all watchers from cacher *scheduling.PriorityClass
W0113 19:50:11.411648  106636 cacher.go:162] Terminating all watchers from cacher *storage.StorageClass
W0113 19:50:11.412814  106636 cacher.go:162] Terminating all watchers from cacher *admissionregistration.ValidatingWebhookConfiguration
W0113 19:50:11.413021  106636 cacher.go:162] Terminating all watchers from cacher *admissionregistration.MutatingWebhookConfiguration
W0113 19:50:11.413285  106636 cacher.go:162] Terminating all watchers from cacher *apiregistration.APIService
I0113 19:50:11.413550  106636 controller.go:180] Shutting down kubernetes service endpoint reconciler
I0113 19:50:11.413751  106636 dynamic_cafile_content.go:181] Shutting down client-ca-bundle::/tmp/kubernetes-kube-apiserver931190316/client-ca.crt
I0113 19:50:11.413777  106636 controller.go:123] Shutting down OpenAPI controller
I0113 19:50:11.413842  106636 crdregistration_controller.go:142] Shutting down crd-autoregister controller
I0113 19:50:11.413867  106636 cluster_authentication_trust_controller.go:463] Shutting down cluster_authentication_trust_controller controller
I0113 19:50:11.413887  106636 customresource_discovery_controller.go:220] Shutting down DiscoveryController
I0113 19:50:11.413903  106636 nonstructuralschema_controller.go:197] Shutting down NonStructuralSchemaConditionController
I0113 19:50:11.413917  106636 establishing_controller.go:85] Shutting down EstablishingController
I0113 19:50:11.413931  106636 naming_controller.go:300] Shutting down NamingConditionController
I0113 19:50:11.413943  106636 apiapproval_controller.go:196] Shutting down KubernetesAPIApprovalPolicyConformantConditionController
I0113 19:50:11.413959  106636 autoregister_controller.go:164] Shutting down autoregister controller
I0113 19:50:11.413975  106636 apiservice_controller.go:106] Shutting down APIServiceRegistrationController
I0113 19:50:11.413990  106636 available_controller.go:398] Shutting down AvailableConditionController
I0113 19:50:11.414005  106636 crd_finalizer.go:276] Shutting down CRDFinalizer
I0113 19:50:11.414276  106636 controller.go:87] Shutting down OpenAPI AggregationController
I0113 19:50:11.414302  106636 dynamic_cafile_content.go:181] Shutting down request-header::/tmp/kubernetes-kube-apiserver931190316/proxy-ca.crt
I0113 19:50:11.414336  106636 dynamic_cafile_content.go:181] Shutting down request-header::/tmp/kubernetes-kube-apiserver931190316/proxy-ca.crt
I0113 19:50:11.414594  106636 secure_serving.go:222] Stopped listening on 127.0.0.1:43829
I0113 19:50:11.414608  106636 tlsconfig.go:256] Shutting down DynamicServingCertificateController
I0113 19:50:11.414629  106636 dynamic_serving_content.go:144] Shutting down serving-cert::/tmp/kubernetes-kube-apiserver931190316/apiserver.crt::/tmp/kubernetes-kube-apiserver931190316/apiserver.key
I0113 19:50:11.414658  106636 dynamic_cafile_content.go:181] Shutting down client-ca-bundle::/tmp/kubernetes-kube-apiserver931190316/client-ca.crt
--- FAIL: TestDynamicClient (8.33s)
    testserver.go:181: runtime-config=map[api/all:true]
    testserver.go:182: Starting kube-apiserver on port 43829...
    testserver.go:198: Waiting for /healthz to be ok...
    dynamic_client_test.go:88: unexpected pod in list. wanted &v1.Pod{TypeMeta:v1.TypeMeta{Kind:"", APIVersion:""}, ObjectMeta:v1.ObjectMeta{Name:"test47h2w", GenerateName:"test", Namespace:"default", SelfLink:"/api/v1/namespaces/default/pods/test47h2w", UID:"0cb6d7a8-f2f0-4326-a336-3ec9c27332ad", ResourceVersion:"8632", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63714541811, loc:(*time.Location)(0x7541d00)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string(nil), Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry{v1.ManagedFieldsEntry{Manager:"client.test", Operation:"Update", APIVersion:"v1", Time:(*v1.Time)(0xc044a52fa0), FieldsType:"FieldsV1", FieldsV1:(*v1.FieldsV1)(0xc044a52fc0)}}}, Spec:v1.PodSpec{Volumes:[]v1.Volume(nil), InitContainers:[]v1.Container(nil), Containers:[]v1.Container{v1.Container{Name:"test", Image:"test-image", Command:[]string(nil), Args:[]string(nil), WorkingDir:"", Ports:[]v1.ContainerPort(nil), EnvFrom:[]v1.EnvFromSource(nil), Env:[]v1.EnvVar(nil), Resources:v1.ResourceRequirements{Limits:v1.ResourceList(nil), Requests:v1.ResourceList(nil)}, VolumeMounts:[]v1.VolumeMount(nil), VolumeDevices:[]v1.VolumeDevice(nil), LivenessProbe:(*v1.Probe)(nil), ReadinessProbe:(*v1.Probe)(nil), StartupProbe:(*v1.Probe)(nil), Lifecycle:(*v1.Lifecycle)(nil), TerminationMessagePath:"/dev/termination-log", TerminationMessagePolicy:"File", ImagePullPolicy:"Always", SecurityContext:(*v1.SecurityContext)(nil), Stdin:false, StdinOnce:false, TTY:false}}, EphemeralContainers:[]v1.EphemeralContainer(nil), RestartPolicy:"Always", TerminationGracePeriodSeconds:(*int64)(0xc048e5b008), ActiveDeadlineSeconds:(*int64)(nil), DNSPolicy:"ClusterFirst", NodeSelector:map[string]string(nil), ServiceAccountName:"", DeprecatedServiceAccount:"", AutomountServiceAccountToken:(*bool)(nil), NodeName:"", HostNetwork:false, HostPID:false, HostIPC:false, ShareProcessNamespace:(*bool)(nil), SecurityContext:(*v1.PodSecurityContext)(0xc0442b8a20), ImagePullSecrets:[]v1.LocalObjectReference(nil), Hostname:"", Subdomain:"", Affinity:(*v1.Affinity)(nil), SchedulerName:"default-scheduler", Tolerations:[]v1.Toleration{v1.Toleration{Key:"node.kubernetes.io/not-ready", Operator:"Exists", Value:"", Effect:"NoExecute", TolerationSeconds:(*int64)(0xc048e5b030)}, v1.Toleration{Key:"node.kubernetes.io/unreachable", Operator:"Exists", Value:"", Effect:"NoExecute", TolerationSeconds:(*int64)(0xc048e5b050)}}, HostAliases:[]v1.HostAlias(nil), PriorityClassName:"", Priority:(*int32)(0xc048e5b058), DNSConfig:(*v1.PodDNSConfig)(nil), ReadinessGates:[]v1.PodReadinessGate(nil), RuntimeClassName:(*string)(nil), EnableServiceLinks:(*bool)(0xc048e5b05c), PreemptionPolicy:(*v1.PreemptionPolicy)(nil), Overhead:v1.ResourceList(nil), TopologySpreadConstraints:[]v1.TopologySpreadConstraint(nil)}, Status:v1.PodStatus{Phase:"Pending", Conditions:[]v1.PodCondition(nil), Message:"", Reason:"", NominatedNodeName:"", HostIP:"", PodIP:"", PodIPs:[]v1.PodIP(nil), StartTime:(*v1.Time)(nil), InitContainerStatuses:[]v1.ContainerStatus(nil), ContainerStatuses:[]v1.ContainerStatus(nil), QOSClass:"BestEffort", EphemeralContainerStatuses:[]v1.ContainerStatus(nil)}}, got &v1.Pod{TypeMeta:v1.TypeMeta{Kind:"", APIVersion:""}, ObjectMeta:v1.ObjectMeta{Name:"test47h2w", GenerateName:"test", Namespace:"default", SelfLink:"/api/v1/namespaces/default/pods/test47h2w", 
UID:"0cb6d7a8-f2f0-4326-a336-3ec9c27332ad", ResourceVersion:"8632", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63714541811, loc:(*time.Location)(0x7541d00)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string(nil), Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry{v1.ManagedFieldsEntry{Manager:"client.test", Operation:"Update", APIVersion:"v1", Time:(*v1.Time)(0xc044b05f20), FieldsType:"FieldsV1", FieldsV1:(*v1.FieldsV1)(0xc044b05f00)}}}, Spec:v1.PodSpec{Volumes:[]v1.Volume(nil), InitContainers:[]v1.Container(nil), Containers:[]v1.Container{v1.Container{Name:"test", Image:"test-image", Command:[]string(nil), Args:[]string(nil), WorkingDir:"", Ports:[]v1.ContainerPort(nil), EnvFrom:[]v1.EnvFromSource(nil), Env:[]v1.EnvVar(nil), Resources:v1.ResourceRequirements{Limits:v1.ResourceList(nil), Requests:v1.ResourceList(nil)}, VolumeMounts:[]v1.VolumeMount(nil), VolumeDevices:[]v1.VolumeDevice(nil), LivenessProbe:(*v1.Probe)(nil), ReadinessProbe:(*v1.Probe)(nil), StartupProbe:(*v1.Probe)(nil), Lifecycle:(*v1.Lifecycle)(nil), TerminationMessagePath:"/dev/termination-log", TerminationMessagePolicy:"File", ImagePullPolicy:"Always", SecurityContext:(*v1.SecurityContext)(nil), Stdin:false, StdinOnce:false, TTY:false}}, EphemeralContainers:[]v1.EphemeralContainer(nil), RestartPolicy:"Always", TerminationGracePeriodSeconds:(*int64)(0xc04912f208), ActiveDeadlineSeconds:(*int64)(nil), DNSPolicy:"ClusterFirst", NodeSelector:map[string]string(nil), ServiceAccountName:"", DeprecatedServiceAccount:"", AutomountServiceAccountToken:(*bool)(nil), NodeName:"", HostNetwork:false, HostPID:false, HostIPC:false, ShareProcessNamespace:(*bool)(nil), SecurityContext:(*v1.PodSecurityContext)(0xc0443a3500), ImagePullSecrets:[]v1.LocalObjectReference(nil), Hostname:"", Subdomain:"", Affinity:(*v1.Affinity)(nil), SchedulerName:"default-scheduler", Tolerations:[]v1.Toleration{v1.Toleration{Key:"node.kubernetes.io/not-ready", Operator:"Exists", Value:"", Effect:"NoExecute", TolerationSeconds:(*int64)(0xc04912f250)}, v1.Toleration{Key:"node.kubernetes.io/unreachable", Operator:"Exists", Value:"", Effect:"NoExecute", TolerationSeconds:(*int64)(0xc04912f270)}}, HostAliases:[]v1.HostAlias(nil), PriorityClassName:"", Priority:(*int32)(0xc04912f1e8), DNSConfig:(*v1.PodDNSConfig)(nil), ReadinessGates:[]v1.PodReadinessGate(nil), RuntimeClassName:(*string)(nil), EnableServiceLinks:(*bool)(0xc04912f1c9), PreemptionPolicy:(*v1.PreemptionPolicy)(nil), Overhead:v1.ResourceList(nil), TopologySpreadConstraints:[]v1.TopologySpreadConstraint(nil)}, Status:v1.PodStatus{Phase:"Pending", Conditions:[]v1.PodCondition(nil), Message:"", Reason:"", NominatedNodeName:"", HostIP:"", PodIP:"", PodIPs:[]v1.PodIP(nil), StartTime:(*v1.Time)(nil), InitContainerStatuses:[]v1.ContainerStatus(nil), ContainerStatuses:[]v1.ContainerStatus(nil), QOSClass:"BestEffort", EphemeralContainerStatuses:[]v1.ContainerStatus(nil)}}

				from junit_da39a3ee5e6b4b0d3255bfef95601890afd80709_20200113-194718.xml
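The assertion at dynamic_client_test.go:88 compares the pod seen through the dynamic client against the pod known to the typed client, and the two %#v dumps above differ only in pointer addresses, so the mismatch lives behind one of those pointers (the dump prints addresses, not pointed-to values). The sketch below shows the general list-and-convert pattern such a check exercises; it is an illustration against the current client-go API, not the test's actual code, and the context arguments and helper names are assumptions.

package example

import (
	"context"
	"fmt"
	"reflect"

	v1 "k8s.io/api/core/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/apimachinery/pkg/runtime"
	"k8s.io/apimachinery/pkg/runtime/schema"
	"k8s.io/client-go/dynamic"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/rest"
)

// listPodsDynamically lists pods through the dynamic (unstructured) client and
// converts each item back into a typed v1.Pod — the round-trip a dynamic-client
// integration test typically verifies.
func listPodsDynamically(config *rest.Config, namespace string) ([]v1.Pod, error) {
	dynClient, err := dynamic.NewForConfig(config)
	if err != nil {
		return nil, err
	}
	podGVR := schema.GroupVersionResource{Version: "v1", Resource: "pods"}
	list, err := dynClient.Resource(podGVR).Namespace(namespace).List(context.TODO(), metav1.ListOptions{})
	if err != nil {
		return nil, err
	}
	pods := make([]v1.Pod, 0, len(list.Items))
	for _, item := range list.Items {
		var pod v1.Pod
		if err := runtime.DefaultUnstructuredConverter.FromUnstructured(item.Object, &pod); err != nil {
			return nil, err
		}
		pods = append(pods, pod)
	}
	return pods, nil
}

// checkPodMatches mirrors the shape of the failing check: the pod seen through
// the dynamic client should deep-equal the pod fetched with the typed client.
func checkPodMatches(typed kubernetes.Interface, dynPods []v1.Pod, namespace, name string) error {
	want, err := typed.CoreV1().Pods(namespace).Get(context.TODO(), name, metav1.GetOptions{})
	if err != nil {
		return err
	}
	for _, got := range dynPods {
		if got.Name == name && !reflect.DeepEqual(*want, got) {
			return fmt.Errorf("unexpected pod in list. wanted %#v, got %#v", *want, got)
		}
	}
	return nil
}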




Error lines from build-log.txt

... skipping 56 lines ...
Recording: record_command_canary
Running command: record_command_canary

+++ Running case: test-cmd.record_command_canary 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: record_command_canary
/home/prow/go/src/k8s.io/kubernetes/test/cmd/legacy-script.sh: line 155: bogus-expected-to-fail: command not found
!!! [0113 19:35:34] Call tree:
!!! [0113 19:35:34]  1: /home/prow/go/src/k8s.io/kubernetes/test/cmd/../../third_party/forked/shell2junit/sh2ju.sh:47 record_command_canary(...)
!!! [0113 19:35:34]  2: /home/prow/go/src/k8s.io/kubernetes/test/cmd/../../third_party/forked/shell2junit/sh2ju.sh:112 eVal(...)
!!! [0113 19:35:34]  3: /home/prow/go/src/k8s.io/kubernetes/test/cmd/legacy-script.sh:131 juLog(...)
!!! [0113 19:35:34]  4: /home/prow/go/src/k8s.io/kubernetes/test/cmd/legacy-script.sh:159 record_command(...)
!!! [0113 19:35:34]  5: hack/make-rules/test-cmd.sh:35 source(...)
+++ exit code: 1
+++ error: 1
+++ [0113 19:35:34] Running kubeadm tests
+++ [0113 19:35:42] Building go targets for linux/amd64:
    cmd/kubeadm
hack/make-rules/test.sh: line 191: KUBE_TEST_API: unbound variable
+++ [0113 19:36:34] Running tests without code coverage
{"Time":"2020-01-13T19:38:11.335931514Z","Action":"output","Package":"k8s.io/kubernetes/cmd/kubeadm/test/cmd","Output":"ok  \tk8s.io/kubernetes/cmd/kubeadm/test/cmd\t52.155s\n"}
... skipping 303 lines ...
I0113 19:40:15.572644   51369 controller.go:606] quota admission added evaluator for: endpoints
+++ [0113 19:40:21] Building go targets for linux/amd64:
    cmd/kube-controller-manager
+++ [0113 19:40:56] Starting controller-manager
Flag --port has been deprecated, see --secure-port instead.
I0113 19:40:57.908404   54883 serving.go:313] Generated self-signed cert in-memory
W0113 19:40:58.795356   54883 authentication.go:409] failed to read in-cluster kubeconfig for delegated authentication: open /var/run/secrets/kubernetes.io/serviceaccount/token: no such file or directory
W0113 19:40:58.795401   54883 authentication.go:267] No authentication-kubeconfig provided in order to lookup client-ca-file in configmap/extension-apiserver-authentication in kube-system, so client certificate authentication won't work.
W0113 19:40:58.795408   54883 authentication.go:291] No authentication-kubeconfig provided in order to lookup requestheader-client-ca-file in configmap/extension-apiserver-authentication in kube-system, so request-header client certificate authentication won't work.
W0113 19:40:58.795421   54883 authorization.go:177] failed to read in-cluster kubeconfig for delegated authorization: open /var/run/secrets/kubernetes.io/serviceaccount/token: no such file or directory
W0113 19:40:58.795433   54883 authorization.go:146] No authorization-kubeconfig provided, so SubjectAccessReview of authorization tokens won't work.
I0113 19:40:58.795453   54883 controllermanager.go:161] Version: v1.18.0-alpha.1.638+e265afa2cdfb2b
I0113 19:40:58.796462   54883 secure_serving.go:178] Serving securely on [::]:10257
I0113 19:40:58.796524   54883 tlsconfig.go:241] Starting DynamicServingCertificateController
I0113 19:40:58.796848   54883 deprecated_insecure_serving.go:53] Serving insecurely on [::]:10252
I0113 19:40:58.796920   54883 leaderelection.go:242] attempting to acquire leader lease  kube-system/kube-controller-manager...
... skipping 17 lines ...
I0113 19:40:59.072487   54883 controllermanager.go:533] Started "horizontalpodautoscaling"
W0113 19:40:59.072532   54883 controllermanager.go:525] Skipping "csrsigning"
W0113 19:40:59.072820   54883 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
I0113 19:40:59.072869   54883 controllermanager.go:533] Started "csrapproving"
W0113 19:40:59.073163   54883 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0113 19:40:59.073220   54883 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
E0113 19:40:59.073242   54883 core.go:90] Failed to start service controller: WARNING: no cloud provider provided, services of type LoadBalancer will fail
W0113 19:40:59.073251   54883 controllermanager.go:525] Skipping "service"
W0113 19:40:59.073636   54883 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0113 19:40:59.073652   54883 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
I0113 19:40:59.073911   54883 controllermanager.go:533] Started "persistentvolume-expander"
W0113 19:40:59.073951   54883 controllermanager.go:512] "tokencleaner" is disabled
I0113 19:40:59.073971   54883 core.go:241] Will not configure cloud provider routes for allocate-node-cidrs: false, configure-cloud-routes: true.
... skipping 129 lines ...
I0113 19:40:59.860210   54883 deployment_controller.go:152] Starting deployment controller
I0113 19:40:59.860225   54883 shared_informer.go:206] Waiting for caches to sync for deployment
I0113 19:40:59.860425   54883 controllermanager.go:533] Started "cronjob"
I0113 19:40:59.860545   54883 cronjob_controller.go:97] Starting CronJob Manager
I0113 19:40:59.860722   54883 controllermanager.go:533] Started "ttl"
I0113 19:40:59.860978   54883 node_lifecycle_controller.go:77] Sending events to api server
E0113 19:40:59.861029   54883 core.go:231] failed to start cloud node lifecycle controller: no cloud provider provided
W0113 19:40:59.861038   54883 controllermanager.go:525] Skipping "cloud-node-lifecycle"
I0113 19:40:59.861359   54883 controllermanager.go:533] Started "pv-protection"
I0113 19:40:59.862763   54883 pv_protection_controller.go:81] Starting PV protection controller
I0113 19:40:59.862784   54883 shared_informer.go:206] Waiting for caches to sync for PV protection
I0113 19:40:59.862199   54883 ttl_controller.go:116] Starting TTL controller
I0113 19:40:59.864596   54883 shared_informer.go:206] Waiting for caches to sync for TTL
... skipping 4 lines ...
NAME         TYPE        CLUSTER-IP   EXTERNAL-IP   PORT(S)   AGE
kubernetes   ClusterIP   10.0.0.1     <none>        443/TCP   44s
I0113 19:40:59.993632   54883 shared_informer.go:213] Caches are synced for namespace 
Recording: run_kubectl_version_tests
Running command: run_kubectl_version_tests
I0113 19:41:00.000895   54883 shared_informer.go:213] Caches are synced for ClusterRoleAggregator 
E0113 19:41:00.016370   54883 clusterroleaggregation_controller.go:180] edit failed with : Operation cannot be fulfilled on clusterroles.rbac.authorization.k8s.io "edit": the object has been modified; please apply your changes to the latest version and try again

+++ Running case: test-cmd.run_kubectl_version_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_kubectl_version_tests
+++ [0113 19:41:00] Testing kubectl version
I0113 19:41:00.074792   54883 shared_informer.go:213] Caches are synced for certificate-csrapproving 
... skipping 4 lines ...
  "gitCommit": "e265afa2cdfb2b08c05aa3aeddaacdd26f22746e",
  "gitTreeState": "clean",
  "buildDate": "2020-01-13T17:37:38Z",
  "goVersion": "go1.13.5",
  "compiler": "gc",
  "platform": "linux/amd64"
}
W0113 19:41:00.218410   54883 actual_state_of_world.go:506] Failed to update statusUpdateNeeded field in actual state of world: Failed to set statusUpdateNeeded to needed true, because nodeName="127.0.0.1" does not exist
I0113 19:41:00.266194   54883 shared_informer.go:213] Caches are synced for TTL 
+++ [0113 19:41:00] Testing kubectl version: check client only output matches expected output
I0113 19:41:00.559079   54883 shared_informer.go:213] Caches are synced for resource quota 
I0113 19:41:00.559592   54883 shared_informer.go:213] Caches are synced for job 
I0113 19:41:00.560513   54883 shared_informer.go:213] Caches are synced for deployment 
I0113 19:41:00.571909   54883 shared_informer.go:213] Caches are synced for PVC protection 
... skipping 77 lines ...
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_RESTMapper_evaluation_tests
+++ [0113 19:41:04] Creating namespace namespace-1578944464-23209
namespace/namespace-1578944464-23209 created
Context "test" modified.
+++ [0113 19:41:04] Testing RESTMapper
+++ [0113 19:41:05] "kubectl get unknownresourcetype" returns error as expected: error: the server doesn't have a resource type "unknownresourcetype"
+++ exit code: 0
NAME                              SHORTNAMES   APIGROUP                       NAMESPACED   KIND
bindings                                                                      true         Binding
componentstatuses                 cs                                          false        ComponentStatus
configmaps                        cm                                          true         ConfigMap
endpoints                         ep                                          true         Endpoints
... skipping 601 lines ...
has:valid-pod
Successful
message:NAME        READY   STATUS    RESTARTS   AGE
valid-pod   0/1     Pending   0          1s
has:valid-pod
core.sh:186: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
error: resource(s) were provided, but no name, label selector, or --all flag specified
core.sh:190: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
core.sh:194: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
error: setting 'all' parameter but found a non empty selector. 
core.sh:198: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
core.sh:202: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
pod "valid-pod" force deleted
core.sh:206: Successful get pods -l'name in (valid-pod)' {{range.items}}{{.metadata.name}}:{{end}}: 
core.sh:211: Successful get namespaces {{range.items}}{{ if eq .metadata.name \"test-kubectl-describe-pod\" }}found{{end}}{{end}}:: :
... skipping 12 lines ...
poddisruptionbudget.policy/test-pdb-2 created
core.sh:245: Successful get pdb/test-pdb-2 --namespace=test-kubectl-describe-pod {{.spec.minAvailable}}: 50%
poddisruptionbudget.policy/test-pdb-3 created
core.sh:251: Successful get pdb/test-pdb-3 --namespace=test-kubectl-describe-pod {{.spec.maxUnavailable}}: 2
poddisruptionbudget.policy/test-pdb-4 created
core.sh:255: Successful get pdb/test-pdb-4 --namespace=test-kubectl-describe-pod {{.spec.maxUnavailable}}: 50%
error: min-available and max-unavailable cannot be both specified
core.sh:261: Successful get pods --namespace=test-kubectl-describe-pod {{range.items}}{{.metadata.name}}:{{end}}: 
pod/env-test-pod created
matched TEST_CMD_1
matched <set to the key 'key-1' in secret 'test-secret'>
matched TEST_CMD_2
matched <set to the key 'key-2' of config map 'test-configmap'>
... skipping 188 lines ...
pod/valid-pod patched
core.sh:470: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: changed-with-yaml:
pod/valid-pod patched
core.sh:475: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:3.1:
pod/valid-pod patched
core.sh:491: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: nginx:
+++ [0113 19:41:52] "kubectl patch with resourceVersion 535" returns error as expected: Error from server (Conflict): Operation cannot be fulfilled on pods "valid-pod": the object has been modified; please apply your changes to the latest version and try again
pod "valid-pod" deleted
pod/valid-pod replaced
core.sh:515: Successful get pod valid-pod {{(index .spec.containers 0).name}}: replaced-k8s-serve-hostname
Successful
message:error: --grace-period must have --force specified
has:\-\-grace-period must have \-\-force specified
Successful
message:error: --timeout must have --force specified
has:\-\-timeout must have \-\-force specified
node/node-v1-test created
W0113 19:41:53.212083   54883 actual_state_of_world.go:506] Failed to update statusUpdateNeeded field in actual state of world: Failed to set statusUpdateNeeded to needed true, because nodeName="node-v1-test" does not exist
node/node-v1-test replaced
core.sh:552: Successful get node node-v1-test {{.metadata.annotations.a}}: b
(Bnode "node-v1-test" deleted
core.sh:559: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: nginx:
core.sh:562: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: k8s.gcr.io/serve_hostname:
Edit cancelled, no changes made.
... skipping 22 lines ...
spec:
  containers:
  - image: k8s.gcr.io/pause:2.0
    name: kubernetes-pause
has:localonlyvalue
core.sh:585: Successful get pod valid-pod {{.metadata.labels.name}}: valid-pod
error: 'name' already has a value (valid-pod), and --overwrite is false
core.sh:589: Successful get pod valid-pod {{.metadata.labels.name}}: valid-pod
core.sh:593: Successful get pod valid-pod {{.metadata.labels.name}}: valid-pod
(Bpod/valid-pod labeled
core.sh:597: Successful get pod valid-pod {{.metadata.labels.name}}: valid-pod-super-sayan
(Bcore.sh:601: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
... skipping 85 lines ...
+++ Running case: test-cmd.run_kubectl_create_error_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_kubectl_create_error_tests
+++ [0113 19:42:06] Creating namespace namespace-1578944526-12365
namespace/namespace-1578944526-12365 created
Context "test" modified.
+++ [0113 19:42:06] Testing kubectl create with error
Error: must specify one of -f and -k

Create a resource from a file or from stdin.

 JSON and YAML formats are accepted.

Examples:
... skipping 41 lines ...

Usage:
  kubectl create -f FILENAME [options]

Use "kubectl <command> --help" for more information about a given command.
Use "kubectl options" for a list of global command-line options (applies to all commands).
+++ [0113 19:42:07] "kubectl create with empty string list returns error as expected: error: error validating "hack/testdata/invalid-rc-with-empty-args.yaml": error validating data: ValidationError(ReplicationController.spec.template.spec.containers[0].args): unknown object type "nil" in ReplicationController.spec.template.spec.containers[0].args[0]; if you choose to ignore these errors, turn validation off with --validate=false
kubectl convert is DEPRECATED and will be removed in a future version.
In order to convert, kubectl apply the object to the cluster, then kubectl get at the desired version.
+++ exit code: 0
Recording: run_kubectl_apply_tests
Running command: run_kubectl_apply_tests

... skipping 17 lines ...
(Bpod "test-pod" deleted
customresourcedefinition.apiextensions.k8s.io/resources.mygroup.example.com created
I0113 19:42:10.964160   51369 client.go:361] parsed scheme: "endpoint"
I0113 19:42:10.964206   51369 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0113 19:42:10.967971   51369 controller.go:606] quota admission added evaluator for: resources.mygroup.example.com
kind.mygroup.example.com/myobj serverside-applied (server dry run)
Error from server (NotFound): resources.mygroup.example.com "myobj" not found
customresourcedefinition.apiextensions.k8s.io "resources.mygroup.example.com" deleted
+++ exit code: 0
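The "serverside-applied (server dry run)" line above comes from a server-side apply run in dry-run mode against the freshly created CRD; the NotFound that follows confirms nothing was persisted. A hedged equivalent on a recent kubectl (older releases spell the dry-run flag --server-dry-run; myobj.yaml is illustrative):
  kubectl apply --server-side --dry-run=server -f myobj.yaml   # applied on the server, but not stored
  kubectl get resources.mygroup.example.com myobj              # Error from server (NotFound)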
Recording: run_kubectl_run_tests
Running command: run_kubectl_run_tests

+++ Running case: test-cmd.run_kubectl_run_tests 
... skipping 102 lines ...
Context "test" modified.
+++ [0113 19:42:14] Testing kubectl create filter
create.sh:30: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
pod/selector-test-pod created
create.sh:34: Successful get pods selector-test-pod {{.metadata.labels.name}}: selector-test-pod
Successful
message:Error from server (NotFound): pods "selector-test-pod-dont-apply" not found
has:pods "selector-test-pod-dont-apply" not found
pod "selector-test-pod" deleted
+++ exit code: 0
Recording: run_kubectl_apply_deployments_tests
Running command: run_kubectl_apply_deployments_tests

... skipping 30 lines ...
I0113 19:42:18.487021   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944535-11859", Name:"nginx-8484dd655", UID:"6dc208ff-bcf5-4e68-aea4-b5b314831f5b", APIVersion:"apps/v1", ResourceVersion:"635", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-8484dd655-xbfcr
I0113 19:42:18.492657   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944535-11859", Name:"nginx-8484dd655", UID:"6dc208ff-bcf5-4e68-aea4-b5b314831f5b", APIVersion:"apps/v1", ResourceVersion:"635", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-8484dd655-j8ghz
I0113 19:42:18.492903   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944535-11859", Name:"nginx-8484dd655", UID:"6dc208ff-bcf5-4e68-aea4-b5b314831f5b", APIVersion:"apps/v1", ResourceVersion:"635", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-8484dd655-cw5sf
apps.sh:148: Successful get deployment nginx {{.metadata.name}}: nginx
I0113 19:42:20.777129   54883 horizontal.go:353] Horizontal Pod Autoscaler frontend has been deleted in namespace-1578944523-6567
Successful
message:Error from server (Conflict): error when applying patch:
{"metadata":{"annotations":{"kubectl.kubernetes.io/last-applied-configuration":"{\"apiVersion\":\"apps/v1\",\"kind\":\"Deployment\",\"metadata\":{\"annotations\":{},\"labels\":{\"name\":\"nginx\"},\"name\":\"nginx\",\"namespace\":\"namespace-1578944535-11859\",\"resourceVersion\":\"99\"},\"spec\":{\"replicas\":3,\"selector\":{\"matchLabels\":{\"name\":\"nginx2\"}},\"template\":{\"metadata\":{\"labels\":{\"name\":\"nginx2\"}},\"spec\":{\"containers\":[{\"image\":\"k8s.gcr.io/nginx:test-cmd\",\"name\":\"nginx\",\"ports\":[{\"containerPort\":80}]}]}}}}\n"},"resourceVersion":"99"},"spec":{"selector":{"matchLabels":{"name":"nginx2"}},"template":{"metadata":{"labels":{"name":"nginx2"}}}}}
to:
Resource: "apps/v1, Resource=deployments", GroupVersionKind: "apps/v1, Kind=Deployment"
Name: "nginx", Namespace: "namespace-1578944535-11859"
for: "hack/testdata/deployment-label-change2.yaml": Operation cannot be fulfilled on deployments.apps "nginx": the object has been modified; please apply your changes to the latest version and try again
has:Error from server (Conflict)
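The Conflict above is the expected result of applying a manifest that pins resourceVersion "99" while the live nginx deployment has since changed: apply surfaces the server's 409 instead of silently overwriting. Roughly, and not necessarily the exact invocation used by apps.sh:
  kubectl apply -f hack/testdata/deployment-label-change2.yaml   # 409 Conflict: object has been modified
  # re-applying against the latest version of the object (or without a pinned resourceVersion) is accepted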
deployment.apps/nginx configured
I0113 19:42:28.204060   54883 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578944535-11859", Name:"nginx", UID:"510479f3-c9f4-446a-87fe-8812d06ec5bd", APIVersion:"apps/v1", ResourceVersion:"678", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-668b6c7744 to 3
I0113 19:42:28.210055   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944535-11859", Name:"nginx-668b6c7744", UID:"d42e120f-6597-4b90-9310-4d5234c66e2d", APIVersion:"apps/v1", ResourceVersion:"679", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-668b6c7744-lfrrv
I0113 19:42:28.215130   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944535-11859", Name:"nginx-668b6c7744", UID:"d42e120f-6597-4b90-9310-4d5234c66e2d", APIVersion:"apps/v1", ResourceVersion:"679", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-668b6c7744-2flmr
I0113 19:42:28.215997   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944535-11859", Name:"nginx-668b6c7744", UID:"d42e120f-6597-4b90-9310-4d5234c66e2d", APIVersion:"apps/v1", ResourceVersion:"679", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-668b6c7744-j77z8
Successful
... skipping 169 lines ...
+++ [0113 19:42:36] Creating namespace namespace-1578944556-32282
namespace/namespace-1578944556-32282 created
Context "test" modified.
+++ [0113 19:42:36] Testing kubectl get
get.sh:29: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:Error from server (NotFound): pods "abc" not found
has:pods "abc" not found
get.sh:37: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:Error from server (NotFound): pods "abc" not found
has:pods "abc" not found
get.sh:45: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:{
    "apiVersion": "v1",
    "items": [],
... skipping 23 lines ...
has not:No resources found
Successful
message:NAME
has not:No resources found
get.sh:73: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:error: the server doesn't have a resource type "foobar"
has not:No resources found
Successful
message:No resources found in namespace-1578944556-32282 namespace.
has:No resources found
Successful
message:
has not:No resources found
Successful
message:No resources found in namespace-1578944556-32282 namespace.
has:No resources found
get.sh:93: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:Error from server (NotFound): pods "abc" not found
has:pods "abc" not found
Successful
message:Error from server (NotFound): pods "abc" not found
has not:List
Successful
message:I0113 19:42:38.940435   65369 loader.go:375] Config loaded from file:  /tmp/tmp.clEhSV5l6N/.kube/config
I0113 19:42:38.941837   65369 round_trippers.go:443] GET http://127.0.0.1:8080/version?timeout=32s 200 OK in 1 milliseconds
I0113 19:42:38.972916   65369 round_trippers.go:443] GET http://127.0.0.1:8080/api/v1/namespaces/default/pods 200 OK in 2 milliseconds
I0113 19:42:38.974681   65369 round_trippers.go:443] GET http://127.0.0.1:8080/api/v1/namespaces/default/replicationcontrollers 200 OK in 1 milliseconds
... skipping 479 lines ...
Successful
message:NAME    DATA   AGE
one     0      0s
three   0      0s
two     0      0s
STATUS    REASON          MESSAGE
Failure   InternalError   an error on the server ("unable to decode an event from the watch stream: net/http: request canceled (Client.Timeout exceeded while reading body)") has prevented the request from succeeding
has not:watch is only supported on individual resources
Successful
message:STATUS    REASON          MESSAGE
Failure   InternalError   an error on the server ("unable to decode an event from the watch stream: net/http: request canceled (Client.Timeout exceeded while reading body)") has prevented the request from succeeding
has not:watch is only supported on individual resources
+++ [0113 19:42:46] Creating namespace namespace-1578944566-31016
namespace/namespace-1578944566-31016 created
Context "test" modified.
get.sh:153: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
pod/valid-pod created
... skipping 105 lines ...
}
get.sh:158: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
<no value>Successful
message:valid-pod:
has:valid-pod:
Successful
message:error: error executing jsonpath "{.missing}": Error executing template: missing is not found. Printing more information for debugging the template:
	template was:
		{.missing}
	object given to jsonpath engine was:
		map[string]interface {}{"apiVersion":"v1", "kind":"Pod", "metadata":map[string]interface {}{"creationTimestamp":"2020-01-13T19:42:46Z", "labels":map[string]interface {}{"name":"valid-pod"}, "managedFields":[]interface {}{map[string]interface {}{"apiVersion":"v1", "fieldsType":"FieldsV1", "fieldsV1":map[string]interface {}{"f:metadata":map[string]interface {}{"f:labels":map[string]interface {}{".":map[string]interface {}{}, "f:name":map[string]interface {}{}}}, "f:spec":map[string]interface {}{"f:containers":map[string]interface {}{"k:{\"name\":\"kubernetes-serve-hostname\"}":map[string]interface {}{".":map[string]interface {}{}, "f:image":map[string]interface {}{}, "f:imagePullPolicy":map[string]interface {}{}, "f:name":map[string]interface {}{}, "f:resources":map[string]interface {}{".":map[string]interface {}{}, "f:limits":map[string]interface {}{".":map[string]interface {}{}, "f:cpu":map[string]interface {}{}, "f:memory":map[string]interface {}{}}, "f:requests":map[string]interface {}{".":map[string]interface {}{}, "f:cpu":map[string]interface {}{}, "f:memory":map[string]interface {}{}}}, "f:terminationMessagePath":map[string]interface {}{}, "f:terminationMessagePolicy":map[string]interface {}{}}}, "f:dnsPolicy":map[string]interface {}{}, "f:enableServiceLinks":map[string]interface {}{}, "f:priority":map[string]interface {}{}, "f:restartPolicy":map[string]interface {}{}, "f:schedulerName":map[string]interface {}{}, "f:securityContext":map[string]interface {}{}, "f:terminationGracePeriodSeconds":map[string]interface {}{}}}, "manager":"kubectl", "operation":"Update", "time":"2020-01-13T19:42:46Z"}}, "name":"valid-pod", "namespace":"namespace-1578944566-31016", "resourceVersion":"765", "selfLink":"/api/v1/namespaces/namespace-1578944566-31016/pods/valid-pod", "uid":"800f701f-fc88-4137-8f80-ac84c9a6ec6a"}, "spec":map[string]interface {}{"containers":[]interface {}{map[string]interface {}{"image":"k8s.gcr.io/serve_hostname", "imagePullPolicy":"Always", "name":"kubernetes-serve-hostname", "resources":map[string]interface {}{"limits":map[string]interface {}{"cpu":"1", "memory":"512Mi"}, "requests":map[string]interface {}{"cpu":"1", "memory":"512Mi"}}, "terminationMessagePath":"/dev/termination-log", "terminationMessagePolicy":"File"}}, "dnsPolicy":"ClusterFirst", "enableServiceLinks":true, "priority":0, "restartPolicy":"Always", "schedulerName":"default-scheduler", "securityContext":map[string]interface {}{}, "terminationGracePeriodSeconds":30}, "status":map[string]interface {}{"phase":"Pending", "qosClass":"Guaranteed"}}
has:missing is not found
error: error executing template "{{.missing}}": template: output:1:2: executing "output" at <.missing>: map has no entry for key "missing"
Successful
message:Error executing template: template: output:1:2: executing "output" at <.missing>: map has no entry for key "missing". Printing more information for debugging the template:
	template was:
		{{.missing}}
	raw data was:
		{"apiVersion":"v1","kind":"Pod","metadata":{"creationTimestamp":"2020-01-13T19:42:46Z","labels":{"name":"valid-pod"},"managedFields":[{"apiVersion":"v1","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:labels":{".":{},"f:name":{}}},"f:spec":{"f:containers":{"k:{\"name\":\"kubernetes-serve-hostname\"}":{".":{},"f:image":{},"f:imagePullPolicy":{},"f:name":{},"f:resources":{".":{},"f:limits":{".":{},"f:cpu":{},"f:memory":{}},"f:requests":{".":{},"f:cpu":{},"f:memory":{}}},"f:terminationMessagePath":{},"f:terminationMessagePolicy":{}}},"f:dnsPolicy":{},"f:enableServiceLinks":{},"f:priority":{},"f:restartPolicy":{},"f:schedulerName":{},"f:securityContext":{},"f:terminationGracePeriodSeconds":{}}},"manager":"kubectl","operation":"Update","time":"2020-01-13T19:42:46Z"}],"name":"valid-pod","namespace":"namespace-1578944566-31016","resourceVersion":"765","selfLink":"/api/v1/namespaces/namespace-1578944566-31016/pods/valid-pod","uid":"800f701f-fc88-4137-8f80-ac84c9a6ec6a"},"spec":{"containers":[{"image":"k8s.gcr.io/serve_hostname","imagePullPolicy":"Always","name":"kubernetes-serve-hostname","resources":{"limits":{"cpu":"1","memory":"512Mi"},"requests":{"cpu":"1","memory":"512Mi"}},"terminationMessagePath":"/dev/termination-log","terminationMessagePolicy":"File"}],"dnsPolicy":"ClusterFirst","enableServiceLinks":true,"priority":0,"restartPolicy":"Always","schedulerName":"default-scheduler","securityContext":{},"terminationGracePeriodSeconds":30},"status":{"phase":"Pending","qosClass":"Guaranteed"}}
	object given to template engine was:
		map[apiVersion:v1 kind:Pod metadata:map[creationTimestamp:2020-01-13T19:42:46Z labels:map[name:valid-pod] managedFields:[map[apiVersion:v1 fieldsType:FieldsV1 fieldsV1:map[f:metadata:map[f:labels:map[.:map[] f:name:map[]]] f:spec:map[f:containers:map[k:{"name":"kubernetes-serve-hostname"}:map[.:map[] f:image:map[] f:imagePullPolicy:map[] f:name:map[] f:resources:map[.:map[] f:limits:map[.:map[] f:cpu:map[] f:memory:map[]] f:requests:map[.:map[] f:cpu:map[] f:memory:map[]]] f:terminationMessagePath:map[] f:terminationMessagePolicy:map[]]] f:dnsPolicy:map[] f:enableServiceLinks:map[] f:priority:map[] f:restartPolicy:map[] f:schedulerName:map[] f:securityContext:map[] f:terminationGracePeriodSeconds:map[]]] manager:kubectl operation:Update time:2020-01-13T19:42:46Z]] name:valid-pod namespace:namespace-1578944566-31016 resourceVersion:765 selfLink:/api/v1/namespaces/namespace-1578944566-31016/pods/valid-pod uid:800f701f-fc88-4137-8f80-ac84c9a6ec6a] spec:map[containers:[map[image:k8s.gcr.io/serve_hostname imagePullPolicy:Always name:kubernetes-serve-hostname resources:map[limits:map[cpu:1 memory:512Mi] requests:map[cpu:1 memory:512Mi]] terminationMessagePath:/dev/termination-log terminationMessagePolicy:File]] dnsPolicy:ClusterFirst enableServiceLinks:true priority:0 restartPolicy:Always schedulerName:default-scheduler securityContext:map[] terminationGracePeriodSeconds:30] status:map[phase:Pending qosClass:Guaranteed]]
has:map has no entry for key "missing"
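The two failures above contrast the jsonpath and go-template output paths when the requested field does not exist; with a real field both succeed. For example:
  kubectl get pod valid-pod -o jsonpath='{.metadata.name}'        # prints: valid-pod
  kubectl get pod valid-pod -o go-template='{{.metadata.name}}'   # prints: valid-pod
  kubectl get pod valid-pod -o jsonpath='{.missing}'              # error: missing is not found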
Successful
message:NAME        READY   STATUS    RESTARTS   AGE
valid-pod   0/1     Pending   0          1s
STATUS      REASON          MESSAGE
Failure     InternalError   an error on the server ("unable to decode an event from the watch stream: net/http: request canceled (Client.Timeout exceeded while reading body)") has prevented the request from succeeding
has:STATUS
Successful
message:NAME        READY   STATUS    RESTARTS   AGE
valid-pod   0/1     Pending   0          1s
STATUS      REASON          MESSAGE
Failure     InternalError   an error on the server ("unable to decode an event from the watch stream: net/http: request canceled (Client.Timeout exceeded while reading body)") has prevented the request from succeeding
has:valid-pod
Successful
message:pod/valid-pod
status/<unknown>
has not:STATUS
Successful
... skipping 82 lines ...
      (Client.Timeout exceeded while reading body)'
    reason: UnexpectedServerResponse
  - message: 'unable to decode an event from the watch stream: net/http: request canceled
      (Client.Timeout exceeded while reading body)'
    reason: ClientWatchDecoding
kind: Status
message: 'an error on the server ("unable to decode an event from the watch stream:
  net/http: request canceled (Client.Timeout exceeded while reading body)") has prevented
  the request from succeeding'
metadata: {}
reason: InternalError
status: Failure
has not:STATUS
... skipping 79 lines ...
      (Client.Timeout exceeded while reading body)'
    reason: UnexpectedServerResponse
  - message: 'unable to decode an event from the watch stream: net/http: request canceled
      (Client.Timeout exceeded while reading body)'
    reason: ClientWatchDecoding
kind: Status
message: 'an error on the server ("unable to decode an event from the watch stream:
  net/http: request canceled (Client.Timeout exceeded while reading body)") has prevented
  the request from succeeding'
metadata: {}
reason: InternalError
status: Failure
has:name: valid-pod
Successful
message:Error from server (NotFound): pods "invalid-pod" not found
has:"invalid-pod" not found
pod "valid-pod" deleted
get.sh:196: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
pod/redis-master created
pod/valid-pod created
Successful
... skipping 35 lines ...
+++ command: run_kubectl_exec_pod_tests
+++ [0113 19:42:53] Creating namespace namespace-1578944573-8119
namespace/namespace-1578944573-8119 created
Context "test" modified.
+++ [0113 19:42:53] Testing kubectl exec POD COMMAND
Successful
message:Error from server (NotFound): pods "abc" not found
has:pods "abc" not found
pod/test-pod created
Successful
message:Error from server (BadRequest): pod test-pod does not have a host assigned
has not:pods "test-pod" not found
Successful
message:Error from server (BadRequest): pod test-pod does not have a host assigned
has not:pod or type/name must be specified
pod "test-pod" deleted
+++ exit code: 0
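The BadRequest responses above are what kubectl exec returns when the target pod exists but has never been scheduled to a node (this test environment runs no kubelet), which is distinct from the NotFound for a pod that does not exist. Sketch:
  kubectl exec abc -- date        # Error from server (NotFound): pods "abc" not found
  kubectl exec test-pod -- date   # Error from server (BadRequest): pod test-pod does not have a host assigned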
Recording: run_kubectl_exec_resource_name_tests
Running command: run_kubectl_exec_resource_name_tests

... skipping 2 lines ...
+++ command: run_kubectl_exec_resource_name_tests
+++ [0113 19:42:53] Creating namespace namespace-1578944573-28842
namespace/namespace-1578944573-28842 created
Context "test" modified.
+++ [0113 19:42:54] Testing kubectl exec TYPE/NAME COMMAND
Successful
message:error: the server doesn't have a resource type "foo"
has:error:
Successful
message:Error from server (NotFound): deployments.apps "bar" not found
has:"bar" not found
pod/test-pod created
replicaset.apps/frontend created
I0113 19:42:55.108580   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944573-28842", Name:"frontend", UID:"9e142cbe-edf9-4b47-9814-4895816f887b", APIVersion:"apps/v1", ResourceVersion:"822", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-z89pp
I0113 19:42:55.114376   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944573-28842", Name:"frontend", UID:"9e142cbe-edf9-4b47-9814-4895816f887b", APIVersion:"apps/v1", ResourceVersion:"822", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-dt6qg
I0113 19:42:55.114424   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944573-28842", Name:"frontend", UID:"9e142cbe-edf9-4b47-9814-4895816f887b", APIVersion:"apps/v1", ResourceVersion:"822", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-cmgcl
configmap/test-set-env-config created
Successful
message:error: cannot attach to *v1.ConfigMap: selector for *v1.ConfigMap not implemented
has:not implemented
Successful
message:Error from server (BadRequest): pod test-pod does not have a host assigned
has not:not found
Successful
message:Error from server (BadRequest): pod test-pod does not have a host assigned
has not:pod or type/name must be specified
Successful
message:Error from server (BadRequest): pod frontend-cmgcl does not have a host assigned
has not:not found
Successful
message:Error from server (BadRequest): pod frontend-cmgcl does not have a host assigned
has not:pod or type/name must be specified
pod "test-pod" deleted
replicaset.apps "frontend" deleted
configmap "test-set-env-config" deleted
+++ exit code: 0
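kubectl exec also accepts TYPE/NAME and resolves it to a pod through the object's selector, which is why the replicaset form above falls through to the same "no host assigned" error while a configmap is rejected outright for having no selector. Sketch:
  kubectl exec deployment/bar -- date                  # Error from server (NotFound): deployments.apps "bar" not found
  kubectl exec replicaset/frontend -- date             # resolves to one frontend pod, then: no host assigned
  kubectl exec configmap/test-set-env-config -- date   # error: selector for *v1.ConfigMap not implemented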
Recording: run_create_secret_tests
Running command: run_create_secret_tests

+++ Running case: test-cmd.run_create_secret_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_create_secret_tests
Successful
message:Error from server (NotFound): secrets "mysecret" not found
has:secrets "mysecret" not found
Successful
message:Error from server (NotFound): secrets "mysecret" not found
has:secrets "mysecret" not found
Successful
message:user-specified
has:user-specified
Successful
{"kind":"ConfigMap","apiVersion":"v1","metadata":{"name":"tester-update-cm","namespace":"default","selfLink":"/api/v1/namespaces/default/configmaps/tester-update-cm","uid":"3f7b5cf8-ff74-4cf6-915c-a10e56512220","resourceVersion":"845","creationTimestamp":"2020-01-13T19:42:56Z"}}
... skipping 2 lines ...
has:uid
Successful
message:{"kind":"ConfigMap","apiVersion":"v1","metadata":{"name":"tester-update-cm","namespace":"default","selfLink":"/api/v1/namespaces/default/configmaps/tester-update-cm","uid":"3f7b5cf8-ff74-4cf6-915c-a10e56512220","resourceVersion":"846","creationTimestamp":"2020-01-13T19:42:56Z"},"data":{"key1":"config1"}}
has:config1
{"kind":"Status","apiVersion":"v1","metadata":{},"status":"Success","details":{"name":"tester-update-cm","kind":"configmaps","uid":"3f7b5cf8-ff74-4cf6-915c-a10e56512220"}}
Successful
message:Error from server (NotFound): configmaps "tester-update-cm" not found
has:configmaps "tester-update-cm" not found
+++ exit code: 0
Recording: run_kubectl_create_kustomization_directory_tests
Running command: run_kubectl_create_kustomization_directory_tests

+++ Running case: test-cmd.run_kubectl_create_kustomization_directory_tests 
... skipping 110 lines ...
valid-pod   0/1     Pending   0          1s
has:valid-pod
Successful
message:NAME        READY   STATUS    RESTARTS   AGE
valid-pod   0/1     Pending   0          1s
STATUS      REASON          MESSAGE
Failure     InternalError   an error on the server ("unable to decode an event from the watch stream: net/http: request canceled (Client.Timeout exceeded while reading body)") has prevented the request from succeeding
has:Timeout exceeded while reading body
Successful
message:NAME        READY   STATUS    RESTARTS   AGE
valid-pod   0/1     Pending   0          2s
has:valid-pod
Successful
message:error: Invalid timeout value. Timeout must be a single integer in seconds, or an integer followed by a corresponding time unit (e.g. 1s | 2m | 3h)
has:Invalid timeout value
pod "valid-pod" deleted
+++ exit code: 0
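The "Invalid timeout value" check above exercises kubectl's request-timeout parsing: the value must be a bare integer (seconds) or an integer with a time unit. Assuming the flag under test is the global --request-timeout (the invocation itself is not shown in this excerpt):
  kubectl get pod valid-pod --request-timeout=1s   # accepted
  kubectl get pod valid-pod --request-timeout=1x   # error: Invalid timeout value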
Recording: run_crd_tests
Running command: run_crd_tests

... skipping 158 lines ...
foo.company.com/test patched
crd.sh:236: Successful get foos/test {{.patched}}: value1
foo.company.com/test patched
crd.sh:238: Successful get foos/test {{.patched}}: value2
foo.company.com/test patched
crd.sh:240: Successful get foos/test {{.patched}}: <no value>
+++ [0113 19:43:11] "kubectl patch --local" returns error as expected for CustomResource: error: cannot apply strategic merge patch for company.com/v1, Kind=Foo locally, try --type merge
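Custom resources carry no strategic-merge-patch metadata, so kubectl suggests --type merge as above; a JSON merge patch works both with --local and against the server, and a null value removes the field, which is what the <no value> check above verifies. Sketch (patch values illustrative):
  kubectl patch foos/test --type merge -p '{"patched":"value2"}'   # sets .patched
  kubectl patch foos/test --type merge -p '{"patched":null}'       # clears .patched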
{
    "apiVersion": "company.com/v1",
    "kind": "Foo",
    "metadata": {
        "annotations": {
            "kubernetes.io/change-cause": "kubectl patch foos/test --server=http://127.0.0.1:8080 --match-server-version=true --patch={\"patched\":null} --type=merge --record=true"
... skipping 196 lines ...
crd.sh:450: Successful get bars {{range.items}}{{.metadata.name}}:{{end}}: 
namespace/non-native-resources created
bar.company.com/test created
crd.sh:455: Successful get bars {{len .items}}: 1
(Bnamespace "non-native-resources" deleted
crd.sh:458: Successful get bars {{len .items}}: 0
Error from server (NotFound): namespaces "non-native-resources" not found
customresourcedefinition.apiextensions.k8s.io "foos.company.com" deleted
customresourcedefinition.apiextensions.k8s.io "bars.company.com" deleted
customresourcedefinition.apiextensions.k8s.io "resources.mygroup.example.com" deleted
customresourcedefinition.apiextensions.k8s.io "validfoos.company.com" deleted
+++ exit code: 0
Recording: run_cmd_with_img_tests
... skipping 11 lines ...
I0113 19:43:47.552911   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944627-8095", Name:"test1-6cdffdb5b8", UID:"ff6466d2-9f4b-458b-a2b1-e12ef8e682b2", APIVersion:"apps/v1", ResourceVersion:"1033", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test1-6cdffdb5b8-lmhjr
Successful
message:deployment.apps/test1 created
has:deployment.apps/test1 created
deployment.apps "test1" deleted
W0113 19:43:47.671455   51369 cacher.go:162] Terminating all watchers from cacher *unstructured.Unstructured
E0113 19:43:47.672310   54883 reflector.go:320] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to watch *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:error: Invalid image name "InvalidImageName": invalid reference format
has:error: Invalid image name "InvalidImageName": invalid reference format
+++ exit code: 0
W0113 19:43:47.798611   51369 cacher.go:162] Terminating all watchers from cacher *unstructured.Unstructured
E0113 19:43:47.799887   54883 reflector.go:320] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to watch *v1.PartialObjectMetadata: the server could not find the requested resource
+++ [0113 19:43:47] Testing recursive resources
+++ [0113 19:43:47] Creating namespace namespace-1578944627-4388
W0113 19:43:47.926622   51369 cacher.go:162] Terminating all watchers from cacher *unstructured.Unstructured
E0113 19:43:47.927829   54883 reflector.go:320] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to watch *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/namespace-1578944627-4388 created
Context "test" modified.
W0113 19:43:48.063130   51369 cacher.go:162] Terminating all watchers from cacher *unstructured.Unstructured
E0113 19:43:48.064292   54883 reflector.go:320] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to watch *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:202: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
generic-resources.sh:206: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
Successful
message:pod/busybox0 created
pod/busybox1 created
error: error validating "hack/testdata/recursive/pod/pod/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
has:error validating data: kind not set
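The recursive tests drive kubectl with -R/--recursive over a directory tree that deliberately contains one broken manifest (its "kind" key is misspelled "ind"), so each command is expected to succeed for busybox0/busybox1 while reporting a decode or validation error for the broken file. A hedged sketch of the pattern:
  kubectl create -f hack/testdata/recursive/pod --recursive   # creates busybox0 and busybox1, errors on busybox-broken.yaml
  kubectl get pods busybox0 busybox1                          # both exist despite the partial failure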
E0113 19:43:48.673525   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:211: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
E0113 19:43:48.801216   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:220: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: busybox:busybox:
Successful
message:error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
E0113 19:43:48.929196   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:227: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
E0113 19:43:49.065481   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:231: Successful get pods {{range.items}}{{.metadata.labels.status}}:{{end}}: replaced:replaced:
Successful
message:pod/busybox0 replaced
pod/busybox1 replaced
error: error validating "hack/testdata/recursive/pod-modify/pod/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
has:error validating data: kind not set
generic-resources.sh:236: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
E0113 19:43:49.674593   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:Name:         busybox0
Namespace:    namespace-1578944627-4388
Priority:     0
Node:         <none>
Labels:       app=busybox0
... skipping 153 lines ...
QoS Class:        BestEffort
Node-Selectors:   <none>
Tolerations:      <none>
Events:           <none>
unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
E0113 19:43:49.802500   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:246: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
E0113 19:43:49.930340   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:43:50.066712   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:250: Successful get pods {{range.items}}{{.metadata.annotations.annotatekey}}:{{end}}: annotatevalue:annotatevalue:
Successful
message:pod/busybox0 annotated
pod/busybox1 annotated
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
generic-resources.sh:255: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
generic-resources.sh:259: Successful get pods {{range.items}}{{.metadata.labels.status}}:{{end}}: replaced:replaced:
Successful
message:error: error validating "hack/testdata/recursive/pod-modify/pod/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
has:error validating data: kind not set
E0113 19:43:50.676110   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:265: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
E0113 19:43:50.803949   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:43:50.931961   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx created
I0113 19:43:51.014115   54883 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578944627-4388", Name:"nginx", UID:"5e4c24e9-5ec5-4666-bd88-bee9d7b6c299", APIVersion:"apps/v1", ResourceVersion:"1058", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-f87d999f7 to 3
I0113 19:43:51.020771   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944627-4388", Name:"nginx-f87d999f7", UID:"6fc58263-223c-4c81-ba5e-93042563c636", APIVersion:"apps/v1", ResourceVersion:"1059", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-f87d999f7-vxdg5
I0113 19:43:51.023600   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944627-4388", Name:"nginx-f87d999f7", UID:"6fc58263-223c-4c81-ba5e-93042563c636", APIVersion:"apps/v1", ResourceVersion:"1059", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-f87d999f7-xzs6g
I0113 19:43:51.025319   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944627-4388", Name:"nginx-f87d999f7", UID:"6fc58263-223c-4c81-ba5e-93042563c636", APIVersion:"apps/v1", ResourceVersion:"1059", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-f87d999f7-zds9r
E0113 19:43:51.068181   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:269: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx:
generic-resources.sh:270: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
I0113 19:43:51.311153   54883 namespace_controller.go:185] Namespace has been deleted non-native-resources
kubectl convert is DEPRECATED and will be removed in a future version.
In order to convert, kubectl apply the object to the cluster, then kubectl get at the desired version.
generic-resources.sh:274: Successful get deployment nginx {{ .apiVersion }}: apps/v1
... skipping 38 lines ...
      schedulerName: default-scheduler
      securityContext: {}
      terminationGracePeriodSeconds: 30
status: {}
has:extensions/v1beta1
deployment.apps "nginx" deleted
E0113 19:43:51.677374   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:281: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
E0113 19:43:51.805152   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:43:51.933246   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:285: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
Successful
message:kubectl convert is DEPRECATED and will be removed in a future version.
In order to convert, kubectl apply the object to the cluster, then kubectl get at the desired version.
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
E0113 19:43:52.069461   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:290: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
Successful
message:busybox0:busybox1:error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:busybox0:busybox1:
Successful
message:busybox0:busybox1:error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
generic-resources.sh:299: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
pod/busybox0 labeled
pod/busybox1 labeled
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
generic-resources.sh:304: Successful get pods {{range.items}}{{.metadata.labels.mylabel}}:{{end}}: myvalue:myvalue:
Successful
message:pod/busybox0 labeled
pod/busybox1 labeled
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
generic-resources.sh:309: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
E0113 19:43:52.678712   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
pod/busybox0 patched
pod/busybox1 patched
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
E0113 19:43:52.806445   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:314: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: prom/busybox:prom/busybox:
Successful
message:pod/busybox0 patched
pod/busybox1 patched
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
E0113 19:43:52.934509   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:319: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
E0113 19:43:53.070942   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:323: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
pod "busybox0" force deleted
pod "busybox1" force deleted
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
generic-resources.sh:328: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
replicationcontroller/busybox0 created
I0113 19:43:53.595998   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578944627-4388", Name:"busybox0", UID:"1f049943-82dd-4130-9821-3d8f08029dd3", APIVersion:"v1", ResourceVersion:"1090", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox0-mcdrz
replicationcontroller/busybox1 created
error: error validating "hack/testdata/recursive/rc/rc/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
I0113 19:43:53.605201   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578944627-4388", Name:"busybox1", UID:"ea05d022-82d8-4006-837f-7d407e0d912a", APIVersion:"v1", ResourceVersion:"1094", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox1-f4sdt
E0113 19:43:53.679900   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:332: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
E0113 19:43:53.807882   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:337: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
E0113 19:43:53.935784   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:338: Successful get rc busybox0 {{.spec.replicas}}: 1
E0113 19:43:54.072165   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:339: Successful get rc busybox1 {{.spec.replicas}}: 1
generic-resources.sh:344: Successful get hpa busybox0 {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 1 2 80
(Bgeneric-resources.sh:345: Successful get hpa busybox1 {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 1 2 80
Successful
message:horizontalpodautoscaler.autoscaling/busybox0 autoscaled
horizontalpodautoscaler.autoscaling/busybox1 autoscaled
error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
has:Object 'Kind' is missing
horizontalpodautoscaler.autoscaling "busybox0" deleted
horizontalpodautoscaler.autoscaling "busybox1" deleted
E0113 19:43:54.680891   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:353: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
E0113 19:43:54.809006   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:354: Successful get rc busybox0 {{.spec.replicas}}: 1
E0113 19:43:54.936983   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:355: Successful get rc busybox1 {{.spec.replicas}}: 1
E0113 19:43:55.073419   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:359: Successful get service busybox0 {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: <no value> 80
generic-resources.sh:360: Successful get service busybox1 {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: <no value> 80
Successful
message:service/busybox0 exposed
service/busybox1 exposed
error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
has:Object 'Kind' is missing
generic-resources.sh:366: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
generic-resources.sh:367: Successful get rc busybox0 {{.spec.replicas}}: 1
E0113 19:43:55.682086   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:368: Successful get rc busybox1 {{.spec.replicas}}: 1
E0113 19:43:55.810197   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0113 19:43:55.866432   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578944627-4388", Name:"busybox0", UID:"1f049943-82dd-4130-9821-3d8f08029dd3", APIVersion:"v1", ResourceVersion:"1113", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox0-zn2s7
I0113 19:43:55.879292   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578944627-4388", Name:"busybox1", UID:"ea05d022-82d8-4006-837f-7d407e0d912a", APIVersion:"v1", ResourceVersion:"1117", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox1-8lm8p
E0113 19:43:55.938234   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:372: Successful get rc busybox0 {{.spec.replicas}}: 2
E0113 19:43:56.074615   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:373: Successful get rc busybox1 {{.spec.replicas}}: 2
Successful
message:replicationcontroller/busybox0 scaled
replicationcontroller/busybox1 scaled
error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
has:Object 'Kind' is missing
generic-resources.sh:378: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
generic-resources.sh:382: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
replicationcontroller "busybox0" force deleted
replicationcontroller "busybox1" force deleted
error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
has:Object 'Kind' is missing
generic-resources.sh:387: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
E0113 19:43:56.683590   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:43:56.811460   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx1-deployment created
I0113 19:43:56.818142   54883 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578944627-4388", Name:"nginx1-deployment", UID:"bd74fd80-12d3-4359-a9e2-8dcc1d81b131", APIVersion:"apps/v1", ResourceVersion:"1135", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx1-deployment-7bdbbfb5cf to 2
deployment.apps/nginx0-deployment created
error: error validating "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
I0113 19:43:56.824215   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944627-4388", Name:"nginx1-deployment-7bdbbfb5cf", UID:"bb21d51b-9007-47b9-ac80-5bacc5918501", APIVersion:"apps/v1", ResourceVersion:"1136", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx1-deployment-7bdbbfb5cf-qvtpd
I0113 19:43:56.824600   54883 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578944627-4388", Name:"nginx0-deployment", UID:"73f57c7a-e81e-474c-b6d6-1f8744d731e7", APIVersion:"apps/v1", ResourceVersion:"1137", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx0-deployment-57c6bff7f6 to 2
I0113 19:43:56.833336   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944627-4388", Name:"nginx1-deployment-7bdbbfb5cf", UID:"bb21d51b-9007-47b9-ac80-5bacc5918501", APIVersion:"apps/v1", ResourceVersion:"1136", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx1-deployment-7bdbbfb5cf-b856m
I0113 19:43:56.833402   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944627-4388", Name:"nginx0-deployment-57c6bff7f6", UID:"ecd7aa56-6b60-4ae6-9fc8-6697a53a144f", APIVersion:"apps/v1", ResourceVersion:"1141", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx0-deployment-57c6bff7f6-p7wjl
I0113 19:43:56.838263   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944627-4388", Name:"nginx0-deployment-57c6bff7f6", UID:"ecd7aa56-6b60-4ae6-9fc8-6697a53a144f", APIVersion:"apps/v1", ResourceVersion:"1141", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx0-deployment-57c6bff7f6-nj4fw
E0113 19:43:56.939555   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:391: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx0-deployment:nginx1-deployment:
E0113 19:43:57.076091   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:392: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:k8s.gcr.io/nginx:1.7.9:
generic-resources.sh:396: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:k8s.gcr.io/nginx:1.7.9:
Successful
message:deployment.apps/nginx1-deployment skipped rollback (current template already matches revision 1)
deployment.apps/nginx0-deployment skipped rollback (current template already matches revision 1)
error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
has:Object 'Kind' is missing
deployment.apps/nginx1-deployment paused
deployment.apps/nginx0-deployment paused
generic-resources.sh:404: Successful get deployment {{range.items}}{{.spec.paused}}:{{end}}: true:true:
Successful
message:unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
has:Object 'Kind' is missing
E0113 19:43:57.684869   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx1-deployment resumed
deployment.apps/nginx0-deployment resumed
E0113 19:43:57.812584   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:410: Successful get deployment {{range.items}}{{.spec.paused}}:{{end}}: <no value>:<no value>:
Successful
message:unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
has:Object 'Kind' is missing
E0113 19:43:57.940636   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:deployment.apps/nginx1-deployment 
REVISION  CHANGE-CAUSE
1         <none>

deployment.apps/nginx0-deployment 
REVISION  CHANGE-CAUSE
1         <none>

error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
has:nginx0-deployment
Successful
message:deployment.apps/nginx1-deployment 
REVISION  CHANGE-CAUSE
1         <none>

deployment.apps/nginx0-deployment 
REVISION  CHANGE-CAUSE
1         <none>

error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
has:nginx1-deployment
Successful
message:deployment.apps/nginx1-deployment 
REVISION  CHANGE-CAUSE
1         <none>

deployment.apps/nginx0-deployment 
REVISION  CHANGE-CAUSE
1         <none>

error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
has:Object 'Kind' is missing
E0113 19:43:58.077531   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
deployment.apps "nginx1-deployment" force deleted
deployment.apps "nginx0-deployment" force deleted
error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
E0113 19:43:58.686170   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:43:58.813895   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:43:58.941839   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:43:59.078745   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
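
The pause/resume/history output above comes from running the rollout commands recursively over a directory that deliberately contains a broken manifest (nginx-broken.yaml, with "ind" in place of "kind"). A minimal sketch of that pattern, assuming kubectl pointed at the test cluster; these are not the exact generic-resources.sh invocations:

  # Apply a whole directory; the broken file fails to decode, the rest are created.
  kubectl apply -f hack/testdata/recursive/deployment --recursive
  # Rollout subcommands accept the same -f/--recursive file flags.
  kubectl rollout pause -f hack/testdata/recursive/deployment --recursive
  kubectl rollout resume -f hack/testdata/recursive/deployment --recursive
  kubectl rollout history -f hack/testdata/recursive/deployment --recursive
  # Force deletion, matching the "Immediate deletion" warning in the log.
  kubectl delete -f hack/testdata/recursive/deployment --recursive --force --grace-period=0
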
generic-resources.sh:426: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
replicationcontroller/busybox0 created
I0113 19:43:59.457208   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578944627-4388", Name:"busybox0", UID:"98f50516-4cd9-4314-af83-c9ed71e542a1", APIVersion:"v1", ResourceVersion:"1185", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox0-xxx8s
replicationcontroller/busybox1 created
error: error validating "hack/testdata/recursive/rc/rc/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
I0113 19:43:59.465048   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578944627-4388", Name:"busybox1", UID:"d9c0010b-24ab-4b0b-88d9-14529f35da2d", APIVersion:"v1", ResourceVersion:"1187", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox1-7d55x
generic-resources.sh:430: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
E0113 19:43:59.687408   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:no rollbacker has been implemented for "ReplicationController"
no rollbacker has been implemented for "ReplicationController"
unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
has:no rollbacker has been implemented for "ReplicationController"
Successful
message:no rollbacker has been implemented for "ReplicationController"
no rollbacker has been implemented for "ReplicationController"
unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
has:Object 'Kind' is missing
E0113 19:43:59.815166   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
error: replicationcontrollers "busybox0" pausing is not supported
error: replicationcontrollers "busybox1" pausing is not supported
has:Object 'Kind' is missing
Successful
message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
error: replicationcontrollers "busybox0" pausing is not supported
error: replicationcontrollers "busybox1" pausing is not supported
has:replicationcontrollers "busybox0" pausing is not supported
Successful
message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
error: replicationcontrollers "busybox0" pausing is not supported
error: replicationcontrollers "busybox1" pausing is not supported
has:replicationcontrollers "busybox1" pausing is not supported
E0113 19:43:59.943086   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
error: replicationcontrollers "busybox0" resuming is not supported
error: replicationcontrollers "busybox1" resuming is not supported
has:Object 'Kind' is missing
Successful
message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
error: replicationcontrollers "busybox0" resuming is not supported
error: replicationcontrollers "busybox1" resuming is not supported
has:replicationcontrollers "busybox0" resuming is not supported
Successful
message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
error: replicationcontrollers "busybox0" resuming is not supported
error: replicationcontrollers "busybox1" resuming is not supported
has:replicationcontrollers "busybox1" resuming is not supported
E0113 19:44:00.079966   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
replicationcontroller "busybox0" force deleted
replicationcontroller "busybox1" force deleted
error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
E0113 19:44:00.688819   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:00.816496   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:00.944364   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:01.081336   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
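
The ReplicationController variant above repeats the same recursive pattern, but RCs have no rollout machinery, so pause/resume/undo are expected to fail; roughly (flags approximate, not the literal test script lines):

  kubectl create -f hack/testdata/recursive/rc --recursive              # busybox-broken.yaml fails validation
  kubectl rollout pause -f hack/testdata/recursive/rc --recursive       # "pausing is not supported"
  kubectl rollout resume -f hack/testdata/recursive/rc --recursive      # "resuming is not supported"
  kubectl delete -f hack/testdata/recursive/rc --recursive --force --grace-period=0
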
Recording: run_namespace_tests
Running command: run_namespace_tests

+++ Running case: test-cmd.run_namespace_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_namespace_tests
+++ [0113 19:44:01] Testing kubectl(v1:namespaces)
namespace/my-namespace created
core.sh:1314: Successful get namespaces/my-namespace {{.metadata.name}}: my-namespace
(Bnamespace "my-namespace" deleted
E0113 19:44:01.690090   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:01.817807   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:01.945620   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:02.082723   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:02.691695   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:02.819223   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:02.947002   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:03.083960   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0113 19:44:03.159456   54883 shared_informer.go:206] Waiting for caches to sync for garbage collector
I0113 19:44:03.159523   54883 shared_informer.go:213] Caches are synced for garbage collector 
I0113 19:44:03.227039   54883 shared_informer.go:206] Waiting for caches to sync for resource quota
I0113 19:44:03.227116   54883 shared_informer.go:213] Caches are synced for resource quota 
E0113 19:44:03.692992   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:03.820488   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:03.948247   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:04.085564   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:04.694163   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:04.821746   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:04.949484   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:05.087040   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:05.695657   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:05.823194   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:05.950720   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:06.088411   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/my-namespace condition met
E0113 19:44:06.697105   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:Error from server (NotFound): namespaces "my-namespace" not found
has: not found
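
The namespace checks above boil down to a create/get/delete cycle, with the harness asserting on go-template output; a rough equivalent, not the exact core.sh lines:

  kubectl create namespace my-namespace
  kubectl get namespaces/my-namespace -o go-template='{{.metadata.name}}'   # expect: my-namespace
  kubectl delete namespace my-namespace
  kubectl get namespaces my-namespace                                       # expect NotFound once finalizers complete
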
E0113 19:44:06.824390   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/my-namespace created
E0113 19:44:06.952079   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1323: Successful get namespaces/my-namespace {{.metadata.name}}: my-namespace
E0113 19:44:07.089540   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:warning: deleting cluster-scoped resources, not scoped to the provided namespace
namespace "kube-node-lease" deleted
namespace "my-namespace" deleted
namespace "namespace-1578944461-24573" deleted
namespace "namespace-1578944464-23209" deleted
... skipping 26 lines ...
namespace "namespace-1578944578-15259" deleted
namespace "namespace-1578944580-19858" deleted
namespace "namespace-1578944582-7012" deleted
namespace "namespace-1578944584-29645" deleted
namespace "namespace-1578944627-4388" deleted
namespace "namespace-1578944627-8095" deleted
Error from server (Forbidden): namespaces "default" is forbidden: this namespace may not be deleted
Error from server (Forbidden): namespaces "kube-public" is forbidden: this namespace may not be deleted
Error from server (Forbidden): namespaces "kube-system" is forbidden: this namespace may not be deleted
has:warning: deleting cluster-scoped resources
Successful
message:warning: deleting cluster-scoped resources, not scoped to the provided namespace
namespace "kube-node-lease" deleted
namespace "my-namespace" deleted
namespace "namespace-1578944461-24573" deleted
... skipping 27 lines ...
namespace "namespace-1578944578-15259" deleted
namespace "namespace-1578944580-19858" deleted
namespace "namespace-1578944582-7012" deleted
namespace "namespace-1578944584-29645" deleted
namespace "namespace-1578944627-4388" deleted
namespace "namespace-1578944627-8095" deleted
Error from server (Forbidden): namespaces "default" is forbidden: this namespace may not be deleted
Error from server (Forbidden): namespaces "kube-public" is forbidden: this namespace may not be deleted
Error from server (Forbidden): namespaces "kube-system" is forbidden: this namespace may not be deleted
has:namespace "my-namespace" deleted
core.sh:1335: Successful get namespaces {{range.items}}{{ if eq .metadata.name \"other\" }}found{{end}}{{end}}:: :
namespace/other created
core.sh:1339: Successful get namespaces/other {{.metadata.name}}: other
E0113 19:44:07.698481   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1343: Successful get pods --namespace=other {{range.items}}{{.metadata.name}}:{{end}}: 
E0113 19:44:07.825739   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
pod/valid-pod created
E0113 19:44:07.953312   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:08.091141   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1347: Successful get pods --namespace=other {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
core.sh:1349: Successful get pods -n other {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
Successful
message:error: a resource cannot be retrieved by name across all namespaces
has:a resource cannot be retrieved by name across all namespaces
core.sh:1356: Successful get pods --namespace=other {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
pod "valid-pod" force deleted
E0113 19:44:08.699998   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1360: Successful get pods --namespace=other {{range.items}}{{.metadata.name}}:{{end}}: 
E0113 19:44:08.827229   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace "other" deleted
E0113 19:44:08.954664   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:09.092715   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0113 19:44:09.204344   54883 horizontal.go:353] Horizontal Pod Autoscaler busybox0 has been deleted in namespace-1578944627-4388
I0113 19:44:09.211875   54883 horizontal.go:353] Horizontal Pod Autoscaler busybox1 has been deleted in namespace-1578944627-4388
E0113 19:44:09.701360   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:09.828729   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:09.956230   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:10.094523   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:10.703586   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:10.830279   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:10.957562   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:11.096287   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:11.704515   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:11.831632   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:11.959089   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:12.099355   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:12.709640   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:12.835444   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:12.960599   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:13.102632   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:13.711430   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:13.836727   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:13.961540   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ exit code: 0
Recording: run_secrets_test
Running command: run_secrets_test
E0113 19:44:14.104170   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource

+++ Running case: test-cmd.run_secrets_test 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_secrets_test
+++ [0113 19:44:14] Creating namespace namespace-1578944654-27715
namespace/namespace-1578944654-27715 created
... skipping 35 lines ...
metadata:
  creationTimestamp: null
  name: test
has not:example.com
core.sh:725: Successful get namespaces {{range.items}}{{ if eq .metadata.name \"test-secrets\" }}found{{end}}{{end}}:: :
namespace/test-secrets created
E0113 19:44:14.712818   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:729: Successful get namespaces/test-secrets {{.metadata.name}}: test-secrets
E0113 19:44:14.838057   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:733: Successful get secrets --namespace=test-secrets {{range.items}}{{.metadata.name}}:{{end}}: 
E0113 19:44:14.963147   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
secret/test-secret created
E0113 19:44:15.105485   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:737: Successful get secret/test-secret --namespace=test-secrets {{.metadata.name}}: test-secret
core.sh:738: Successful get secret/test-secret --namespace=test-secrets {{.type}}: test-type
secret "test-secret" deleted
core.sh:748: Successful get secrets --namespace=test-secrets {{range.items}}{{.metadata.name}}:{{end}}: 
E0113 19:44:15.714235   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
secret/test-secret created
E0113 19:44:15.839403   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:752: Successful get secret/test-secret --namespace=test-secrets {{.metadata.name}}: test-secret
E0113 19:44:15.965196   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:753: Successful get secret/test-secret --namespace=test-secrets {{.type}}: kubernetes.io/dockerconfigjson
E0113 19:44:16.107234   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
secret "test-secret" deleted
core.sh:763: Successful get secrets --namespace=test-secrets {{range.items}}{{.metadata.name}}:{{end}}: 
secret/test-secret created
core.sh:766: Successful get secret/test-secret --namespace=test-secrets {{.metadata.name}}: test-secret
E0113 19:44:16.715611   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0113 19:44:16.727783   54883 namespace_controller.go:185] Namespace has been deleted my-namespace
core.sh:767: Successful get secret/test-secret --namespace=test-secrets {{.type}}: kubernetes.io/tls
E0113 19:44:16.840679   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
secret "test-secret" deleted
E0113 19:44:16.966600   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
secret/test-secret created
E0113 19:44:17.108607   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:773: Successful get secret/test-secret --namespace=test-secrets {{.metadata.name}}: test-secret
core.sh:774: Successful get secret/test-secret --namespace=test-secrets {{.type}}: kubernetes.io/tls
I0113 19:44:17.368770   54883 namespace_controller.go:185] Namespace has been deleted namespace-1578944461-24573
secret "test-secret" deleted
I0113 19:44:17.383636   54883 namespace_controller.go:185] Namespace has been deleted namespace-1578944482-16693
I0113 19:44:17.406115   54883 namespace_controller.go:185] Namespace has been deleted namespace-1578944464-23209
... skipping 7 lines ...
I0113 19:44:17.634059   54883 namespace_controller.go:185] Namespace has been deleted namespace-1578944500-18994
I0113 19:44:17.656444   54883 namespace_controller.go:185] Namespace has been deleted namespace-1578944501-12675
I0113 19:44:17.659847   54883 namespace_controller.go:185] Namespace has been deleted namespace-1578944516-17596
I0113 19:44:17.683935   54883 namespace_controller.go:185] Namespace has been deleted namespace-1578944521-2212
I0113 19:44:17.694686   54883 namespace_controller.go:185] Namespace has been deleted namespace-1578944521-21255
I0113 19:44:17.708412   54883 namespace_controller.go:185] Namespace has been deleted namespace-1578944518-15361
E0113 19:44:17.716925   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0113 19:44:17.720351   54883 namespace_controller.go:185] Namespace has been deleted namespace-1578944523-6567
I0113 19:44:17.721815   54883 namespace_controller.go:185] Namespace has been deleted namespace-1578944527-31675
I0113 19:44:17.722435   54883 namespace_controller.go:185] Namespace has been deleted namespace-1578944526-12365
core.sh:796: Successful get secret/secret-string-data --namespace=test-secrets  {{.data}}: map[k1:djE= k2:djI=]
I0113 19:44:17.807019   54883 namespace_controller.go:185] Namespace has been deleted namespace-1578944531-27019
E0113 19:44:17.841999   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:797: Successful get secret/secret-string-data --namespace=test-secrets  {{.data}}: map[k1:djE= k2:djI=]
E0113 19:44:17.968007   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0113 19:44:17.999650   54883 namespace_controller.go:185] Namespace has been deleted namespace-1578944534-31013
core.sh:798: Successful get secret/secret-string-data --namespace=test-secrets  {{.stringData}}: <no value>
I0113 19:44:18.023718   54883 namespace_controller.go:185] Namespace has been deleted namespace-1578944554-32352
I0113 19:44:18.040341   54883 namespace_controller.go:185] Namespace has been deleted namespace-1578944555-13952
I0113 19:44:18.075014   54883 namespace_controller.go:185] Namespace has been deleted namespace-1578944556-32282
I0113 19:44:18.086384   54883 namespace_controller.go:185] Namespace has been deleted namespace-1578944573-8119
I0113 19:44:18.088738   54883 namespace_controller.go:185] Namespace has been deleted namespace-1578944578-11115
I0113 19:44:18.090526   54883 namespace_controller.go:185] Namespace has been deleted namespace-1578944566-31016
secret "secret-string-data" deleted
I0113 19:44:18.098221   54883 namespace_controller.go:185] Namespace has been deleted namespace-1578944573-28842
I0113 19:44:18.108422   54883 namespace_controller.go:185] Namespace has been deleted namespace-1578944535-11859
E0113 19:44:18.109828   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0113 19:44:18.123959   54883 namespace_controller.go:185] Namespace has been deleted namespace-1578944578-15259
core.sh:807: Successful get secrets --namespace=test-secrets {{range.items}}{{.metadata.name}}:{{end}}: 
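
The secret checks above create one secret per type (an opaque secret with a custom type, a dockerconfigjson secret, and TLS secrets), and the secret-string-data case applies a manifest whose stringData the API server folds into data, which is why .data shows base64 values (djE= is "v1", djI= is "v2") while .stringData is empty. A sketch of the imperative commands, with literal values and file paths assumed rather than taken from core.sh:

  kubectl create secret generic test-secret -n test-secrets --type=test-type --from-literal=key1=value1
  kubectl create secret docker-registry test-secret -n test-secrets \
    --docker-username=user --docker-password=pass --docker-email=user@example.com   # type kubernetes.io/dockerconfigjson
  kubectl create secret tls test-secret -n test-secrets --cert=tls.crt --key=tls.key # type kubernetes.io/tls
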
I0113 19:44:18.246497   54883 namespace_controller.go:185] Namespace has been deleted namespace-1578944580-19858
I0113 19:44:18.256712   54883 namespace_controller.go:185] Namespace has been deleted namespace-1578944584-29645
I0113 19:44:18.275936   54883 namespace_controller.go:185] Namespace has been deleted namespace-1578944582-7012
I0113 19:44:18.309783   54883 namespace_controller.go:185] Namespace has been deleted namespace-1578944627-8095
I0113 19:44:18.381796   54883 namespace_controller.go:185] Namespace has been deleted namespace-1578944627-4388
secret "test-secret" deleted
namespace "test-secrets" deleted
E0113 19:44:18.718242   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:18.843173   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:18.969300   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0113 19:44:18.978024   54883 namespace_controller.go:185] Namespace has been deleted other
E0113 19:44:19.111387   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:19.719668   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:19.844405   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:19.970513   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:20.112485   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:20.721385   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:20.846075   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:20.972801   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:21.114171   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:21.723451   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:21.848032   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:21.974412   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:22.116957   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:22.725123   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:22.849767   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:22.976432   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:23.118613   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:23.726535   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ exit code: 0
E0113 19:44:23.851855   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Recording: run_configmap_tests
Running command: run_configmap_tests

+++ Running case: test-cmd.run_configmap_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_configmap_tests
E0113 19:44:23.978720   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ [0113 19:44:23] Creating namespace namespace-1578944663-6886
namespace/namespace-1578944663-6886 created
E0113 19:44:24.119941   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Context "test" modified.
+++ [0113 19:44:24] Testing configmaps
configmap/test-configmap created
E0113 19:44:24.728833   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:24.853557   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:28: Successful get configmap/test-configmap {{.metadata.name}}: test-configmap
E0113 19:44:24.980584   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
configmap "test-configmap" deleted
E0113 19:44:25.121640   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:33: Successful get namespaces {{range.items}}{{ if eq .metadata.name \"test-configmaps\" }}found{{end}}{{end}}:: :
namespace/test-configmaps created
core.sh:37: Successful get namespaces/test-configmaps {{.metadata.name}}: test-configmaps
E0113 19:44:25.730247   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:41: Successful get configmaps {{range.items}}{{ if eq .metadata.name \"test-configmap\" }}found{{end}}{{end}}:: :
E0113 19:44:25.855239   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:42: Successful get configmaps {{range.items}}{{ if eq .metadata.name \"test-binary-configmap\" }}found{{end}}{{end}}:: :
E0113 19:44:25.983070   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
configmap/test-configmap created
E0113 19:44:26.123647   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
configmap/test-binary-configmap created
core.sh:48: Successful get configmap/test-configmap --namespace=test-configmaps {{.metadata.name}}: test-configmap
core.sh:49: Successful get configmap/test-binary-configmap --namespace=test-configmaps {{.metadata.name}}: test-binary-configmap
E0113 19:44:26.732660   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:26.857056   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:26.984879   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
configmap "test-configmap" deleted
E0113 19:44:27.125395   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
configmap "test-binary-configmap" deleted
namespace "test-configmaps" deleted
E0113 19:44:27.733959   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:27.858264   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:27.986135   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:28.126676   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0113 19:44:28.729546   54883 namespace_controller.go:185] Namespace has been deleted test-secrets
E0113 19:44:28.735322   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:28.859585   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:28.987359   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:29.128231   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:29.736572   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:29.860811   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:29.988460   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:30.129502   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:30.737864   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:30.862018   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:30.989716   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:31.130660   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:31.739327   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:31.863464   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:31.991083   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:32.131897   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ exit code: 0
Recording: run_client_config_tests
Running command: run_client_config_tests

+++ Running case: test-cmd.run_client_config_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_client_config_tests
+++ [0113 19:44:32] Creating namespace namespace-1578944672-714
namespace/namespace-1578944672-714 created
E0113 19:44:32.740513   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Context "test" modified.
+++ [0113 19:44:32] Testing client config
E0113 19:44:32.865728   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:error: stat missing: no such file or directory
has:missing: no such file or directory
E0113 19:44:32.992257   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:error: stat missing: no such file or directory
has:missing: no such file or directory
Successful
message:error: stat missing: no such file or directory
has:missing: no such file or directory
E0113 19:44:33.133056   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:Error in configuration: context was not found for specified context: missing-context
has:context was not found for specified context: missing-context
Successful
message:error: no server found for cluster "missing-cluster"
has:no server found for cluster "missing-cluster"
Successful
message:error: auth info "missing-user" does not exist
has:auth info "missing-user" does not exist
Successful
message:error: error loading config file "/tmp/newconfig.yaml": no kind "Config" is registered for version "v-1" in scheme "k8s.io/client-go/tools/clientcmd/api/latest/latest.go:50"
has:error loading config file
Successful
message:error: stat missing-config: no such file or directory
has:no such file or directory
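
Each client-config assertion above points kubectl at a deliberately bad kubeconfig input and greps for the expected error; a rough mapping of each failure to the flag that produces it:

  kubectl get pods --kubeconfig=missing               # stat missing: no such file or directory
  kubectl get pods --context=missing-context          # context was not found for specified context
  kubectl get pods --cluster=missing-cluster          # no server found for cluster "missing-cluster"
  kubectl get pods --user=missing-user                # auth info "missing-user" does not exist
  kubectl get pods --kubeconfig=/tmp/newconfig.yaml   # file exists but declares an unknown Config version
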
+++ exit code: 0
Recording: run_service_accounts_tests
Running command: run_service_accounts_tests
E0113 19:44:33.741749   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource

+++ Running case: test-cmd.run_service_accounts_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_service_accounts_tests
+++ [0113 19:44:33] Creating namespace namespace-1578944673-2646
E0113 19:44:33.867057   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/namespace-1578944673-2646 created
Context "test" modified.
E0113 19:44:33.993369   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ [0113 19:44:33] Testing service accounts
core.sh:828: Successful get namespaces {{range.items}}{{ if eq .metadata.name \"test-service-accounts\" }}found{{end}}{{end}}:: :
E0113 19:44:34.134237   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/test-service-accounts created
core.sh:832: Successful get namespaces/test-service-accounts {{.metadata.name}}: test-service-accounts
serviceaccount/test-service-account created
core.sh:838: Successful get serviceaccount/test-service-account --namespace=test-service-accounts {{.metadata.name}}: test-service-account
(Bserviceaccount "test-service-account" deleted
namespace "test-service-accounts" deleted
E0113 19:44:34.742871   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:34.868339   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:34.994568   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:35.135647   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:35.744217   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:35.869580   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:35.996177   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:36.136888   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:36.745767   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:36.871088   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:36.997523   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:37.138350   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0113 19:44:37.475992   54883 namespace_controller.go:185] Namespace has been deleted test-configmaps
E0113 19:44:37.747276   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:37.872381   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:37.998696   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:38.139829   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:38.748533   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:38.873527   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:39.000232   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:39.141255   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:39.750904   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:39.874770   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ exit code: 0
Recording: run_job_tests
Running command: run_job_tests

+++ Running case: test-cmd.run_job_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
E0113 19:44:40.001437   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ command: run_job_tests
+++ [0113 19:44:40] Creating namespace namespace-1578944680-13456
namespace/namespace-1578944680-13456 created
E0113 19:44:40.142513   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Context "test" modified.
+++ [0113 19:44:40] Testing job
batch.sh:30: Successful get namespaces {{range.items}}{{ if eq .metadata.name \"test-jobs\" }}found{{end}}{{end}}:: :
namespace/test-jobs created
batch.sh:34: Successful get namespaces/test-jobs {{.metadata.name}}: test-jobs
kubectl run --generator=cronjob/v1beta1 is DEPRECATED and will be removed in a future version. Use kubectl run --generator=run-pod/v1 or kubectl create instead.
cronjob.batch/pi created
E0113 19:44:40.752279   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
batch.sh:39: Successful get cronjob/pi --namespace=test-jobs {{.metadata.name}}: pi
E0113 19:44:40.876037   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
NAME   SCHEDULE       SUSPEND   ACTIVE   LAST SCHEDULE   AGE
pi     59 23 31 2 *   False     0        <none>          0s
E0113 19:44:41.002732   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Name:                          pi
Namespace:                     test-jobs
Labels:                        run=pi
Annotations:                   <none>
Schedule:                      59 23 31 2 *
Concurrency Policy:            Allow
Suspend:                       False
Successful Job History Limit:  3
Failed Job History Limit:      1
Starting Deadline Seconds:     <unset>
Selector:                      <unset>
Parallelism:                   <unset>
Completions:                   <unset>
Pod Template:
  Labels:  run=pi
... skipping 13 lines ...
    Environment:     <none>
    Mounts:          <none>
  Volumes:           <none>
Last Schedule Time:  <unset>
Active Jobs:         <none>
Events:              <none>
E0113 19:44:41.143809   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:job.batch/test-job
has:job.batch/test-job
batch.sh:48: Successful get jobs {{range.items}}{{.metadata.name}}{{end}}: 
I0113 19:44:41.360379   54883 event.go:278] Event(v1.ObjectReference{Kind:"Job", Namespace:"test-jobs", Name:"test-job", UID:"93f4c608-8a2d-4ecf-bf33-f8258ec0e4e7", APIVersion:"batch/v1", ResourceVersion:"1533", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test-job-l2k85
job.batch/test-job created
... skipping 8 lines ...
                run=pi
Annotations:    cronjob.kubernetes.io/instantiate: manual
Controlled By:  CronJob/pi
Parallelism:    1
Completions:    1
Start Time:     Mon, 13 Jan 2020 19:44:41 +0000
Pods Statuses:  1 Running / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  controller-uid=93f4c608-8a2d-4ecf-bf33-f8258ec0e4e7
           job-name=test-job
           run=pi
  Containers:
   pi:
... skipping 12 lines ...
    Mounts:       <none>
  Volumes:        <none>
Events:
  Type    Reason            Age   From            Message
  ----    ------            ----  ----            -------
  Normal  SuccessfulCreate  0s    job-controller  Created pod: test-job-l2k85
E0113 19:44:41.753731   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
job.batch "test-job" deleted
E0113 19:44:41.877328   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
cronjob.batch "pi" deleted
E0113 19:44:42.004215   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace "test-jobs" deleted
E0113 19:44:42.145258   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:42.755193   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:42.878544   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:43.005515   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:43.146570   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:43.756654   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:43.879844   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:44.007304   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:44.147883   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:44.757941   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0113 19:44:44.844846   54883 namespace_controller.go:185] Namespace has been deleted test-service-accounts
E0113 19:44:44.881102   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:45.008543   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:45.149190   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:45.759482   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:45.882417   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:46.009979   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:46.150442   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:46.760692   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:46.883743   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:47.011096   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:47.151839   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ exit code: 0
Recording: run_create_job_tests
Running command: run_create_job_tests

+++ Running case: test-cmd.run_create_job_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_create_job_tests
+++ [0113 19:44:47] Creating namespace namespace-1578944687-24721
namespace/namespace-1578944687-24721 created
Context "test" modified.
I0113 19:44:47.590095   54883 event.go:278] Event(v1.ObjectReference{Kind:"Job", Namespace:"namespace-1578944687-24721", Name:"test-job", UID:"f5720b8a-0647-42db-911f-87dfdcb8df53", APIVersion:"batch/v1", ResourceVersion:"1555", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test-job-2lvvt
job.batch/test-job created
create.sh:86: Successful get job test-job {{(index .spec.template.spec.containers 0).image}}: k8s.gcr.io/nginx:test-cmd
E0113 19:44:47.761872   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
job.batch "test-job" deleted
E0113 19:44:47.885094   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0113 19:44:47.923316   54883 event.go:278] Event(v1.ObjectReference{Kind:"Job", Namespace:"namespace-1578944687-24721", Name:"test-job-pi", UID:"6aee8140-3fef-4186-a72b-bf3e93a76740", APIVersion:"batch/v1", ResourceVersion:"1562", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test-job-pi-m9wmk
job.batch/test-job-pi created
E0113 19:44:48.012352   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
create.sh:92: Successful get job test-job-pi {{(index .spec.template.spec.containers 0).image}}: k8s.gcr.io/perl
E0113 19:44:48.153117   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
job.batch "test-job-pi" deleted
kubectl run --generator=cronjob/v1beta1 is DEPRECATED and will be removed in a future version. Use kubectl run --generator=run-pod/v1 or kubectl create instead.
cronjob.batch/test-pi created
I0113 19:44:48.436399   54883 event.go:278] Event(v1.ObjectReference{Kind:"Job", Namespace:"namespace-1578944687-24721", Name:"my-pi", UID:"fc202499-f8a1-47ef-89bc-b1455adb2162", APIVersion:"batch/v1", ResourceVersion:"1570", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: my-pi-x87l4
job.batch/my-pi created
Successful
message:[perl -Mbignum=bpi -wle print bpi(10)]
has:perl -Mbignum=bpi -wle print bpi(10)
job.batch "my-pi" deleted
cronjob.batch "test-pi" deleted
E0113 19:44:48.763215   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ exit code: 0
Recording: run_pod_templates_tests
Running command: run_pod_templates_tests

+++ Running case: test-cmd.run_pod_templates_tests 
E0113 19:44:48.886443   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_pod_templates_tests
+++ [0113 19:44:48] Creating namespace namespace-1578944688-28334
namespace/namespace-1578944688-28334 created
E0113 19:44:49.013532   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Context "test" modified.
+++ [0113 19:44:49] Testing pod templates
E0113 19:44:49.154392   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1421: Successful get podtemplates {{range.items}}{{.metadata.name}}:{{end}}: 
I0113 19:44:49.482224   51369 controller.go:606] quota admission added evaluator for: podtemplates
podtemplate/nginx created
core.sh:1425: Successful get podtemplates {{range.items}}{{.metadata.name}}:{{end}}: nginx:
NAME    CONTAINERS   IMAGES   POD LABELS
nginx   nginx        nginx    name=nginx
E0113 19:44:49.764502   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:49.888181   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:50.014702   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1433: Successful get podtemplates {{range.items}}{{.metadata.name}}:{{end}}: nginx:
(Bpodtemplate "nginx" deleted
E0113 19:44:50.155562   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1437: Successful get podtemplate {{range.items}}{{.metadata.name}}:{{end}}: 
+++ exit code: 0
Recording: run_service_tests
Running command: run_service_tests

+++ Running case: test-cmd.run_service_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_service_tests
Context "test" modified.
+++ [0113 19:44:50] Testing kubectl(v1:services)
core.sh:858: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
E0113 19:44:50.765856   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/redis-master created
E0113 19:44:50.889459   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:862: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:
E0113 19:44:51.015994   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
matched Name:
matched Labels:
matched Selector:
matched IP:
matched Port:
matched Endpoints:
... skipping 10 lines ...
IP:                10.0.0.91
Port:              <unset>  6379/TCP
TargetPort:        6379/TCP
Endpoints:         <none>
Session Affinity:  None
Events:            <none>
E0113 19:44:51.156655   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:866: Successful describe
Name:              redis-master
Namespace:         default
Labels:            app=redis
                   role=master
                   tier=backend
... skipping 72 lines ...
IP:                10.0.0.91
Port:              <unset>  6379/TCP
TargetPort:        6379/TCP
Endpoints:         <none>
Session Affinity:  None
Events:            <none>
E0113 19:44:51.766795   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful describe
Name:              kubernetes
Namespace:         default
Labels:            component=apiserver
                   provider=kubernetes
Annotations:       <none>
... skipping 18 lines ...
IP:                10.0.0.91
Port:              <unset>  6379/TCP
TargetPort:        6379/TCP
Endpoints:         <none>
Session Affinity:  None
Events:            <none>
E0113 19:44:51.890522   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful describe
Name:              kubernetes
Namespace:         default
Labels:            component=apiserver
                   provider=kubernetes
Annotations:       <none>
... skipping 16 lines ...
Type:              ClusterIP
IP:                10.0.0.91
Port:              <unset>  6379/TCP
TargetPort:        6379/TCP
Endpoints:         <none>
Session Affinity:  None
E0113 19:44:52.017252   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful describe
Name:              kubernetes
Namespace:         default
Labels:            component=apiserver
                   provider=kubernetes
Annotations:       <none>
... skipping 20 lines ...
TargetPort:        6379/TCP
Endpoints:         <none>
Session Affinity:  None
Events:            <none>
core.sh:882: Successful get services redis-master {{range.spec.selector}}{{.}}:{{end}}: redis:master:backend:
(BI0113 19:44:52.153947   54883 namespace_controller.go:185] Namespace has been deleted test-jobs
E0113 19:44:52.157871   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apiVersion: v1
kind: Service
metadata:
  creationTimestamp: null
  labels:
    app: redis
... skipping 33 lines ...
  type: ClusterIP
status:
  loadBalancer: {}
service/redis-master selector updated
core.sh:890: Successful get services redis-master {{range.spec.selector}}{{.}}:{{end}}: padawan:
service/redis-master selector updated
E0113 19:44:52.768187   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:894: Successful get services redis-master {{range.spec.selector}}{{.}}:{{end}}: redis:master:backend:
E0113 19:44:52.891997   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apiVersion: v1
kind: Service
metadata:
  creationTimestamp: "2020-01-13T19:44:50Z"
  labels:
    app: redis
... skipping 13 lines ...
  selector:
    role: padawan
  sessionAffinity: None
  type: ClusterIP
status:
  loadBalancer: {}
error: you must specify resources by --filename when --local is set.
Example resource specifications include:
   '-f rsrc.yaml'
   '--filename=rsrc.json'
E0113 19:44:53.018525   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:898: Successful get services redis-master {{range.spec.selector}}{{.}}:{{end}}: redis:master:backend:
E0113 19:44:53.159215   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/redis-master selector updated
Successful
message:Error from server (Conflict): Operation cannot be fulfilled on services "redis-master": the object has been modified; please apply your changes to the latest version and try again
has:Conflict
core.sh:911: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:
E0113 19:44:53.769609   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service "redis-master" deleted
E0113 19:44:53.893263   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:918: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
E0113 19:44:54.020043   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:922: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
E0113 19:44:54.160466   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/redis-master created
core.sh:926: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:
(Bcore.sh:930: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:
E0113 19:44:54.770752   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/service-v1-test created
E0113 19:44:54.894268   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:951: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:service-v1-test:
E0113 19:44:55.021566   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/service-v1-test replaced
E0113 19:44:55.161846   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:958: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:service-v1-test:
(Bservice "redis-master" deleted
service "service-v1-test" deleted
core.sh:966: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
core.sh:970: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
E0113 19:44:55.772081   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:55.895698   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/redis-master created
E0113 19:44:56.023208   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:56.163221   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/redis-slave created
core.sh:975: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:redis-slave:
Successful
message:NAME           RSRC
kubernetes     144
redis-master   1611
redis-slave    1614
has:redis-master
core.sh:985: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:redis-slave:
(Bservice "redis-master" deleted
service "redis-slave" deleted
E0113 19:44:56.773238   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:992: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
E0113 19:44:56.896913   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:996: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
service/beep-boop created
E0113 19:44:57.024409   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1000: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: beep-boop:kubernetes:
E0113 19:44:57.164826   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1004: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: beep-boop:kubernetes:
(Bservice "beep-boop" deleted
core.sh:1011: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
core.sh:1015: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
kubectl run --generator=deployment/apps.v1 is DEPRECATED and will be removed in a future version. Use kubectl run --generator=run-pod/v1 or kubectl create instead.
I0113 19:44:57.721795   54883 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"default", Name:"testmetadata", UID:"ab4b90d2-1a7d-4f54-8158-ae05f45f91e8", APIVersion:"apps/v1", ResourceVersion:"1628", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set testmetadata-bd968f46 to 2
I0113 19:44:57.729299   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"default", Name:"testmetadata-bd968f46", UID:"ef568c30-6970-49d0-81b5-a16a86a89e4e", APIVersion:"apps/v1", ResourceVersion:"1629", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: testmetadata-bd968f46-sx58c
I0113 19:44:57.730535   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"default", Name:"testmetadata-bd968f46", UID:"ef568c30-6970-49d0-81b5-a16a86a89e4e", APIVersion:"apps/v1", ResourceVersion:"1629", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: testmetadata-bd968f46-fgggv
service/testmetadata created
deployment.apps/testmetadata created
E0113 19:44:57.774514   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1019: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: testmetadata:
E0113 19:44:57.898433   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1020: Successful get service testmetadata {{.metadata.annotations}}: map[zone-context:home]
E0113 19:44:58.025791   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/exposemetadata exposed
E0113 19:44:58.166213   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1026: Successful get service exposemetadata {{.metadata.annotations}}: map[zone-context:work]
(Bservice "exposemetadata" deleted
service "testmetadata" deleted
deployment.apps "testmetadata" deleted
+++ exit code: 0
Recording: run_daemonset_tests
Running command: run_daemonset_tests

+++ Running case: test-cmd.run_daemonset_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_daemonset_tests
+++ [0113 19:44:58] Creating namespace namespace-1578944698-10697
namespace/namespace-1578944698-10697 created
E0113 19:44:58.776033   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Context "test" modified.
+++ [0113 19:44:58] Testing kubectl(v1:daemonsets)
E0113 19:44:58.899656   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:30: Successful get daemonsets {{range.items}}{{.metadata.name}}:{{end}}: 
E0113 19:44:59.027183   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:44:59.167499   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0113 19:44:59.173910   51369 controller.go:606] quota admission added evaluator for: daemonsets.apps
daemonset.apps/bind created
I0113 19:44:59.186504   51369 controller.go:606] quota admission added evaluator for: controllerrevisions.apps
apps.sh:34: Successful get daemonsets bind {{.metadata.generation}}: 1
daemonset.apps/bind configured
apps.sh:37: Successful get daemonsets bind {{.metadata.generation}}: 1
E0113 19:44:59.777349   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
daemonset.apps/bind image updated
E0113 19:44:59.901019   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:40: Successful get daemonsets bind {{.metadata.generation}}: 2
E0113 19:45:00.028482   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
daemonset.apps/bind env updated
E0113 19:45:00.168893   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:42: Successful get daemonsets bind {{.metadata.generation}}: 3
daemonset.apps/bind resource requirements updated
apps.sh:44: Successful get daemonsets bind {{.metadata.generation}}: 4
daemonset.apps/bind restarted
apps.sh:48: Successful get daemonsets bind {{.metadata.generation}}: 5
E0113 19:45:00.778599   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
daemonset.apps "bind" deleted
+++ exit code: 0
E0113 19:45:00.902449   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Recording: run_daemonset_history_tests
Running command: run_daemonset_history_tests

+++ Running case: test-cmd.run_daemonset_history_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_daemonset_history_tests
+++ [0113 19:45:01] Creating namespace namespace-1578944701-5242
E0113 19:45:01.029590   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/namespace-1578944701-5242 created
E0113 19:45:01.170152   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Context "test" modified.
+++ [0113 19:45:01] Testing kubectl(v1:daemonsets, v1:controllerrevisions)
apps.sh:66: Successful get daemonsets {{range.items}}{{.metadata.name}}:{{end}}: 
daemonset.apps/bind created
apps.sh:70: Successful get controllerrevisions {{range.items}}{{.metadata.annotations}}:{{end}}: map[deprecated.daemonset.template.generation:1 kubectl.kubernetes.io/last-applied-configuration:{"apiVersion":"apps/v1","kind":"DaemonSet","metadata":{"annotations":{"kubernetes.io/change-cause":"kubectl apply --filename=hack/testdata/rollingupdate-daemonset.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true"},"labels":{"service":"bind"},"name":"bind","namespace":"namespace-1578944701-5242"},"spec":{"selector":{"matchLabels":{"service":"bind"}},"template":{"metadata":{"labels":{"service":"bind"}},"spec":{"affinity":{"podAntiAffinity":{"requiredDuringSchedulingIgnoredDuringExecution":[{"labelSelector":{"matchExpressions":[{"key":"service","operator":"In","values":["bind"]}]},"namespaces":[],"topologyKey":"kubernetes.io/hostname"}]}},"containers":[{"image":"k8s.gcr.io/pause:2.0","name":"kubernetes-pause"}]}},"updateStrategy":{"rollingUpdate":{"maxUnavailable":"10%"},"type":"RollingUpdate"}}}
 kubernetes.io/change-cause:kubectl apply --filename=hack/testdata/rollingupdate-daemonset.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true]:
E0113 19:45:01.780175   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
daemonset.apps/bind skipped rollback (current template already matches revision 1)
E0113 19:45:01.903974   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:73: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:2.0:
E0113 19:45:02.030981   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:74: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
E0113 19:45:02.171491   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
daemonset.apps/bind configured
apps.sh:77: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:latest:
apps.sh:78: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
apps.sh:79: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 2
E0113 19:45:02.781430   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:80: Successful get controllerrevisions {{range.items}}{{.metadata.annotations}}:{{end}}: map[deprecated.daemonset.template.generation:1 kubectl.kubernetes.io/last-applied-configuration:{"apiVersion":"apps/v1","kind":"DaemonSet","metadata":{"annotations":{"kubernetes.io/change-cause":"kubectl apply --filename=hack/testdata/rollingupdate-daemonset.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true"},"labels":{"service":"bind"},"name":"bind","namespace":"namespace-1578944701-5242"},"spec":{"selector":{"matchLabels":{"service":"bind"}},"template":{"metadata":{"labels":{"service":"bind"}},"spec":{"affinity":{"podAntiAffinity":{"requiredDuringSchedulingIgnoredDuringExecution":[{"labelSelector":{"matchExpressions":[{"key":"service","operator":"In","values":["bind"]}]},"namespaces":[],"topologyKey":"kubernetes.io/hostname"}]}},"containers":[{"image":"k8s.gcr.io/pause:2.0","name":"kubernetes-pause"}]}},"updateStrategy":{"rollingUpdate":{"maxUnavailable":"10%"},"type":"RollingUpdate"}}}
 kubernetes.io/change-cause:kubectl apply --filename=hack/testdata/rollingupdate-daemonset.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true]:map[deprecated.daemonset.template.generation:2 kubectl.kubernetes.io/last-applied-configuration:{"apiVersion":"apps/v1","kind":"DaemonSet","metadata":{"annotations":{"kubernetes.io/change-cause":"kubectl apply --filename=hack/testdata/rollingupdate-daemonset-rv2.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true"},"labels":{"service":"bind"},"name":"bind","namespace":"namespace-1578944701-5242"},"spec":{"selector":{"matchLabels":{"service":"bind"}},"template":{"metadata":{"labels":{"service":"bind"}},"spec":{"affinity":{"podAntiAffinity":{"requiredDuringSchedulingIgnoredDuringExecution":[{"labelSelector":{"matchExpressions":[{"key":"service","operator":"In","values":["bind"]}]},"namespaces":[],"topologyKey":"kubernetes.io/hostname"}]}},"containers":[{"image":"k8s.gcr.io/pause:latest","name":"kubernetes-pause"},{"image":"k8s.gcr.io/nginx:test-cmd","name":"app"}]}},"updateStrategy":{"rollingUpdate":{"maxUnavailable":"10%"},"type":"RollingUpdate"}}}
 kubernetes.io/change-cause:kubectl apply --filename=hack/testdata/rollingupdate-daemonset-rv2.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true]:
E0113 19:45:02.905372   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
daemonset.apps/bind will roll back to Pod Template:
  Labels:	service=bind
  Containers:
   kubernetes-pause:
    Image:	k8s.gcr.io/pause:2.0
    Port:	<none>
    Host Port:	<none>
    Environment:	<none>
    Mounts:	<none>
  Volumes:	<none>
 (dry run)
E0113 19:45:03.032398   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:83: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:latest:
E0113 19:45:03.172674   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:84: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
apps.sh:85: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 2
daemonset.apps/bind rolled back
E0113 19:45:03.531431   54883 daemon_controller.go:291] namespace-1578944701-5242/bind failed with : error storing status for daemon set &v1.DaemonSet{TypeMeta:v1.TypeMeta{Kind:"", APIVersion:""}, ObjectMeta:v1.ObjectMeta{Name:"bind", GenerateName:"", Namespace:"namespace-1578944701-5242", SelfLink:"/apis/apps/v1/namespaces/namespace-1578944701-5242/daemonsets/bind", UID:"92a3577d-b053-4fad-8b5d-0865a9490380", ResourceVersion:"1697", Generation:3, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63714541501, loc:(*time.Location)(0x6b12b60)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"service":"bind"}, Annotations:map[string]string{"deprecated.daemonset.template.generation":"3", "kubectl.kubernetes.io/last-applied-configuration":"{\"apiVersion\":\"apps/v1\",\"kind\":\"DaemonSet\",\"metadata\":{\"annotations\":{\"kubernetes.io/change-cause\":\"kubectl apply --filename=hack/testdata/rollingupdate-daemonset-rv2.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true\"},\"labels\":{\"service\":\"bind\"},\"name\":\"bind\",\"namespace\":\"namespace-1578944701-5242\"},\"spec\":{\"selector\":{\"matchLabels\":{\"service\":\"bind\"}},\"template\":{\"metadata\":{\"labels\":{\"service\":\"bind\"}},\"spec\":{\"affinity\":{\"podAntiAffinity\":{\"requiredDuringSchedulingIgnoredDuringExecution\":[{\"labelSelector\":{\"matchExpressions\":[{\"key\":\"service\",\"operator\":\"In\",\"values\":[\"bind\"]}]},\"namespaces\":[],\"topologyKey\":\"kubernetes.io/hostname\"}]}},\"containers\":[{\"image\":\"k8s.gcr.io/pause:latest\",\"name\":\"kubernetes-pause\"},{\"image\":\"k8s.gcr.io/nginx:test-cmd\",\"name\":\"app\"}]}},\"updateStrategy\":{\"rollingUpdate\":{\"maxUnavailable\":\"10%\"},\"type\":\"RollingUpdate\"}}}\n", "kubernetes.io/change-cause":"kubectl apply --filename=hack/testdata/rollingupdate-daemonset-rv2.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true"}, OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry{v1.ManagedFieldsEntry{Manager:"kube-controller-manager", Operation:"Update", APIVersion:"apps/v1", Time:(*v1.Time)(0xc00160c3e0), FieldsType:"FieldsV1", FieldsV1:(*v1.FieldsV1)(0xc00160c460)}, v1.ManagedFieldsEntry{Manager:"kubectl", Operation:"Update", APIVersion:"apps/v1", Time:(*v1.Time)(0xc00160c4a0), FieldsType:"FieldsV1", FieldsV1:(*v1.FieldsV1)(0xc00160c4e0)}}}, Spec:v1.DaemonSetSpec{Selector:(*v1.LabelSelector)(0xc00160c540), Template:v1.PodTemplateSpec{ObjectMeta:v1.ObjectMeta{Name:"", GenerateName:"", Namespace:"", SelfLink:"", UID:"", ResourceVersion:"", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:0, loc:(*time.Location)(nil)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"service":"bind"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v1.PodSpec{Volumes:[]v1.Volume(nil), InitContainers:[]v1.Container(nil), Containers:[]v1.Container{v1.Container{Name:"kubernetes-pause", Image:"k8s.gcr.io/pause:2.0", Command:[]string(nil), Args:[]string(nil), WorkingDir:"", Ports:[]v1.ContainerPort(nil), EnvFrom:[]v1.EnvFromSource(nil), Env:[]v1.EnvVar(nil), Resources:v1.ResourceRequirements{Limits:v1.ResourceList(nil), Requests:v1.ResourceList(nil)}, VolumeMounts:[]v1.VolumeMount(nil), 
VolumeDevices:[]v1.VolumeDevice(nil), LivenessProbe:(*v1.Probe)(nil), ReadinessProbe:(*v1.Probe)(nil), StartupProbe:(*v1.Probe)(nil), Lifecycle:(*v1.Lifecycle)(nil), TerminationMessagePath:"/dev/termination-log", TerminationMessagePolicy:"File", ImagePullPolicy:"IfNotPresent", SecurityContext:(*v1.SecurityContext)(nil), Stdin:false, StdinOnce:false, TTY:false}}, EphemeralContainers:[]v1.EphemeralContainer(nil), RestartPolicy:"Always", TerminationGracePeriodSeconds:(*int64)(0xc002eb30a8), ActiveDeadlineSeconds:(*int64)(nil), DNSPolicy:"ClusterFirst", NodeSelector:map[string]string(nil), ServiceAccountName:"", DeprecatedServiceAccount:"", AutomountServiceAccountToken:(*bool)(nil), NodeName:"", HostNetwork:false, HostPID:false, HostIPC:false, ShareProcessNamespace:(*bool)(nil), SecurityContext:(*v1.PodSecurityContext)(0xc002cd8000), ImagePullSecrets:[]v1.LocalObjectReference(nil), Hostname:"", Subdomain:"", Affinity:(*v1.Affinity)(0xc00160c560), SchedulerName:"default-scheduler", Tolerations:[]v1.Toleration(nil), HostAliases:[]v1.HostAlias(nil), PriorityClassName:"", Priority:(*int32)(nil), DNSConfig:(*v1.PodDNSConfig)(nil), ReadinessGates:[]v1.PodReadinessGate(nil), RuntimeClassName:(*string)(nil), EnableServiceLinks:(*bool)(nil), PreemptionPolicy:(*v1.PreemptionPolicy)(nil), Overhead:v1.ResourceList(nil), TopologySpreadConstraints:[]v1.TopologySpreadConstraint(nil)}}, UpdateStrategy:v1.DaemonSetUpdateStrategy{Type:"RollingUpdate", RollingUpdate:(*v1.RollingUpdateDaemonSet)(0xc00020e658)}, MinReadySeconds:0, RevisionHistoryLimit:(*int32)(0xc002eb30fc)}, Status:v1.DaemonSetStatus{CurrentNumberScheduled:0, NumberMisscheduled:0, DesiredNumberScheduled:0, NumberReady:0, ObservedGeneration:2, UpdatedNumberScheduled:0, NumberAvailable:0, NumberUnavailable:0, CollisionCount:(*int32)(nil), Conditions:[]v1.DaemonSetCondition(nil)}}: Operation cannot be fulfilled on daemonsets.apps "bind": the object has been modified; please apply your changes to the latest version and try again
apps.sh:88: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:2.0:
E0113 19:45:03.782715   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:89: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
E0113 19:45:03.906654   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:error: unable to find specified revision 1000000 in history
has:unable to find specified revision
E0113 19:45:04.033672   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:93: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:2.0:
apps.sh:94: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
E0113 19:45:04.174479   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
daemonset.apps/bind rolled back
E0113 19:45:04.306210   54883 daemon_controller.go:291] namespace-1578944701-5242/bind failed with : error storing status for daemon set &v1.DaemonSet{TypeMeta:v1.TypeMeta{Kind:"", APIVersion:""}, ObjectMeta:v1.ObjectMeta{Name:"bind", GenerateName:"", Namespace:"namespace-1578944701-5242", SelfLink:"/apis/apps/v1/namespaces/namespace-1578944701-5242/daemonsets/bind", UID:"92a3577d-b053-4fad-8b5d-0865a9490380", ResourceVersion:"1700", Generation:4, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63714541501, loc:(*time.Location)(0x6b12b60)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"service":"bind"}, Annotations:map[string]string{"deprecated.daemonset.template.generation":"4", "kubectl.kubernetes.io/last-applied-configuration":"{\"apiVersion\":\"apps/v1\",\"kind\":\"DaemonSet\",\"metadata\":{\"annotations\":{\"kubernetes.io/change-cause\":\"kubectl apply --filename=hack/testdata/rollingupdate-daemonset-rv2.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true\"},\"labels\":{\"service\":\"bind\"},\"name\":\"bind\",\"namespace\":\"namespace-1578944701-5242\"},\"spec\":{\"selector\":{\"matchLabels\":{\"service\":\"bind\"}},\"template\":{\"metadata\":{\"labels\":{\"service\":\"bind\"}},\"spec\":{\"affinity\":{\"podAntiAffinity\":{\"requiredDuringSchedulingIgnoredDuringExecution\":[{\"labelSelector\":{\"matchExpressions\":[{\"key\":\"service\",\"operator\":\"In\",\"values\":[\"bind\"]}]},\"namespaces\":[],\"topologyKey\":\"kubernetes.io/hostname\"}]}},\"containers\":[{\"image\":\"k8s.gcr.io/pause:latest\",\"name\":\"kubernetes-pause\"},{\"image\":\"k8s.gcr.io/nginx:test-cmd\",\"name\":\"app\"}]}},\"updateStrategy\":{\"rollingUpdate\":{\"maxUnavailable\":\"10%\"},\"type\":\"RollingUpdate\"}}}\n", "kubernetes.io/change-cause":"kubectl apply --filename=hack/testdata/rollingupdate-daemonset-rv2.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true"}, OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry{v1.ManagedFieldsEntry{Manager:"kube-controller-manager", Operation:"Update", APIVersion:"apps/v1", Time:(*v1.Time)(0xc000bc7ec0), FieldsType:"FieldsV1", FieldsV1:(*v1.FieldsV1)(0xc000bc7ee0)}, v1.ManagedFieldsEntry{Manager:"kubectl", Operation:"Update", APIVersion:"apps/v1", Time:(*v1.Time)(0xc000bc7f00), FieldsType:"FieldsV1", FieldsV1:(*v1.FieldsV1)(0xc000bc7f20)}}}, Spec:v1.DaemonSetSpec{Selector:(*v1.LabelSelector)(0xc000bc7f40), Template:v1.PodTemplateSpec{ObjectMeta:v1.ObjectMeta{Name:"", GenerateName:"", Namespace:"", SelfLink:"", UID:"", ResourceVersion:"", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:0, loc:(*time.Location)(nil)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"service":"bind"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v1.PodSpec{Volumes:[]v1.Volume(nil), InitContainers:[]v1.Container(nil), Containers:[]v1.Container{v1.Container{Name:"kubernetes-pause", Image:"k8s.gcr.io/pause:latest", Command:[]string(nil), Args:[]string(nil), WorkingDir:"", Ports:[]v1.ContainerPort(nil), EnvFrom:[]v1.EnvFromSource(nil), Env:[]v1.EnvVar(nil), Resources:v1.ResourceRequirements{Limits:v1.ResourceList(nil), Requests:v1.ResourceList(nil)}, VolumeMounts:[]v1.VolumeMount(nil), 
VolumeDevices:[]v1.VolumeDevice(nil), LivenessProbe:(*v1.Probe)(nil), ReadinessProbe:(*v1.Probe)(nil), StartupProbe:(*v1.Probe)(nil), Lifecycle:(*v1.Lifecycle)(nil), TerminationMessagePath:"/dev/termination-log", TerminationMessagePolicy:"File", ImagePullPolicy:"IfNotPresent", SecurityContext:(*v1.SecurityContext)(nil), Stdin:false, StdinOnce:false, TTY:false}, v1.Container{Name:"app", Image:"k8s.gcr.io/nginx:test-cmd", Command:[]string(nil), Args:[]string(nil), WorkingDir:"", Ports:[]v1.ContainerPort(nil), EnvFrom:[]v1.EnvFromSource(nil), Env:[]v1.EnvVar(nil), Resources:v1.ResourceRequirements{Limits:v1.ResourceList(nil), Requests:v1.ResourceList(nil)}, VolumeMounts:[]v1.VolumeMount(nil), VolumeDevices:[]v1.VolumeDevice(nil), LivenessProbe:(*v1.Probe)(nil), ReadinessProbe:(*v1.Probe)(nil), StartupProbe:(*v1.Probe)(nil), Lifecycle:(*v1.Lifecycle)(nil), TerminationMessagePath:"/dev/termination-log", TerminationMessagePolicy:"File", ImagePullPolicy:"IfNotPresent", SecurityContext:(*v1.SecurityContext)(nil), Stdin:false, StdinOnce:false, TTY:false}}, EphemeralContainers:[]v1.EphemeralContainer(nil), RestartPolicy:"Always", TerminationGracePeriodSeconds:(*int64)(0xc002f996c8), ActiveDeadlineSeconds:(*int64)(nil), DNSPolicy:"ClusterFirst", NodeSelector:map[string]string(nil), ServiceAccountName:"", DeprecatedServiceAccount:"", AutomountServiceAccountToken:(*bool)(nil), NodeName:"", HostNetwork:false, HostPID:false, HostIPC:false, ShareProcessNamespace:(*bool)(nil), SecurityContext:(*v1.PodSecurityContext)(0xc002600ae0), ImagePullSecrets:[]v1.LocalObjectReference(nil), Hostname:"", Subdomain:"", Affinity:(*v1.Affinity)(0xc000bc7f60), SchedulerName:"default-scheduler", Tolerations:[]v1.Toleration(nil), HostAliases:[]v1.HostAlias(nil), PriorityClassName:"", Priority:(*int32)(nil), DNSConfig:(*v1.PodDNSConfig)(nil), ReadinessGates:[]v1.PodReadinessGate(nil), RuntimeClassName:(*string)(nil), EnableServiceLinks:(*bool)(nil), PreemptionPolicy:(*v1.PreemptionPolicy)(nil), Overhead:v1.ResourceList(nil), TopologySpreadConstraints:[]v1.TopologySpreadConstraint(nil)}}, UpdateStrategy:v1.DaemonSetUpdateStrategy{Type:"RollingUpdate", RollingUpdate:(*v1.RollingUpdateDaemonSet)(0xc0018fc1f0)}, MinReadySeconds:0, RevisionHistoryLimit:(*int32)(0xc002f9971c)}, Status:v1.DaemonSetStatus{CurrentNumberScheduled:0, NumberMisscheduled:0, DesiredNumberScheduled:0, NumberReady:0, ObservedGeneration:3, UpdatedNumberScheduled:0, NumberAvailable:0, NumberUnavailable:0, CollisionCount:(*int32)(nil), Conditions:[]v1.DaemonSetCondition(nil)}}: Operation cannot be fulfilled on daemonsets.apps "bind": the object has been modified; please apply your changes to the latest version and try again
apps.sh:97: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:latest:
apps.sh:98: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
apps.sh:99: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 2
daemonset.apps "bind" deleted
E0113 19:45:04.784605   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ exit code: 0
Recording: run_rc_tests
Running command: run_rc_tests

+++ Running case: test-cmd.run_rc_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_rc_tests
+++ [0113 19:45:04] Creating namespace namespace-1578944704-29140
E0113 19:45:04.908101   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/namespace-1578944704-29140 created
E0113 19:45:05.035220   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Context "test" modified.
+++ [0113 19:45:05] Testing kubectl(v1:replicationcontrollers)
E0113 19:45:05.175743   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1052: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
replicationcontroller/frontend created
I0113 19:45:05.447362   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578944704-29140", Name:"frontend", UID:"6aae8343-a11c-4a53-9626-954f3c95f259", APIVersion:"v1", ResourceVersion:"1710", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-g8qcr
I0113 19:45:05.451167   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578944704-29140", Name:"frontend", UID:"6aae8343-a11c-4a53-9626-954f3c95f259", APIVersion:"v1", ResourceVersion:"1710", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-9v9qz
I0113 19:45:05.453491   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578944704-29140", Name:"frontend", UID:"6aae8343-a11c-4a53-9626-954f3c95f259", APIVersion:"v1", ResourceVersion:"1710", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-zwmsr
replicationcontroller "frontend" deleted
core.sh:1057: Successful get pods -l "name=frontend" {{range.items}}{{.metadata.name}}:{{end}}: 
E0113 19:45:05.785810   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1061: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
E0113 19:45:05.909318   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/frontend created
E0113 19:45:06.036278   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0113 19:45:06.038059   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578944704-29140", Name:"frontend", UID:"38048af8-a1f2-43ff-9c76-0bd6047d83f3", APIVersion:"v1", ResourceVersion:"1727", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-nvrrr
I0113 19:45:06.041574   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578944704-29140", Name:"frontend", UID:"38048af8-a1f2-43ff-9c76-0bd6047d83f3", APIVersion:"v1", ResourceVersion:"1727", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-g5hfn
I0113 19:45:06.045862   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578944704-29140", Name:"frontend", UID:"38048af8-a1f2-43ff-9c76-0bd6047d83f3", APIVersion:"v1", ResourceVersion:"1727", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-szlnh
core.sh:1065: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: frontend:
E0113 19:45:06.176929   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
matched Name:
matched Pod Template:
matched Labels:
matched Selector:
matched Replicas:
matched Pods Status:
... skipping 4 lines ...
Namespace:    namespace-1578944704-29140
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 17 lines ...
Namespace:    namespace-1578944704-29140
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 18 lines ...
Namespace:    namespace-1578944704-29140
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 12 lines ...
Namespace:    namespace-1578944704-29140
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 10 lines ...
  Type    Reason            Age   From                    Message
  ----    ------            ----  ----                    -------
  Normal  SuccessfulCreate  0s    replication-controller  Created pod: frontend-nvrrr
  Normal  SuccessfulCreate  0s    replication-controller  Created pod: frontend-g5hfn
  Normal  SuccessfulCreate  0s    replication-controller  Created pod: frontend-szlnh
E0113 19:45:06.787219   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
matched Name:
matched Name:
matched Pod Template:
matched Labels:
matched Selector:
matched Replicas:
... skipping 5 lines ...
Namespace:    namespace-1578944704-29140
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 9 lines ...
Events:
  Type    Reason            Age   From                    Message
  ----    ------            ----  ----                    -------
  Normal  SuccessfulCreate  0s    replication-controller  Created pod: frontend-nvrrr
  Normal  SuccessfulCreate  0s    replication-controller  Created pod: frontend-g5hfn
  Normal  SuccessfulCreate  0s    replication-controller  Created pod: frontend-szlnh
E0113 19:45:06.910582   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful describe
Name:         frontend
Namespace:    namespace-1578944704-29140
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 9 lines ...
Events:
  Type    Reason            Age   From                    Message
  ----    ------            ----  ----                    -------
  Normal  SuccessfulCreate  0s    replication-controller  Created pod: frontend-nvrrr
  Normal  SuccessfulCreate  0s    replication-controller  Created pod: frontend-g5hfn
  Normal  SuccessfulCreate  0s    replication-controller  Created pod: frontend-szlnh
E0113 19:45:07.037444   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful describe
Name:         frontend
Namespace:    namespace-1578944704-29140
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 3 lines ...
      cpu:     100m
      memory:  100Mi
    Environment:
      GET_HOSTS_FROM:  dns
    Mounts:            <none>
  Volumes:             <none>
E0113 19:45:07.178113   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful describe
Name:         frontend
Namespace:    namespace-1578944704-29140
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 15 lines ...
core.sh:1085: Successful get rc frontend {{.spec.replicas}}: 3
replicationcontroller/frontend scaled
E0113 19:45:07.490262   54883 replica_set.go:199] ReplicaSet has no controller: &ReplicaSet{ObjectMeta:{frontend  namespace-1578944704-29140 /api/v1/namespaces/namespace-1578944704-29140/replicationcontrollers/frontend 38048af8-a1f2-43ff-9c76-0bd6047d83f3 1738 2 2020-01-13 19:45:06 +0000 UTC <nil> <nil> map[app:guestbook tier:frontend] map[] [] []  [{kubectl Update v1 2020-01-13 19:45:06 +0000 UTC FieldsV1 FieldsV1{Raw:*[123 34 102 58 109 101 116 97 100 97 116 97 34 58 123 34 102 58 108 97 98 101 108 115 34 58 123 34 102 58 97 112 112 34 58 123 125 44 34 102 58 116 105 101 114 34 58 123 125 44 34 46 34 58 123 125 125 125 44 34 102 58 115 112 101 99 34 58 123 34 102 58 114 101 112 108 105 99 97 115 34 58 123 125 44 34 102 58 115 101 108 101 99 116 111 114 34 58 123 34 102 58 97 112 112 34 58 123 125 44 34 102 58 116 105 101 114 34 58 123 125 44 34 46 34 58 123 125 125 44 34 102 58 116 101 109 112 108 97 116 101 34 58 123 34 102 58 109 101 116 97 100 97 116 97 34 58 123 34 102 58 99 114 101 97 116 105 111 110 84 105 109 101 115 116 97 109 112 34 58 123 125 44 34 102 58 108 97 98 101 108 115 34 58 123 34 102 58 97 112 112 34 58 123 125 44 34 102 58 116 105 101 114 34 58 123 125 44 34 46 34 58 123 125 125 44 34 46 34 58 123 125 125 44 34 102 58 115 112 101 99 34 58 123 34 102 58 99 111 110 116 97 105 110 101 114 115 34 58 123 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 112 104 112 45 114 101 100 105 115 92 34 125 34 58 123 34 102 58 101 110 118 34 58 123 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 71 69 84 95 72 79 83 84 83 95 70 82 79 77 92 34 125 34 58 123 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 34 58 123 125 44 34 46 34 58 123 125 125 44 34 46 34 58 123 125 125 44 34 102 58 105 109 97 103 101 34 58 123 125 44 34 102 58 105 109 97 103 101 80 117 108 108 80 111 108 105 99 121 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 112 111 114 116 115 34 58 123 34 107 58 123 92 34 99 111 110 116 97 105 110 101 114 80 111 114 116 92 34 58 56 48 44 92 34 112 114 111 116 111 99 111 108 92 34 58 92 34 84 67 80 92 34 125 34 58 123 34 102 58 99 111 110 116 97 105 110 101 114 80 111 114 116 34 58 123 125 44 34 102 58 112 114 111 116 111 99 111 108 34 58 123 125 44 34 46 34 58 123 125 125 44 34 46 34 58 123 125 125 44 34 102 58 114 101 115 111 117 114 99 101 115 34 58 123 34 102 58 114 101 113 117 101 115 116 115 34 58 123 34 102 58 99 112 117 34 58 123 125 44 34 102 58 109 101 109 111 114 121 34 58 123 125 44 34 46 34 58 123 125 125 44 34 46 34 58 123 125 125 44 34 102 58 116 101 114 109 105 110 97 116 105 111 110 77 101 115 115 97 103 101 80 97 116 104 34 58 123 125 44 34 102 58 116 101 114 109 105 110 97 116 105 111 110 77 101 115 115 97 103 101 80 111 108 105 99 121 34 58 123 125 44 34 46 34 58 123 125 125 44 34 46 34 58 123 125 125 44 34 102 58 100 110 115 80 111 108 105 99 121 34 58 123 125 44 34 102 58 114 101 115 116 97 114 116 80 111 108 105 99 121 34 58 123 125 44 34 102 58 115 99 104 101 100 117 108 101 114 78 97 109 101 34 58 123 125 44 34 102 58 115 101 99 117 114 105 116 121 67 111 110 116 101 120 116 34 58 123 125 44 34 102 58 116 101 114 109 105 110 97 116 105 111 110 71 114 97 99 101 80 101 114 105 111 100 83 101 99 111 110 100 115 34 58 123 125 44 34 46 34 58 123 125 125 44 34 46 34 58 123 125 125 125 125],}} {kube-controller-manager Update v1 2020-01-13 19:45:06 +0000 UTC FieldsV1 &FieldsV1{Raw:*[123 34 102 58 115 116 97 116 117 115 34 58 123 34 102 58 102 117 108 108 121 76 97 98 101 108 101 100 82 101 112 108 105 99 97 115 34 
58 123 125 44 34 102 58 111 98 115 101 114 118 101 100 71 101 110 101 114 97 116 105 111 110 34 58 123 125 44 34 102 58 114 101 112 108 105 99 97 115 34 58 123 125 125 125],}}]},Spec:ReplicaSetSpec{Replicas:*2,Selector:&v1.LabelSelector{MatchLabels:map[string]string{app: guestbook,tier: frontend,},MatchExpressions:[]LabelSelectorRequirement{},},Template:{{      0 0001-01-01 00:00:00 +0000 UTC <nil> <nil> map[app:guestbook tier:frontend] map[] [] []  []} {[] [] [{php-redis gcr.io/google_samples/gb-frontend:v4 [] []  [{ 0 80 TCP }] [] [{GET_HOSTS_FROM dns nil}] {map[] map[cpu:{{100 -3} {<nil>} 100m DecimalSI} memory:{{104857600 0} {<nil>} 100Mi BinarySI}]} [] [] nil nil nil nil /dev/termination-log File IfNotPresent nil false false false}] [] Always 0xc002b878c8 <nil> ClusterFirst map[]   <nil>  false false false <nil> PodSecurityContext{SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,SupplementalGroups:[],FSGroup:nil,RunAsGroup:nil,Sysctls:[]Sysctl{},WindowsOptions:nil,} []   nil default-scheduler [] []  <nil> nil [] <nil> <nil> <nil> map[] []}},MinReadySeconds:0,},Status:ReplicaSetStatus{Replicas:3,FullyLabeledReplicas:3,ObservedGeneration:1,ReadyReplicas:0,AvailableReplicas:0,Conditions:[]ReplicaSetCondition{},},}
I0113 19:45:07.495971   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578944704-29140", Name:"frontend", UID:"38048af8-a1f2-43ff-9c76-0bd6047d83f3", APIVersion:"v1", ResourceVersion:"1738", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: frontend-g5hfn
core.sh:1089: Successful get rc frontend {{.spec.replicas}}: 2
core.sh:1093: Successful get rc frontend {{.spec.replicas}}: 2
E0113 19:45:07.788566   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
error: Expected replicas to be 3, was 2
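This "Expected replicas to be 3, was 2" failure is what kubectl scale reports when a replica-count precondition does not match; a sketch of the kind of command involved (resource name taken from the surrounding test, exact flag values assumed):
  kubectl scale rc frontend --current-replicas=3 --replicas=3    # rejected: the controller currently has 2 replicas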
E0113 19:45:07.911987   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1097: Successful get rc frontend {{.spec.replicas}}: 2
E0113 19:45:08.038690   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1101: Successful get rc frontend {{.spec.replicas}}: 2
E0113 19:45:08.179552   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/frontend scaled
I0113 19:45:08.194058   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578944704-29140", Name:"frontend", UID:"38048af8-a1f2-43ff-9c76-0bd6047d83f3", APIVersion:"v1", ResourceVersion:"1744", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-mcvqd
core.sh:1105: Successful get rc frontend {{.spec.replicas}}: 3
core.sh:1109: Successful get rc frontend {{.spec.replicas}}: 3
(BE0113 19:45:08.551945   54883 replica_set.go:199] ReplicaSet has no controller: &ReplicaSet{ObjectMeta:{frontend  namespace-1578944704-29140 /api/v1/namespaces/namespace-1578944704-29140/replicationcontrollers/frontend 38048af8-a1f2-43ff-9c76-0bd6047d83f3 1749 4 2020-01-13 19:45:06 +0000 UTC <nil> <nil> map[app:guestbook tier:frontend] map[] [] []  [{kubectl Update v1 2020-01-13 19:45:06 +0000 UTC FieldsV1 FieldsV1{Raw:*[123 34 102 58 109 101 116 97 100 97 116 97 34 58 123 34 102 58 108 97 98 101 108 115 34 58 123 34 102 58 97 112 112 34 58 123 125 44 34 102 58 116 105 101 114 34 58 123 125 44 34 46 34 58 123 125 125 125 44 34 102 58 115 112 101 99 34 58 123 34 102 58 114 101 112 108 105 99 97 115 34 58 123 125 44 34 102 58 115 101 108 101 99 116 111 114 34 58 123 34 102 58 97 112 112 34 58 123 125 44 34 102 58 116 105 101 114 34 58 123 125 44 34 46 34 58 123 125 125 44 34 102 58 116 101 109 112 108 97 116 101 34 58 123 34 102 58 109 101 116 97 100 97 116 97 34 58 123 34 102 58 99 114 101 97 116 105 111 110 84 105 109 101 115 116 97 109 112 34 58 123 125 44 34 102 58 108 97 98 101 108 115 34 58 123 34 102 58 97 112 112 34 58 123 125 44 34 102 58 116 105 101 114 34 58 123 125 44 34 46 34 58 123 125 125 44 34 46 34 58 123 125 125 44 34 102 58 115 112 101 99 34 58 123 34 102 58 99 111 110 116 97 105 110 101 114 115 34 58 123 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 112 104 112 45 114 101 100 105 115 92 34 125 34 58 123 34 102 58 101 110 118 34 58 123 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 71 69 84 95 72 79 83 84 83 95 70 82 79 77 92 34 125 34 58 123 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 34 58 123 125 44 34 46 34 58 123 125 125 44 34 46 34 58 123 125 125 44 34 102 58 105 109 97 103 101 34 58 123 125 44 34 102 58 105 109 97 103 101 80 117 108 108 80 111 108 105 99 121 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 112 111 114 116 115 34 58 123 34 107 58 123 92 34 99 111 110 116 97 105 110 101 114 80 111 114 116 92 34 58 56 48 44 92 34 112 114 111 116 111 99 111 108 92 34 58 92 34 84 67 80 92 34 125 34 58 123 34 102 58 99 111 110 116 97 105 110 101 114 80 111 114 116 34 58 123 125 44 34 102 58 112 114 111 116 111 99 111 108 34 58 123 125 44 34 46 34 58 123 125 125 44 34 46 34 58 123 125 125 44 34 102 58 114 101 115 111 117 114 99 101 115 34 58 123 34 102 58 114 101 113 117 101 115 116 115 34 58 123 34 102 58 99 112 117 34 58 123 125 44 34 102 58 109 101 109 111 114 121 34 58 123 125 44 34 46 34 58 123 125 125 44 34 46 34 58 123 125 125 44 34 102 58 116 101 114 109 105 110 97 116 105 111 110 77 101 115 115 97 103 101 80 97 116 104 34 58 123 125 44 34 102 58 116 101 114 109 105 110 97 116 105 111 110 77 101 115 115 97 103 101 80 111 108 105 99 121 34 58 123 125 44 34 46 34 58 123 125 125 44 34 46 34 58 123 125 125 44 34 102 58 100 110 115 80 111 108 105 99 121 34 58 123 125 44 34 102 58 114 101 115 116 97 114 116 80 111 108 105 99 121 34 58 123 125 44 34 102 58 115 99 104 101 100 117 108 101 114 78 97 109 101 34 58 123 125 44 34 102 58 115 101 99 117 114 105 116 121 67 111 110 116 101 120 116 34 58 123 125 44 34 102 58 116 101 114 109 105 110 97 116 105 111 110 71 114 97 99 101 80 101 114 105 111 100 83 101 99 111 110 100 115 34 58 123 125 44 34 46 34 58 123 125 125 44 34 46 34 58 123 125 125 125 125],}} {kube-controller-manager Update v1 2020-01-13 19:45:08 +0000 UTC FieldsV1 &FieldsV1{Raw:*[123 34 102 58 115 116 97 116 117 115 34 58 123 34 102 58 102 117 108 108 121 76 97 98 101 108 101 100 82 101 112 108 105 99 97 115 
34 58 123 125 44 34 102 58 111 98 115 101 114 118 101 100 71 101 110 101 114 97 116 105 111 110 34 58 123 125 44 34 102 58 114 101 112 108 105 99 97 115 34 58 123 125 125 125],}}]},Spec:ReplicaSetSpec{Replicas:*2,Selector:&v1.LabelSelector{MatchLabels:map[string]string{app: guestbook,tier: frontend,},MatchExpressions:[]LabelSelectorRequirement{},},Template:{{      0 0001-01-01 00:00:00 +0000 UTC <nil> <nil> map[app:guestbook tier:frontend] map[] [] []  []} {[] [] [{php-redis gcr.io/google_samples/gb-frontend:v4 [] []  [{ 0 80 TCP }] [] [{GET_HOSTS_FROM dns nil}] {map[] map[cpu:{{100 -3} {<nil>} 100m DecimalSI} memory:{{104857600 0} {<nil>} 100Mi BinarySI}]} [] [] nil nil nil nil /dev/termination-log File IfNotPresent nil false false false}] [] Always 0xc0029a8e78 <nil> ClusterFirst map[]   <nil>  false false false <nil> PodSecurityContext{SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,SupplementalGroups:[],FSGroup:nil,RunAsGroup:nil,Sysctls:[]Sysctl{},WindowsOptions:nil,} []   nil default-scheduler [] []  <nil> nil [] <nil> <nil> <nil> map[] []}},MinReadySeconds:0,},Status:ReplicaSetStatus{Replicas:3,FullyLabeledReplicas:3,ObservedGeneration:3,ReadyReplicas:0,AvailableReplicas:0,Conditions:[]ReplicaSetCondition{},},}
replicationcontroller/frontend scaled
I0113 19:45:08.557592   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578944704-29140", Name:"frontend", UID:"38048af8-a1f2-43ff-9c76-0bd6047d83f3", APIVersion:"v1", ResourceVersion:"1749", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: frontend-mcvqd
core.sh:1113: Successful get rc frontend {{.spec.replicas}}: 2
replicationcontroller "frontend" deleted
E0113 19:45:08.789699   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:45:08.913407   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/redis-master created
I0113 19:45:09.013798   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578944704-29140", Name:"redis-master", UID:"601ca89b-cbc4-4f51-ab21-c03afd6a1249", APIVersion:"v1", ResourceVersion:"1762", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-master-lxq4t
E0113 19:45:09.040032   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:45:09.180892   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/redis-slave created
I0113 19:45:09.265764   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578944704-29140", Name:"redis-slave", UID:"12d783f8-da9c-4588-86af-d926365edec2", APIVersion:"v1", ResourceVersion:"1767", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-slave-rz8pk
I0113 19:45:09.269180   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578944704-29140", Name:"redis-slave", UID:"12d783f8-da9c-4588-86af-d926365edec2", APIVersion:"v1", ResourceVersion:"1767", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-slave-6lkwp
replicationcontroller/redis-master scaled
I0113 19:45:09.400092   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578944704-29140", Name:"redis-master", UID:"601ca89b-cbc4-4f51-ab21-c03afd6a1249", APIVersion:"v1", ResourceVersion:"1774", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-master-mgjqp
replicationcontroller/redis-slave scaled
... skipping 2 lines ...
I0113 19:45:09.408000   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578944704-29140", Name:"redis-master", UID:"601ca89b-cbc4-4f51-ab21-c03afd6a1249", APIVersion:"v1", ResourceVersion:"1774", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-master-9sb6j
I0113 19:45:09.412419   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578944704-29140", Name:"redis-slave", UID:"12d783f8-da9c-4588-86af-d926365edec2", APIVersion:"v1", ResourceVersion:"1776", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-slave-vrz9q
core.sh:1123: Successful get rc redis-master {{.spec.replicas}}: 4
core.sh:1124: Successful get rc redis-slave {{.spec.replicas}}: 4
replicationcontroller "redis-master" deleted
replicationcontroller "redis-slave" deleted
E0113 19:45:09.790659   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:45:09.915679   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-deployment created
I0113 19:45:10.023332   54883 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578944704-29140", Name:"nginx-deployment", UID:"8317f618-a9e8-43c0-a87e-d147af7c858a", APIVersion:"apps/v1", ResourceVersion:"1808", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-6986c7bc94 to 3
I0113 19:45:10.028462   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944704-29140", Name:"nginx-deployment-6986c7bc94", UID:"23a7aae4-f9c2-4231-8167-9fa41b158500", APIVersion:"apps/v1", ResourceVersion:"1809", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-p8kps
I0113 19:45:10.035384   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944704-29140", Name:"nginx-deployment-6986c7bc94", UID:"23a7aae4-f9c2-4231-8167-9fa41b158500", APIVersion:"apps/v1", ResourceVersion:"1809", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-bb4ck
I0113 19:45:10.036311   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944704-29140", Name:"nginx-deployment-6986c7bc94", UID:"23a7aae4-f9c2-4231-8167-9fa41b158500", APIVersion:"apps/v1", ResourceVersion:"1809", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-p5q4g
E0113 19:45:10.041218   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-deployment scaled
I0113 19:45:10.160173   54883 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578944704-29140", Name:"nginx-deployment", UID:"8317f618-a9e8-43c0-a87e-d147af7c858a", APIVersion:"apps/v1", ResourceVersion:"1822", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-6986c7bc94 to 1
I0113 19:45:10.169871   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944704-29140", Name:"nginx-deployment-6986c7bc94", UID:"23a7aae4-f9c2-4231-8167-9fa41b158500", APIVersion:"apps/v1", ResourceVersion:"1823", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-6986c7bc94-p8kps
I0113 19:45:10.170615   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944704-29140", Name:"nginx-deployment-6986c7bc94", UID:"23a7aae4-f9c2-4231-8167-9fa41b158500", APIVersion:"apps/v1", ResourceVersion:"1823", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-6986c7bc94-bb4ck
E0113 19:45:10.188116   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1133: Successful get deployment nginx-deployment {{.spec.replicas}}: 1
deployment.apps "nginx-deployment" deleted
Successful
message:service/expose-test-deployment exposed
has:service/expose-test-deployment exposed
service "expose-test-deployment" deleted
Successful
message:error: couldn't retrieve selectors via --selector flag or introspection: invalid deployment: no selectors, therefore cannot be exposed
See 'kubectl expose -h' for help and examples
has:invalid deployment: no selectors
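kubectl expose derives the Service selector from the exposed object, so a deployment that reports no selectors fails as asserted above; a hedged sketch of the failing shape (the deployment name is an assumption):
  kubectl expose deployment expose-test-deployment --port=80    # fails: invalid deployment: no selectors, therefore cannot be exposed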
E0113 19:45:10.792167   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:45:10.917050   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-deployment created
I0113 19:45:10.988624   54883 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578944704-29140", Name:"nginx-deployment", UID:"6e460258-d4be-445f-ae72-55520a367bb5", APIVersion:"apps/v1", ResourceVersion:"1849", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-6986c7bc94 to 3
I0113 19:45:10.992725   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944704-29140", Name:"nginx-deployment-6986c7bc94", UID:"bab93f66-f745-49b7-84de-d1d73a0802b6", APIVersion:"apps/v1", ResourceVersion:"1850", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-bpczz
I0113 19:45:10.995769   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944704-29140", Name:"nginx-deployment-6986c7bc94", UID:"bab93f66-f745-49b7-84de-d1d73a0802b6", APIVersion:"apps/v1", ResourceVersion:"1850", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-82fr2
I0113 19:45:10.995871   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944704-29140", Name:"nginx-deployment-6986c7bc94", UID:"bab93f66-f745-49b7-84de-d1d73a0802b6", APIVersion:"apps/v1", ResourceVersion:"1850", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-hdw55
E0113 19:45:11.042390   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1152: Successful get deployment nginx-deployment {{.spec.replicas}}: 3
E0113 19:45:11.189302   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/nginx-deployment exposed
core.sh:1156: Successful get service nginx-deployment {{(index .spec.ports 0).port}}: 80
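The service checked here exposes the deployment's port 80; roughly the following command shape (exact flags assumed):
  kubectl expose deployment nginx-deployment --port=80    # creates service/nginx-deployment with port 80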
deployment.apps "nginx-deployment" deleted
service "nginx-deployment" deleted
replicationcontroller/frontend created
I0113 19:45:11.701046   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578944704-29140", Name:"frontend", UID:"8658ab8c-d2fa-45f6-beea-f11748c42acf", APIVersion:"v1", ResourceVersion:"1877", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-b2trp
I0113 19:45:11.704927   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578944704-29140", Name:"frontend", UID:"8658ab8c-d2fa-45f6-beea-f11748c42acf", APIVersion:"v1", ResourceVersion:"1877", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-kx4g5
I0113 19:45:11.705662   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578944704-29140", Name:"frontend", UID:"8658ab8c-d2fa-45f6-beea-f11748c42acf", APIVersion:"v1", ResourceVersion:"1877", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-bjqw5
E0113 19:45:11.793281   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1163: Successful get rc frontend {{.spec.replicas}}: 3
E0113 19:45:11.918233   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/frontend exposed
E0113 19:45:12.043679   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1167: Successful get service frontend {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: <no value> 80
service/frontend-2 exposed
E0113 19:45:12.190227   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1171: Successful get service frontend-2 {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: <no value> 443
pod/valid-pod created
service/frontend-3 exposed
E0113 19:45:12.794450   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1176: Successful get service frontend-3 {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: <no value> 444
E0113 19:45:12.919527   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/frontend-4 exposed
core.sh:1180: Successful get service frontend-4 {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: default 80
E0113 19:45:13.044762   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/frontend-5 exposed
E0113 19:45:13.191663   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1184: Successful get service frontend-5 {{(index .spec.ports 0).port}}: 80
(Bpod "valid-pod" deleted
service "frontend" deleted
service "frontend-2" deleted
service "frontend-3" deleted
service "frontend-4" deleted
service "frontend-5" deleted
Successful
message:error: cannot expose a Node
has:cannot expose
E0113 19:45:13.795908   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:The Service "invalid-large-service-name-that-has-more-than-sixty-three-characters" is invalid: metadata.name: Invalid value: "invalid-large-service-name-that-has-more-than-sixty-three-characters": must be no more than 63 characters
has:metadata.name: Invalid value
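Service names must be valid DNS labels of at most 63 characters, which is what this validation error enforces; a sketch of the kind of request that triggers it (the exposed object and flags are assumptions):
  kubectl expose rc frontend --port=80 --name=invalid-large-service-name-that-has-more-than-sixty-three-characters    # rejected: metadata.name longer than 63 characters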
E0113 19:45:13.920687   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:service/kubernetes-serve-hostname-testing-sixty-three-characters-in-len exposed
has:kubernetes-serve-hostname-testing-sixty-three-characters-in-len exposed
service "kubernetes-serve-hostname-testing-sixty-three-characters-in-len" deleted
E0113 19:45:14.045979   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:service/etcd-server exposed
has:etcd-server exposed
E0113 19:45:14.193610   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1214: Successful get service etcd-server {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: port-1 2380
core.sh:1215: Successful get service etcd-server {{(index .spec.ports 1).name}} {{(index .spec.ports 1).port}}: port-2 2379
(Bservice "etcd-server" deleted
core.sh:1221: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: frontend:
E0113 19:45:14.797273   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller "frontend" deleted
E0113 19:45:14.922139   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1225: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
E0113 19:45:15.047507   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1229: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
E0113 19:45:15.195054   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/frontend created
I0113 19:45:15.375772   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578944704-29140", Name:"frontend", UID:"695ffd6e-c8b4-4ccc-8142-7505495b6e73", APIVersion:"v1", ResourceVersion:"1943", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-4c556
I0113 19:45:15.378434   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578944704-29140", Name:"frontend", UID:"695ffd6e-c8b4-4ccc-8142-7505495b6e73", APIVersion:"v1", ResourceVersion:"1943", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-9mjnr
I0113 19:45:15.379715   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578944704-29140", Name:"frontend", UID:"695ffd6e-c8b4-4ccc-8142-7505495b6e73", APIVersion:"v1", ResourceVersion:"1943", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-gkn9r
replicationcontroller/redis-slave created
I0113 19:45:15.628351   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578944704-29140", Name:"redis-slave", UID:"2347d5fd-68ff-433b-82f3-8ad6e5ef56de", APIVersion:"v1", ResourceVersion:"1953", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-slave-kpknh
I0113 19:45:15.634425   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578944704-29140", Name:"redis-slave", UID:"2347d5fd-68ff-433b-82f3-8ad6e5ef56de", APIVersion:"v1", ResourceVersion:"1953", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-slave-htnrf
core.sh:1234: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: frontend:redis-slave:
E0113 19:45:15.798592   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1238: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: frontend:redis-slave:
E0113 19:45:15.923586   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller "frontend" deleted
replicationcontroller "redis-slave" deleted
E0113 19:45:16.048603   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1242: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
E0113 19:45:16.196507   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1246: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
replicationcontroller/frontend created
I0113 19:45:16.523621   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578944704-29140", Name:"frontend", UID:"697e38fe-30d5-4f3b-a758-0ca3e2fa448f", APIVersion:"v1", ResourceVersion:"1972", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-znz4q
I0113 19:45:16.528091   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578944704-29140", Name:"frontend", UID:"697e38fe-30d5-4f3b-a758-0ca3e2fa448f", APIVersion:"v1", ResourceVersion:"1972", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-r8dkd
I0113 19:45:16.539463   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578944704-29140", Name:"frontend", UID:"697e38fe-30d5-4f3b-a758-0ca3e2fa448f", APIVersion:"v1", ResourceVersion:"1972", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-8lspm
core.sh:1249: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: frontend:
horizontalpodautoscaler.autoscaling/frontend autoscaled
E0113 19:45:16.799961   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1252: Successful get hpa frontend {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 1 2 70
E0113 19:45:16.924806   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
horizontalpodautoscaler.autoscaling "frontend" deleted
E0113 19:45:17.049835   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
horizontalpodautoscaler.autoscaling/frontend autoscaled
E0113 19:45:17.197740   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1256: Successful get hpa frontend {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 2 3 80
horizontalpodautoscaler.autoscaling "frontend" deleted
Error: required flag(s) "max" not set
replicationcontroller "frontend" deleted
core.sh:1265: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
apiVersion: apps/v1
kind: Deployment
metadata:
  creationTimestamp: null
... skipping 24 lines ...
          limits:
            cpu: 300m
          requests:
            cpu: 300m
      terminationGracePeriodSeconds: 0
status: {}
E0113 19:45:17.801210   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Error from server (NotFound): deployments.apps "nginx-deployment-resources" not found
E0113 19:45:17.926055   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:45:18.051320   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-deployment-resources created
I0113 19:45:18.110455   54883 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578944704-29140", Name:"nginx-deployment-resources", UID:"965a99e6-61c8-4474-8a0b-815dd2e0d33f", APIVersion:"apps/v1", ResourceVersion:"1994", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-resources-67f8cfff5 to 3
I0113 19:45:18.115358   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944704-29140", Name:"nginx-deployment-resources-67f8cfff5", UID:"db3be737-caab-4b3e-a439-585823b4b97f", APIVersion:"apps/v1", ResourceVersion:"1995", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-67f8cfff5-m8rjz
I0113 19:45:18.119421   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944704-29140", Name:"nginx-deployment-resources-67f8cfff5", UID:"db3be737-caab-4b3e-a439-585823b4b97f", APIVersion:"apps/v1", ResourceVersion:"1995", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-67f8cfff5-6lsnv
I0113 19:45:18.121226   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944704-29140", Name:"nginx-deployment-resources-67f8cfff5", UID:"db3be737-caab-4b3e-a439-585823b4b97f", APIVersion:"apps/v1", ResourceVersion:"1995", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-67f8cfff5-9s9sx
E0113 19:45:18.199270   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1271: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx-deployment-resources:
core.sh:1272: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
core.sh:1273: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
deployment.apps/nginx-deployment-resources resource requirements updated
I0113 19:45:18.655255   54883 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578944704-29140", Name:"nginx-deployment-resources", UID:"965a99e6-61c8-4474-8a0b-815dd2e0d33f", APIVersion:"apps/v1", ResourceVersion:"2008", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-resources-55c547f795 to 1
I0113 19:45:18.659610   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944704-29140", Name:"nginx-deployment-resources-55c547f795", UID:"513c8edf-b982-48e9-869a-342530e95d7f", APIVersion:"apps/v1", ResourceVersion:"2009", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-55c547f795-k9p7l
core.sh:1276: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).resources.limits.cpu}}:{{end}}: 100m:
E0113 19:45:18.802687   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1277: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.limits.cpu}}:{{end}}: 100m:
E0113 19:45:18.927488   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
error: unable to find container named redis
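kubectl set resources can target a single container of a workload with -c; the error above is what it prints when that container is not in the pod template. A sketch of the failing shape (the limit value is an assumption):
  kubectl set resources deployment nginx-deployment-resources -c=redis --limits=cpu=200m    # fails: no container named redis in the deployment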
E0113 19:45:19.052527   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-deployment-resources resource requirements updated
E0113 19:45:19.200964   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0113 19:45:19.217388   54883 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578944704-29140", Name:"nginx-deployment-resources", UID:"965a99e6-61c8-4474-8a0b-815dd2e0d33f", APIVersion:"apps/v1", ResourceVersion:"2020", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-resources-55c547f795 to 0
I0113 19:45:19.226729   54883 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578944704-29140", Name:"nginx-deployment-resources", UID:"965a99e6-61c8-4474-8a0b-815dd2e0d33f", APIVersion:"apps/v1", ResourceVersion:"2023", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-resources-6d86564b45 to 1
I0113 19:45:19.236141   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944704-29140", Name:"nginx-deployment-resources-6d86564b45", UID:"fb42e50d-bd41-417f-9ea4-d4755ab47d39", APIVersion:"apps/v1", ResourceVersion:"2027", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-6d86564b45-q6khr
I0113 19:45:19.236305   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944704-29140", Name:"nginx-deployment-resources-55c547f795", UID:"513c8edf-b982-48e9-869a-342530e95d7f", APIVersion:"apps/v1", ResourceVersion:"2024", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-resources-55c547f795-k9p7l
core.sh:1282: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).resources.limits.cpu}}:{{end}}: 200m:
core.sh:1283: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.limits.cpu}}:{{end}}: 100m:
deployment.apps/nginx-deployment-resources resource requirements updated
I0113 19:45:19.666840   54883 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578944704-29140", Name:"nginx-deployment-resources", UID:"965a99e6-61c8-4474-8a0b-815dd2e0d33f", APIVersion:"apps/v1", ResourceVersion:"2040", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-resources-67f8cfff5 to 2
I0113 19:45:19.685238   54883 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578944704-29140", Name:"nginx-deployment-resources", UID:"965a99e6-61c8-4474-8a0b-815dd2e0d33f", APIVersion:"apps/v1", ResourceVersion:"2042", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-resources-6c478d4fdb to 1
I0113 19:45:19.687390   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944704-29140", Name:"nginx-deployment-resources-67f8cfff5", UID:"db3be737-caab-4b3e-a439-585823b4b97f", APIVersion:"apps/v1", ResourceVersion:"2044", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-resources-67f8cfff5-m8rjz
I0113 19:45:19.704117   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944704-29140", Name:"nginx-deployment-resources-6c478d4fdb", UID:"3d6bcf06-803e-4d2a-adc4-b00aab4ac11a", APIVersion:"apps/v1", ResourceVersion:"2047", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-6c478d4fdb-mq9zl
E0113 19:45:19.804023   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1286: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).resources.limits.cpu}}:{{end}}: 200m:
E0113 19:45:19.928661   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1287: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.limits.cpu}}:{{end}}: 300m:
E0113 19:45:20.053700   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1288: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.requests.cpu}}:{{end}}: 300m:
E0113 19:45:20.202456   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apiVersion: apps/v1
kind: Deployment
metadata:
  annotations:
    deployment.kubernetes.io/revision: "4"
  creationTimestamp: "2020-01-13T19:45:18Z"
... skipping 65 lines ...
    status: "True"
    type: Progressing
  observedGeneration: 4
  replicas: 4
  unavailableReplicas: 4
  updatedReplicas: 1
error: you must specify resources by --filename when --local is set.
Example resource specifications include:
   '-f rsrc.yaml'
   '--filename=rsrc.json'
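With --local, kubectl set resources operates on a manifest instead of the live object, so a file must be supplied; a sketch under that assumption (filename hypothetical):
  kubectl set resources -f nginx-deployment-resources.yaml --local --limits=cpu=200m -o yaml    # --local requires -f/--filename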
core.sh:1292: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).resources.limits.cpu}}:{{end}}: 200m:
core.sh:1293: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.limits.cpu}}:{{end}}: 300m:
core.sh:1294: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.requests.cpu}}:{{end}}: 300m:
E0113 19:45:20.805590   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps "nginx-deployment-resources" deleted
E0113 19:45:20.929765   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ exit code: 0
Recording: run_deployment_tests
Running command: run_deployment_tests

+++ Running case: test-cmd.run_deployment_tests 
E0113 19:45:21.056270   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_deployment_tests
+++ [0113 19:45:21] Creating namespace namespace-1578944721-5514
namespace/namespace-1578944721-5514 created
E0113 19:45:21.203847   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Context "test" modified.
+++ [0113 19:45:21] Testing deployments
deployment.apps/test-nginx-extensions created
I0113 19:45:21.385188   54883 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578944721-5514", Name:"test-nginx-extensions", UID:"7d53f03c-b716-473b-a17e-de4d44bce22d", APIVersion:"apps/v1", ResourceVersion:"2077", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set test-nginx-extensions-5559c76db7 to 1
I0113 19:45:21.395935   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944721-5514", Name:"test-nginx-extensions-5559c76db7", UID:"e7dbe8eb-617b-443b-a875-29e2f2e92b9a", APIVersion:"apps/v1", ResourceVersion:"2078", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test-nginx-extensions-5559c76db7-b9wvc
apps.sh:185: Successful get deploy test-nginx-extensions {{(index .spec.template.spec.containers 0).name}}: nginx
Successful
message:10
has not:2
Successful
message:apps/v1
has:apps/v1
E0113 19:45:21.807771   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps "test-nginx-extensions" deleted
E0113 19:45:21.931021   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/test-nginx-apps created
I0113 19:45:21.971997   54883 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578944721-5514", Name:"test-nginx-apps", UID:"6cbc0c70-de2a-41b5-bf53-336fd6cdfa6f", APIVersion:"apps/v1", ResourceVersion:"2091", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set test-nginx-apps-79b9bd9585 to 1
I0113 19:45:21.980936   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944721-5514", Name:"test-nginx-apps-79b9bd9585", UID:"141c3dc2-8086-40aa-982d-288d096993ff", APIVersion:"apps/v1", ResourceVersion:"2092", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test-nginx-apps-79b9bd9585-xx9ml
E0113 19:45:22.057682   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:198: Successful get deploy test-nginx-apps {{(index .spec.template.spec.containers 0).name}}: nginx
E0113 19:45:22.205230   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:10
has:10
Successful
message:apps/v1
has:apps/v1
... skipping 13 lines ...
                pod-template-hash=79b9bd9585
Annotations:    deployment.kubernetes.io/desired-replicas: 1
                deployment.kubernetes.io/max-replicas: 2
                deployment.kubernetes.io/revision: 1
Controlled By:  Deployment/test-nginx-apps
Replicas:       1 current / 1 desired
Pods Status:    0 Running / 1 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=test-nginx-apps
           pod-template-hash=79b9bd9585
  Containers:
   nginx:
    Image:        k8s.gcr.io/nginx:test-cmd
... skipping 34 lines ...
Volumes:          <none>
QoS Class:        BestEffort
Node-Selectors:   <none>
Tolerations:      <none>
Events:           <none>
deployment.apps "test-nginx-apps" deleted
E0113 19:45:22.808785   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:214: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
E0113 19:45:22.932289   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-with-command created
I0113 19:45:22.968915   54883 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578944721-5514", Name:"nginx-with-command", UID:"60a8ef7a-334b-4b09-a53b-18179c7490e1", APIVersion:"apps/v1", ResourceVersion:"2107", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-with-command-757c6f58dd to 1
I0113 19:45:22.974083   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944721-5514", Name:"nginx-with-command-757c6f58dd", UID:"029884ce-f527-46fb-9609-130745d7dec2", APIVersion:"apps/v1", ResourceVersion:"2108", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-with-command-757c6f58dd-qpwjv
E0113 19:45:23.059193   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:218: Successful get deploy nginx-with-command {{(index .spec.template.spec.containers 0).name}}: nginx
deployment.apps "nginx-with-command" deleted
E0113 19:45:23.206185   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:224: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
deployment.apps/deployment-with-unixuserid created
I0113 19:45:23.564809   54883 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578944721-5514", Name:"deployment-with-unixuserid", UID:"fd400660-496c-473c-9ad2-c9f15541c4c1", APIVersion:"apps/v1", ResourceVersion:"2121", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set deployment-with-unixuserid-8fcdfc94f to 1
I0113 19:45:23.568485   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944721-5514", Name:"deployment-with-unixuserid-8fcdfc94f", UID:"3854a732-3c31-4a5c-b14e-4f9e57fe7fc1", APIVersion:"apps/v1", ResourceVersion:"2122", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: deployment-with-unixuserid-8fcdfc94f-4rlkn
apps.sh:228: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: deployment-with-unixuserid:
deployment.apps "deployment-with-unixuserid" deleted
E0113 19:45:23.809757   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:45:23.933564   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:235: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
E0113 19:45:24.060445   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-deployment created
I0113 19:45:24.155300   54883 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578944721-5514", Name:"nginx-deployment", UID:"d497383b-b8fd-4c3b-83f2-1c643fcb6d94", APIVersion:"apps/v1", ResourceVersion:"2135", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-6986c7bc94 to 3
I0113 19:45:24.163924   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944721-5514", Name:"nginx-deployment-6986c7bc94", UID:"934c2f88-b1d8-43fa-8e89-ca4654194aa9", APIVersion:"apps/v1", ResourceVersion:"2136", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-5t7js
I0113 19:45:24.172632   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944721-5514", Name:"nginx-deployment-6986c7bc94", UID:"934c2f88-b1d8-43fa-8e89-ca4654194aa9", APIVersion:"apps/v1", ResourceVersion:"2136", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-7bsn2
I0113 19:45:24.173057   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944721-5514", Name:"nginx-deployment-6986c7bc94", UID:"934c2f88-b1d8-43fa-8e89-ca4654194aa9", APIVersion:"apps/v1", ResourceVersion:"2136", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-2r6db
E0113 19:45:24.207510   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:239: Successful get rs {{range.items}}{{.spec.replicas}}{{end}}: 3
(Bdeployment.apps "nginx-deployment" deleted
apps.sh:242: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
apps.sh:246: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
E0113 19:45:24.810740   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:247: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
deployment.apps/nginx-deployment created
I0113 19:45:24.920355   54883 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578944721-5514", Name:"nginx-deployment", UID:"f0dd856b-2238-4102-ac6d-74abdf972325", APIVersion:"apps/v1", ResourceVersion:"2159", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-7f6fc565b9 to 1
I0113 19:45:24.925018   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944721-5514", Name:"nginx-deployment-7f6fc565b9", UID:"78bd28c2-a329-4e04-b609-a797d6aa72ef", APIVersion:"apps/v1", ResourceVersion:"2160", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-7f6fc565b9-dxs5k
E0113 19:45:24.934488   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:251: Successful get rs {{range.items}}{{.spec.replicas}}{{end}}: 1
E0113 19:45:25.061918   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps "nginx-deployment" deleted
E0113 19:45:25.208797   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:256: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
apps.sh:257: Successful get rs {{range.items}}{{.spec.replicas}}{{end}}: 1
replicaset.apps "nginx-deployment-7f6fc565b9" deleted
apps.sh:265: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
E0113 19:45:25.812248   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:45:25.935953   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-deployment created
I0113 19:45:25.974622   54883 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578944721-5514", Name:"nginx-deployment", UID:"ad2f3408-01f7-43bd-982d-0e0ed7da01b3", APIVersion:"apps/v1", ResourceVersion:"2178", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-6986c7bc94 to 3
I0113 19:45:25.977461   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944721-5514", Name:"nginx-deployment-6986c7bc94", UID:"9487ecc4-9694-4d37-840e-b3413699d66a", APIVersion:"apps/v1", ResourceVersion:"2179", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-d9dtw
I0113 19:45:25.982187   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944721-5514", Name:"nginx-deployment-6986c7bc94", UID:"9487ecc4-9694-4d37-840e-b3413699d66a", APIVersion:"apps/v1", ResourceVersion:"2179", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-lg7rb
I0113 19:45:25.982649   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944721-5514", Name:"nginx-deployment-6986c7bc94", UID:"9487ecc4-9694-4d37-840e-b3413699d66a", APIVersion:"apps/v1", ResourceVersion:"2179", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-l9cfz
E0113 19:45:26.063338   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:268: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx-deployment:
horizontalpodautoscaler.autoscaling/nginx-deployment autoscaled
E0113 19:45:26.209788   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:271: Successful get hpa nginx-deployment {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 2 3 80
(Bhorizontalpodautoscaler.autoscaling "nginx-deployment" deleted
deployment.apps "nginx-deployment" deleted
apps.sh:279: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
E0113 19:45:26.813426   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx created
I0113 19:45:26.862656   54883 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578944721-5514", Name:"nginx", UID:"d91aca75-6ec2-42cf-b269-2b118b4a50b1", APIVersion:"apps/v1", ResourceVersion:"2204", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-f87d999f7 to 3
I0113 19:45:26.875843   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944721-5514", Name:"nginx-f87d999f7", UID:"7d9b9ad9-372f-47fe-992f-fb732542a558", APIVersion:"apps/v1", ResourceVersion:"2205", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-f87d999f7-nzk28
I0113 19:45:26.881034   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944721-5514", Name:"nginx-f87d999f7", UID:"7d9b9ad9-372f-47fe-992f-fb732542a558", APIVersion:"apps/v1", ResourceVersion:"2205", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-f87d999f7-kkp66
I0113 19:45:26.881085   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944721-5514", Name:"nginx-f87d999f7", UID:"7d9b9ad9-372f-47fe-992f-fb732542a558", APIVersion:"apps/v1", ResourceVersion:"2205", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-f87d999f7-8dxjg
E0113 19:45:26.937232   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:283: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx:
E0113 19:45:27.064618   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:284: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
E0113 19:45:27.211097   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx skipped rollback (current template already matches revision 1)
apps.sh:287: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
Warning: kubectl apply should be used on resource created by either kubectl create --save-config or kubectl apply
deployment.apps/nginx configured
I0113 19:45:27.594233   54883 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578944721-5514", Name:"nginx", UID:"d91aca75-6ec2-42cf-b269-2b118b4a50b1", APIVersion:"apps/v1", ResourceVersion:"2218", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-78487f9fd7 to 1
I0113 19:45:27.597911   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944721-5514", Name:"nginx-78487f9fd7", UID:"0a575ead-2cc3-4c1f-8799-9ac4aad788b5", APIVersion:"apps/v1", ResourceVersion:"2219", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-78487f9fd7-jg44r
apps.sh:290: Successful get deployment.apps {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
E0113 19:45:27.814709   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
    Image:	k8s.gcr.io/nginx:test-cmd
E0113 19:45:27.938385   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:293: Successful get deployment.apps {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
E0113 19:45:28.066190   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx rolled back
E0113 19:45:28.212445   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:45:28.816014   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:45:28.943345   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:45:29.067735   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:45:29.213889   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:297: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
error: unable to find specified revision 1000000 in history
apps.sh:300: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
deployment.apps/nginx rolled back
E0113 19:45:29.817283   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:45:29.944508   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:45:30.068905   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:45:30.215523   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:304: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
E0113 19:45:30.819594   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx paused
E0113 19:45:30.945743   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
error: you cannot rollback a paused deployment; resume it first with 'kubectl rollout resume deployment/nginx' and try again
E0113 19:45:31.070272   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
error: deployments.apps "nginx" can't restart paused deployment (run rollout resume first)
E0113 19:45:31.216750   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx resumed
deployment.apps/nginx rolled back
    deployment.kubernetes.io/revision-history: 1,3
I0113 19:45:31.755970   54883 horizontal.go:353] Horizontal Pod Autoscaler frontend has been deleted in namespace-1578944704-29140
E0113 19:45:31.820746   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
error: desired revision (3) is different from the running revision (5)
deployment.apps/nginx restarted
E0113 19:45:31.946990   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0113 19:45:31.948922   54883 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578944721-5514", Name:"nginx", UID:"d91aca75-6ec2-42cf-b269-2b118b4a50b1", APIVersion:"apps/v1", ResourceVersion:"2250", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-f87d999f7 to 2
I0113 19:45:31.955649   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944721-5514", Name:"nginx-f87d999f7", UID:"7d9b9ad9-372f-47fe-992f-fb732542a558", APIVersion:"apps/v1", ResourceVersion:"2254", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-f87d999f7-kkp66
I0113 19:45:31.957662   54883 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578944721-5514", Name:"nginx", UID:"d91aca75-6ec2-42cf-b269-2b118b4a50b1", APIVersion:"apps/v1", ResourceVersion:"2252", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-7b8c65788b to 1
I0113 19:45:31.962712   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944721-5514", Name:"nginx-7b8c65788b", UID:"f2349759-44d2-400a-942e-92b95f6fabe6", APIVersion:"apps/v1", ResourceVersion:"2258", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-7b8c65788b-wzkvf
E0113 19:45:32.071643   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:45:32.218008   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:45:32.821984   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:45:32.948283   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:45:33.072764   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:apiVersion: apps/v1
kind: ReplicaSet
metadata:
  annotations:
    deployment.kubernetes.io/desired-replicas: "3"
... skipping 116 lines ...
      terminationGracePeriodSeconds: 30
status:
  fullyLabeledReplicas: 1
  observedGeneration: 2
  replicas: 1
has:deployment.kubernetes.io/revision: "6"
E0113 19:45:33.219391   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx2 created
I0113 19:45:33.432772   54883 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578944721-5514", Name:"nginx2", UID:"6dc8d46a-a534-4cd6-ac58-ba2e8902131e", APIVersion:"apps/v1", ResourceVersion:"2273", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx2-57b7865cd9 to 3
I0113 19:45:33.436570   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944721-5514", Name:"nginx2-57b7865cd9", UID:"c365493b-4196-46b7-b63a-e0b5562b814f", APIVersion:"apps/v1", ResourceVersion:"2274", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx2-57b7865cd9-w9jbm
I0113 19:45:33.444296   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944721-5514", Name:"nginx2-57b7865cd9", UID:"c365493b-4196-46b7-b63a-e0b5562b814f", APIVersion:"apps/v1", ResourceVersion:"2274", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx2-57b7865cd9-8hssl
I0113 19:45:33.444883   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944721-5514", Name:"nginx2-57b7865cd9", UID:"c365493b-4196-46b7-b63a-e0b5562b814f", APIVersion:"apps/v1", ResourceVersion:"2274", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx2-57b7865cd9-gtnbx
deployment.apps "nginx2" deleted
deployment.apps "nginx" deleted
apps.sh:334: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
E0113 19:45:33.823116   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:45:33.949503   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-deployment created
I0113 19:45:34.023350   54883 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578944721-5514", Name:"nginx-deployment", UID:"956385dd-1572-4d56-96a8-f7c594d70997", APIVersion:"apps/v1", ResourceVersion:"2307", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-598d4d68b4 to 3
I0113 19:45:34.026252   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944721-5514", Name:"nginx-deployment-598d4d68b4", UID:"68813f9e-4c6d-4591-9bf3-730f62b34052", APIVersion:"apps/v1", ResourceVersion:"2308", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-598d4d68b4-pfn2m
I0113 19:45:34.030443   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944721-5514", Name:"nginx-deployment-598d4d68b4", UID:"68813f9e-4c6d-4591-9bf3-730f62b34052", APIVersion:"apps/v1", ResourceVersion:"2308", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-598d4d68b4-r2v8f
I0113 19:45:34.030798   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944721-5514", Name:"nginx-deployment-598d4d68b4", UID:"68813f9e-4c6d-4591-9bf3-730f62b34052", APIVersion:"apps/v1", ResourceVersion:"2308", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-598d4d68b4-2vmvs
E0113 19:45:34.073920   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:337: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx-deployment:
E0113 19:45:34.220667   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:338: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
apps.sh:339: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
deployment.apps/nginx-deployment image updated
I0113 19:45:34.562768   54883 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578944721-5514", Name:"nginx-deployment", UID:"956385dd-1572-4d56-96a8-f7c594d70997", APIVersion:"apps/v1", ResourceVersion:"2321", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-59df9b5f5b to 1
I0113 19:45:34.565801   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944721-5514", Name:"nginx-deployment-59df9b5f5b", UID:"f1303742-0eb3-49f5-9b9f-6f4af0a300b4", APIVersion:"apps/v1", ResourceVersion:"2322", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-59df9b5f5b-rgf2c
apps.sh:342: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
apps.sh:343: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
E0113 19:45:34.824260   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
error: unable to find container named "redis"
E0113 19:45:34.950750   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-deployment image updated
E0113 19:45:35.075195   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:348: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
E0113 19:45:35.221939   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:349: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
deployment.apps/nginx-deployment image updated
apps.sh:352: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
apps.sh:353: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
E0113 19:45:35.825431   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:356: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
E0113 19:45:35.952106   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:357: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
E0113 19:45:36.076399   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-deployment image updated
I0113 19:45:36.155638   54883 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578944721-5514", Name:"nginx-deployment", UID:"956385dd-1572-4d56-96a8-f7c594d70997", APIVersion:"apps/v1", ResourceVersion:"2341", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-598d4d68b4 to 2
I0113 19:45:36.162535   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944721-5514", Name:"nginx-deployment-598d4d68b4", UID:"68813f9e-4c6d-4591-9bf3-730f62b34052", APIVersion:"apps/v1", ResourceVersion:"2345", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-598d4d68b4-2vmvs
I0113 19:45:36.165722   54883 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578944721-5514", Name:"nginx-deployment", UID:"956385dd-1572-4d56-96a8-f7c594d70997", APIVersion:"apps/v1", ResourceVersion:"2343", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-7d758dbc54 to 1
I0113 19:45:36.171452   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944721-5514", Name:"nginx-deployment-7d758dbc54", UID:"b27f560e-28ac-4bce-89da-d029c46e1e6f", APIVersion:"apps/v1", ResourceVersion:"2350", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-7d758dbc54-fdchh
E0113 19:45:36.223426   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:360: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
apps.sh:361: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
apps.sh:364: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
apps.sh:365: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
E0113 19:45:36.826694   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps "nginx-deployment" deleted
E0113 19:45:36.953378   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:371: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
E0113 19:45:37.078362   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:45:37.224956   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-deployment created
I0113 19:45:37.248251   54883 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578944721-5514", Name:"nginx-deployment", UID:"c075052b-e5ae-4afb-ae1a-5a7c0f94c5e0", APIVersion:"apps/v1", ResourceVersion:"2376", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-598d4d68b4 to 3
I0113 19:45:37.252039   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944721-5514", Name:"nginx-deployment-598d4d68b4", UID:"9fc3535b-ef70-48bd-9840-7853428600fa", APIVersion:"apps/v1", ResourceVersion:"2377", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-598d4d68b4-zphxb
I0113 19:45:37.257290   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944721-5514", Name:"nginx-deployment-598d4d68b4", UID:"9fc3535b-ef70-48bd-9840-7853428600fa", APIVersion:"apps/v1", ResourceVersion:"2377", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-598d4d68b4-wth79
I0113 19:45:37.257337   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944721-5514", Name:"nginx-deployment-598d4d68b4", UID:"9fc3535b-ef70-48bd-9840-7853428600fa", APIVersion:"apps/v1", ResourceVersion:"2377", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-598d4d68b4-nq7q8
configmap/test-set-env-config created
secret/test-set-env-secret created
E0113 19:45:37.828203   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:376: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx-deployment:
E0113 19:45:37.954618   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:378: Successful get configmaps/test-set-env-config {{.metadata.name}}: test-set-env-config
E0113 19:45:38.079786   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:379: Successful get secret {{range.items}}{{.metadata.name}}:{{end}}: test-set-env-secret:
deployment.apps/nginx-deployment env updated
I0113 19:45:38.212505   54883 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578944721-5514", Name:"nginx-deployment", UID:"c075052b-e5ae-4afb-ae1a-5a7c0f94c5e0", APIVersion:"apps/v1", ResourceVersion:"2392", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-6b9f7756b4 to 1
I0113 19:45:38.215987   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944721-5514", Name:"nginx-deployment-6b9f7756b4", UID:"c3695a22-56c2-4c2a-a06d-8f3df6632d9b", APIVersion:"apps/v1", ResourceVersion:"2393", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6b9f7756b4-6f84q
E0113 19:45:38.226080   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:383: Successful get deploy nginx-deployment {{ (index (index .spec.template.spec.containers 0).env 0).name}}: KEY_2
apps.sh:385: Successful get deploy nginx-deployment {{ len (index .spec.template.spec.containers 0).env }}: 1
deployment.apps/nginx-deployment env updated
I0113 19:45:38.590659   54883 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578944721-5514", Name:"nginx-deployment", UID:"c075052b-e5ae-4afb-ae1a-5a7c0f94c5e0", APIVersion:"apps/v1", ResourceVersion:"2401", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-598d4d68b4 to 2
I0113 19:45:38.598329   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944721-5514", Name:"nginx-deployment-598d4d68b4", UID:"9fc3535b-ef70-48bd-9840-7853428600fa", APIVersion:"apps/v1", ResourceVersion:"2405", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-598d4d68b4-zphxb
I0113 19:45:38.604858   54883 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578944721-5514", Name:"nginx-deployment", UID:"c075052b-e5ae-4afb-ae1a-5a7c0f94c5e0", APIVersion:"apps/v1", ResourceVersion:"2404", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-754bf964c8 to 1
I0113 19:45:38.610441   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944721-5514", Name:"nginx-deployment-754bf964c8", UID:"d8e8cb7f-d394-4be6-9890-d42d9f58fe2d", APIVersion:"apps/v1", ResourceVersion:"2411", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-754bf964c8-9f75t
apps.sh:389: Successful get deploy nginx-deployment {{ len (index .spec.template.spec.containers 0).env }}: 2
deployment.apps/nginx-deployment env updated
E0113 19:45:38.829124   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0113 19:45:38.830468   54883 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578944721-5514", Name:"nginx-deployment", UID:"c075052b-e5ae-4afb-ae1a-5a7c0f94c5e0", APIVersion:"apps/v1", ResourceVersion:"2421", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-598d4d68b4 to 1
I0113 19:45:38.846886   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944721-5514", Name:"nginx-deployment-598d4d68b4", UID:"9fc3535b-ef70-48bd-9840-7853428600fa", APIVersion:"apps/v1", ResourceVersion:"2425", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-598d4d68b4-wth79
I0113 19:45:38.851085   54883 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578944721-5514", Name:"nginx-deployment", UID:"c075052b-e5ae-4afb-ae1a-5a7c0f94c5e0", APIVersion:"apps/v1", ResourceVersion:"2423", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-c6d5c5c7b to 1
I0113 19:45:38.857694   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944721-5514", Name:"nginx-deployment-c6d5c5c7b", UID:"80d11466-2795-4e1b-82f6-c14080856c42", APIVersion:"apps/v1", ResourceVersion:"2429", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-c6d5c5c7b-jkrsh
E0113 19:45:38.956044   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-deployment env updated
I0113 19:45:38.977005   54883 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578944721-5514", Name:"nginx-deployment", UID:"c075052b-e5ae-4afb-ae1a-5a7c0f94c5e0", APIVersion:"apps/v1", ResourceVersion:"2443", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-598d4d68b4 to 0
I0113 19:45:38.989555   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944721-5514", Name:"nginx-deployment-598d4d68b4", UID:"9fc3535b-ef70-48bd-9840-7853428600fa", APIVersion:"apps/v1", ResourceVersion:"2447", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-598d4d68b4-nq7q8
I0113 19:45:38.992896   54883 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578944721-5514", Name:"nginx-deployment", UID:"c075052b-e5ae-4afb-ae1a-5a7c0f94c5e0", APIVersion:"apps/v1", ResourceVersion:"2445", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-5958f7687 to 1
I0113 19:45:38.998546   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944721-5514", Name:"nginx-deployment-5958f7687", UID:"76bace96-2c42-4ebe-b49d-516b9556a095", APIVersion:"apps/v1", ResourceVersion:"2451", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-5958f7687-qh2v9
E0113 19:45:39.081430   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-deployment env updated
I0113 19:45:39.136097   54883 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578944721-5514", Name:"nginx-deployment", UID:"c075052b-e5ae-4afb-ae1a-5a7c0f94c5e0", APIVersion:"apps/v1", ResourceVersion:"2461", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-5958f7687 to 0
I0113 19:45:39.154242   54883 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578944721-5514", Name:"nginx-deployment", UID:"c075052b-e5ae-4afb-ae1a-5a7c0f94c5e0", APIVersion:"apps/v1", ResourceVersion:"2464", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-98b7fd455 to 1
E0113 19:45:39.177595   54883 replica_set.go:534] sync "namespace-1578944721-5514/nginx-deployment-5958f7687" failed with Operation cannot be fulfilled on replicasets.apps "nginx-deployment-5958f7687": the object has been modified; please apply your changes to the latest version and try again
E0113 19:45:39.229333   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-deployment env updated
I0113 19:45:39.279155   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944721-5514", Name:"nginx-deployment-5958f7687", UID:"76bace96-2c42-4ebe-b49d-516b9556a095", APIVersion:"apps/v1", ResourceVersion:"2465", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-5958f7687-qh2v9
deployment.apps/nginx-deployment env updated
I0113 19:45:39.422201   54883 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578944721-5514", Name:"nginx-deployment", UID:"c075052b-e5ae-4afb-ae1a-5a7c0f94c5e0", APIVersion:"apps/v1", ResourceVersion:"2470", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-6b9f7756b4 to 0
deployment.apps "nginx-deployment" deleted
I0113 19:45:39.572785   54883 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578944721-5514", Name:"nginx-deployment", UID:"c075052b-e5ae-4afb-ae1a-5a7c0f94c5e0", APIVersion:"apps/v1", ResourceVersion:"2476", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-868b664cb5 to 1
I0113 19:45:39.574675   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944721-5514", Name:"nginx-deployment-6b9f7756b4", UID:"c3695a22-56c2-4c2a-a06d-8f3df6632d9b", APIVersion:"apps/v1", ResourceVersion:"2477", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-6b9f7756b4-6f84q
configmap "test-set-env-config" deleted
I0113 19:45:39.676953   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944721-5514", Name:"nginx-deployment-98b7fd455", UID:"ec91d816-33b9-4ded-b839-a7ea6bd40051", APIVersion:"apps/v1", ResourceVersion:"2467", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-98b7fd455-mm654
I0113 19:45:39.726683   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944721-5514", Name:"nginx-deployment-868b664cb5", UID:"44f65652-cf95-4031-909d-15fb138e0d8b", APIVersion:"apps/v1", ResourceVersion:"2496", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-868b664cb5-k6vng
secret "test-set-env-secret" deleted
+++ exit code: 0
E0113 19:45:39.830287   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:45:39.875101   54883 replica_set.go:534] sync "namespace-1578944721-5514/nginx-deployment-5958f7687" failed with replicasets.apps "nginx-deployment-5958f7687" not found
Recording: run_rs_tests
Running command: run_rs_tests

+++ Running case: test-cmd.run_rs_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_rs_tests
E0113 19:45:39.957226   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ [0113 19:45:39] Creating namespace namespace-1578944739-17454
E0113 19:45:40.025017   54883 replica_set.go:534] sync "namespace-1578944721-5514/nginx-deployment-6b9f7756b4" failed with replicasets.apps "nginx-deployment-6b9f7756b4" not found
namespace/namespace-1578944739-17454 created
E0113 19:45:40.082646   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:45:40.125080   54883 replica_set.go:534] sync "namespace-1578944721-5514/nginx-deployment-98b7fd455" failed with replicasets.apps "nginx-deployment-98b7fd455" not found
Context "test" modified.
E0113 19:45:40.176569   54883 replica_set.go:534] sync "namespace-1578944721-5514/nginx-deployment-868b664cb5" failed with replicasets.apps "nginx-deployment-868b664cb5" not found
+++ [0113 19:45:40] Testing kubectl(v1:replicasets)
E0113 19:45:40.230728   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:511: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
replicaset.apps/frontend created
I0113 19:45:40.529808   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944739-17454", Name:"frontend", UID:"70471555-2a08-4259-8efc-180c80ecfe6f", APIVersion:"apps/v1", ResourceVersion:"2512", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-tghfz
I0113 19:45:40.534056   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944739-17454", Name:"frontend", UID:"70471555-2a08-4259-8efc-180c80ecfe6f", APIVersion:"apps/v1", ResourceVersion:"2512", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-4lt66
I0113 19:45:40.534120   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944739-17454", Name:"frontend", UID:"70471555-2a08-4259-8efc-180c80ecfe6f", APIVersion:"apps/v1", ResourceVersion:"2512", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-rzrxc
+++ [0113 19:45:40] Deleting rs
replicaset.apps "frontend" deleted
E0113 19:45:40.674774   54883 replica_set.go:534] sync "namespace-1578944739-17454/frontend" failed with replicasets.apps "frontend" not found
apps.sh:517: Successful get pods -l "tier=frontend" {{range.items}}{{.metadata.name}}:{{end}}: 
E0113 19:45:40.831820   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:521: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
E0113 19:45:40.958443   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:45:41.084202   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicaset.apps/frontend created
I0113 19:45:41.143907   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944739-17454", Name:"frontend", UID:"9537ba9f-a62b-4b83-aa2b-76bb35b41fb9", APIVersion:"apps/v1", ResourceVersion:"2529", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-r6zcb
I0113 19:45:41.147662   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944739-17454", Name:"frontend", UID:"9537ba9f-a62b-4b83-aa2b-76bb35b41fb9", APIVersion:"apps/v1", ResourceVersion:"2529", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-69422
I0113 19:45:41.152393   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944739-17454", Name:"frontend", UID:"9537ba9f-a62b-4b83-aa2b-76bb35b41fb9", APIVersion:"apps/v1", ResourceVersion:"2529", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-qhswl
I0113 19:45:41.205821   54883 horizontal.go:353] Horizontal Pod Autoscaler nginx-deployment has been deleted in namespace-1578944721-5514
E0113 19:45:41.232004   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:525: Successful get pods -l "tier=frontend" {{range.items}}{{(index .spec.containers 0).name}}:{{end}}: php-redis:php-redis:php-redis:
+++ [0113 19:45:41] Deleting rs
replicaset.apps "frontend" deleted
E0113 19:45:41.474590   54883 replica_set.go:534] sync "namespace-1578944739-17454/frontend" failed with replicasets.apps "frontend" not found
apps.sh:529: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
apps.sh:531: Successful get pods -l "tier=frontend" {{range.items}}{{(index .spec.containers 0).name}}:{{end}}: php-redis:php-redis:php-redis:
(Bpod "frontend-69422" deleted
pod "frontend-qhswl" deleted
pod "frontend-r6zcb" deleted
E0113 19:45:41.833068   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:534: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
E0113 19:45:41.959751   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:538: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
E0113 19:45:42.085456   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:45:42.233502   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicaset.apps/frontend created
I0113 19:45:42.273660   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944739-17454", Name:"frontend", UID:"15e0be85-14f6-4574-b700-7c60700dadf9", APIVersion:"apps/v1", ResourceVersion:"2550", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-ndpbz
I0113 19:45:42.277932   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944739-17454", Name:"frontend", UID:"15e0be85-14f6-4574-b700-7c60700dadf9", APIVersion:"apps/v1", ResourceVersion:"2550", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-wnd5d
I0113 19:45:42.279904   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944739-17454", Name:"frontend", UID:"15e0be85-14f6-4574-b700-7c60700dadf9", APIVersion:"apps/v1", ResourceVersion:"2550", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-ddshs
apps.sh:542: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: frontend:
matched Name:
... skipping 8 lines ...
Namespace:    namespace-1578944739-17454
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 17 lines ...
Namespace:    namespace-1578944739-17454
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 10 lines ...
  Type    Reason            Age   From                   Message
  ----    ------            ----  ----                   -------
  Normal  SuccessfulCreate  0s    replicaset-controller  Created pod: frontend-ndpbz
  Normal  SuccessfulCreate  0s    replicaset-controller  Created pod: frontend-wnd5d
  Normal  SuccessfulCreate  0s    replicaset-controller  Created pod: frontend-ddshs
E0113 19:45:42.834213   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:548: Successful describe
Name:         frontend
Namespace:    namespace-1578944739-17454
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 4 lines ...
      memory:  100Mi
    Environment:
      GET_HOSTS_FROM:  dns
    Mounts:            <none>
  Volumes:             <none>
E0113 19:45:42.960966   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:550: Successful describe
Name:         frontend
Namespace:    namespace-1578944739-17454
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 10 lines ...
  Type    Reason            Age   From                   Message
  ----    ------            ----  ----                   -------
  Normal  SuccessfulCreate  0s    replicaset-controller  Created pod: frontend-ndpbz
  Normal  SuccessfulCreate  0s    replicaset-controller  Created pod: frontend-wnd5d
  Normal  SuccessfulCreate  0s    replicaset-controller  Created pod: frontend-ddshs
E0113 19:45:43.086798   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
matched Name:
matched Pod Template:
matched Labels:
matched Selector:
matched Replicas:
matched Pods Status:
... skipping 3 lines ...
Namespace:    namespace-1578944739-17454
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 9 lines ...
Events:
  Type    Reason            Age   From                   Message
  ----    ------            ----  ----                   -------
  Normal  SuccessfulCreate  1s    replicaset-controller  Created pod: frontend-ndpbz
  Normal  SuccessfulCreate  1s    replicaset-controller  Created pod: frontend-wnd5d
  Normal  SuccessfulCreate  1s    replicaset-controller  Created pod: frontend-ddshs
E0113 19:45:43.235709   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful describe
Name:         frontend
Namespace:    namespace-1578944739-17454
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 17 lines ...
Namespace:    namespace-1578944739-17454
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 11 lines ...
Namespace:    namespace-1578944739-17454
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 102 lines ...
Volumes:               <none>
QoS Class:             Burstable
Node-Selectors:        <none>
Tolerations:           <none>
Events:                <none>
apps.sh:564: Successful get rs frontend {{.spec.replicas}}: 3
E0113 19:45:43.835583   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicaset.apps/frontend scaled
E0113 19:45:43.925388   54883 replica_set.go:199] ReplicaSet has no controller: &ReplicaSet{ObjectMeta:{frontend  namespace-1578944739-17454 /apis/apps/v1/namespaces/namespace-1578944739-17454/replicasets/frontend 15e0be85-14f6-4574-b700-7c60700dadf9 2561 2 2020-01-13 19:45:42 +0000 UTC <nil> <nil> map[app:guestbook tier:frontend] map[] [] []  []},Spec:ReplicaSetSpec{Replicas:*2,Selector:&v1.LabelSelector{MatchLabels:map[string]string{app: guestbook,tier: frontend,},MatchExpressions:[]LabelSelectorRequirement{},},Template:{{      0 0001-01-01 00:00:00 +0000 UTC <nil> <nil> map[app:guestbook tier:frontend] map[] [] []  []} {[] [] [{php-redis gcr.io/google_samples/gb-frontend:v3 [] []  [{ 0 80 TCP }] [] [{GET_HOSTS_FROM dns nil}] {map[] map[cpu:{{100 -3} {<nil>} 100m DecimalSI} memory:{{104857600 0} {<nil>} 100Mi BinarySI}]} [] [] nil nil nil nil /dev/termination-log File IfNotPresent nil false false false}] [] Always 0xc0033d7598 <nil> ClusterFirst map[]   <nil>  false false false <nil> PodSecurityContext{SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,SupplementalGroups:[],FSGroup:nil,RunAsGroup:nil,Sysctls:[]Sysctl{},WindowsOptions:nil,} []   nil default-scheduler [] []  <nil> nil [] <nil> <nil> <nil> map[] []}},MinReadySeconds:0,},Status:ReplicaSetStatus{Replicas:3,FullyLabeledReplicas:3,ObservedGeneration:1,ReadyReplicas:0,AvailableReplicas:0,Conditions:[]ReplicaSetCondition{},},}
I0113 19:45:43.933825   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944739-17454", Name:"frontend", UID:"15e0be85-14f6-4574-b700-7c60700dadf9", APIVersion:"apps/v1", ResourceVersion:"2561", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: frontend-ddshs
E0113 19:45:43.962195   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:568: Successful get rs frontend {{.spec.replicas}}: 2
E0113 19:45:44.088281   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:45:44.236934   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/scale-1 created
I0113 19:45:44.320330   54883 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578944739-17454", Name:"scale-1", UID:"27b022c3-6359-4922-af26-0320acb586b9", APIVersion:"apps/v1", ResourceVersion:"2567", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set scale-1-5c5565bcd9 to 1
I0113 19:45:44.325838   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944739-17454", Name:"scale-1-5c5565bcd9", UID:"afc5f16e-9b9b-43ee-8cf3-436a5373259b", APIVersion:"apps/v1", ResourceVersion:"2568", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-1-5c5565bcd9-ffzpf
deployment.apps/scale-2 created
I0113 19:45:44.557267   54883 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578944739-17454", Name:"scale-2", UID:"23392b0f-e58d-4f82-8368-fc78acd83501", APIVersion:"apps/v1", ResourceVersion:"2577", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set scale-2-5c5565bcd9 to 1
I0113 19:45:44.561894   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944739-17454", Name:"scale-2-5c5565bcd9", UID:"6bdb4a5c-2f1c-40ec-a1d5-123007b3ef4e", APIVersion:"apps/v1", ResourceVersion:"2578", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-2-5c5565bcd9-bp2lp
deployment.apps/scale-3 created
I0113 19:45:44.780026   54883 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578944739-17454", Name:"scale-3", UID:"1d085c81-c518-40df-98db-2560033be6b7", APIVersion:"apps/v1", ResourceVersion:"2587", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set scale-3-5c5565bcd9 to 1
I0113 19:45:44.783745   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944739-17454", Name:"scale-3-5c5565bcd9", UID:"0860815b-f3da-496b-9b21-96df44f32ae2", APIVersion:"apps/v1", ResourceVersion:"2588", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-3-5c5565bcd9-ttpjg
E0113 19:45:44.836871   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:574: Successful get deploy scale-1 {{.spec.replicas}}: 1
E0113 19:45:44.963704   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:575: Successful get deploy scale-2 {{.spec.replicas}}: 1
E0113 19:45:45.089555   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:576: Successful get deploy scale-3 {{.spec.replicas}}: 1
E0113 19:45:45.238135   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/scale-1 scaled
I0113 19:45:45.278579   54883 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578944739-17454", Name:"scale-1", UID:"27b022c3-6359-4922-af26-0320acb586b9", APIVersion:"apps/v1", ResourceVersion:"2599", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set scale-1-5c5565bcd9 to 2
deployment.apps/scale-2 scaled
I0113 19:45:45.282242   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944739-17454", Name:"scale-1-5c5565bcd9", UID:"afc5f16e-9b9b-43ee-8cf3-436a5373259b", APIVersion:"apps/v1", ResourceVersion:"2600", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-1-5c5565bcd9-jtf2j
I0113 19:45:45.285617   54883 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578944739-17454", Name:"scale-2", UID:"23392b0f-e58d-4f82-8368-fc78acd83501", APIVersion:"apps/v1", ResourceVersion:"2601", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set scale-2-5c5565bcd9 to 2
I0113 19:45:45.301111   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944739-17454", Name:"scale-2-5c5565bcd9", UID:"6bdb4a5c-2f1c-40ec-a1d5-123007b3ef4e", APIVersion:"apps/v1", ResourceVersion:"2605", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-2-5c5565bcd9-2cbsz
... skipping 6 lines ...
I0113 19:45:45.812855   54883 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578944739-17454", Name:"scale-2", UID:"23392b0f-e58d-4f82-8368-fc78acd83501", APIVersion:"apps/v1", ResourceVersion:"2620", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set scale-2-5c5565bcd9 to 3
I0113 19:45:45.817303   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944739-17454", Name:"scale-1-5c5565bcd9", UID:"afc5f16e-9b9b-43ee-8cf3-436a5373259b", APIVersion:"apps/v1", ResourceVersion:"2621", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-1-5c5565bcd9-d9z6v
I0113 19:45:45.817395   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944739-17454", Name:"scale-2-5c5565bcd9", UID:"6bdb4a5c-2f1c-40ec-a1d5-123007b3ef4e", APIVersion:"apps/v1", ResourceVersion:"2622", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-2-5c5565bcd9-nr9sd
deployment.apps/scale-3 scaled
I0113 19:45:45.824830   54883 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578944739-17454", Name:"scale-3", UID:"1d085c81-c518-40df-98db-2560033be6b7", APIVersion:"apps/v1", ResourceVersion:"2625", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set scale-3-5c5565bcd9 to 3
I0113 19:45:45.834172   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944739-17454", Name:"scale-3-5c5565bcd9", UID:"0860815b-f3da-496b-9b21-96df44f32ae2", APIVersion:"apps/v1", ResourceVersion:"2632", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-3-5c5565bcd9-57l2k
E0113 19:45:45.838223   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0113 19:45:45.840888   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944739-17454", Name:"scale-3-5c5565bcd9", UID:"0860815b-f3da-496b-9b21-96df44f32ae2", APIVersion:"apps/v1", ResourceVersion:"2632", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-3-5c5565bcd9-llmv5
apps.sh:584: Successful get deploy scale-1 {{.spec.replicas}}: 3
E0113 19:45:45.964969   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:585: Successful get deploy scale-2 {{.spec.replicas}}: 3
E0113 19:45:46.090757   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:586: Successful get deploy scale-3 {{.spec.replicas}}: 3
E0113 19:45:46.239751   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicaset.apps "frontend" deleted
deployment.apps "scale-1" deleted
deployment.apps "scale-2" deleted
deployment.apps "scale-3" deleted
replicaset.apps/frontend created
I0113 19:45:46.701443   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944739-17454", Name:"frontend", UID:"c1e97039-5f04-49a5-b601-14b0675d32d7", APIVersion:"apps/v1", ResourceVersion:"2681", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-nvjgj
I0113 19:45:46.705185   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944739-17454", Name:"frontend", UID:"c1e97039-5f04-49a5-b601-14b0675d32d7", APIVersion:"apps/v1", ResourceVersion:"2681", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-xlq2x
I0113 19:45:46.707535   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944739-17454", Name:"frontend", UID:"c1e97039-5f04-49a5-b601-14b0675d32d7", APIVersion:"apps/v1", ResourceVersion:"2681", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-hzlvv
apps.sh:594: Successful get rs frontend {{.spec.replicas}}: 3
E0113 19:45:46.839553   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/frontend exposed
E0113 19:45:46.966085   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:598: Successful get service frontend {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: <no value> 80
E0113 19:45:47.092593   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/frontend-2 exposed
E0113 19:45:47.240967   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:602: Successful get service frontend-2 {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: default 80
(Bservice "frontend" deleted
service "frontend-2" deleted
apps.sh:608: Successful get rs frontend {{.metadata.generation}}: 1
replicaset.apps/frontend image updated
E0113 19:45:47.841588   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:610: Successful get rs frontend {{.metadata.generation}}: 2
E0113 19:45:47.967453   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicaset.apps/frontend env updated
apps.sh:612: Successful get rs frontend {{.metadata.generation}}: 3
E0113 19:45:48.093989   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicaset.apps/frontend resource requirements updated
E0113 19:45:48.242188   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:614: Successful get rs frontend {{.metadata.generation}}: 4
apps.sh:618: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: frontend:
replicaset.apps "frontend" deleted
apps.sh:622: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
apps.sh:626: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
E0113 19:45:48.843056   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:45:48.968679   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicaset.apps/frontend created
I0113 19:45:49.042414   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944739-17454", Name:"frontend", UID:"59f639da-e94c-4631-842a-64438e679360", APIVersion:"apps/v1", ResourceVersion:"2720", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-9c4bz
I0113 19:45:49.046217   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944739-17454", Name:"frontend", UID:"59f639da-e94c-4631-842a-64438e679360", APIVersion:"apps/v1", ResourceVersion:"2720", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-mhgvj
I0113 19:45:49.046265   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944739-17454", Name:"frontend", UID:"59f639da-e94c-4631-842a-64438e679360", APIVersion:"apps/v1", ResourceVersion:"2720", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-bgln2
E0113 19:45:49.095367   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:45:49.243753   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicaset.apps/redis-slave created
I0113 19:45:49.296430   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944739-17454", Name:"redis-slave", UID:"cf4a072b-eb6b-4fc6-b4cb-01ab74f5b475", APIVersion:"apps/v1", ResourceVersion:"2729", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-slave-v6tb5
I0113 19:45:49.305262   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944739-17454", Name:"redis-slave", UID:"cf4a072b-eb6b-4fc6-b4cb-01ab74f5b475", APIVersion:"apps/v1", ResourceVersion:"2729", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-slave-vrtrq
apps.sh:631: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: frontend:redis-slave:
apps.sh:635: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: frontend:redis-slave:
replicaset.apps "frontend" deleted
replicaset.apps "redis-slave" deleted
E0113 19:45:49.844644   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:639: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
E0113 19:45:49.971431   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:644: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
E0113 19:45:50.096524   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicaset.apps/frontend created
E0113 19:45:50.244722   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0113 19:45:50.247601   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944739-17454", Name:"frontend", UID:"4eb5b406-7428-4b62-9ce6-5cd7d3f228f3", APIVersion:"apps/v1", ResourceVersion:"2748", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-8xpm2
I0113 19:45:50.256995   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944739-17454", Name:"frontend", UID:"4eb5b406-7428-4b62-9ce6-5cd7d3f228f3", APIVersion:"apps/v1", ResourceVersion:"2748", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-9gw2d
I0113 19:45:50.257964   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944739-17454", Name:"frontend", UID:"4eb5b406-7428-4b62-9ce6-5cd7d3f228f3", APIVersion:"apps/v1", ResourceVersion:"2748", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-cmlfb
apps.sh:647: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: frontend:
horizontalpodautoscaler.autoscaling/frontend autoscaled
apps.sh:650: Successful get hpa frontend {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 1 2 70
(Bhorizontalpodautoscaler.autoscaling "frontend" deleted
E0113 19:45:50.846024   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
horizontalpodautoscaler.autoscaling/frontend autoscaled
E0113 19:45:50.972592   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:654: Successful get hpa frontend {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 2 3 80
E0113 19:45:51.097869   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
horizontalpodautoscaler.autoscaling "frontend" deleted
Error: required flag(s) "max" not set
E0113 19:45:51.246368   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicaset.apps "frontend" deleted
+++ exit code: 0
Recording: run_stateful_set_tests
Running command: run_stateful_set_tests

+++ Running case: test-cmd.run_stateful_set_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_stateful_set_tests
+++ [0113 19:45:51] Creating namespace namespace-1578944751-15271
namespace/namespace-1578944751-15271 created
Context "test" modified.
+++ [0113 19:45:51] Testing kubectl(v1:statefulsets)
apps.sh:470: Successful get statefulset {{range.items}}{{.metadata.name}}:{{end}}: 
E0113 19:45:51.847473   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:45:51.974024   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0113 19:45:52.092679   51369 controller.go:606] quota admission added evaluator for: statefulsets.apps
statefulset.apps/nginx created
E0113 19:45:52.099161   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:476: Successful get statefulset nginx {{.spec.replicas}}: 0
E0113 19:45:52.247763   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:477: Successful get statefulset nginx {{.status.observedGeneration}}: 1
statefulset.apps/nginx scaled
I0113 19:45:52.503695   54883 event.go:278] Event(v1.ObjectReference{Kind:"StatefulSet", Namespace:"namespace-1578944751-15271", Name:"nginx", UID:"6cfb9748-b6a9-4079-be08-4347fdb724b0", APIVersion:"apps/v1", ResourceVersion:"2775", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' create Pod nginx-0 in StatefulSet nginx successful
apps.sh:481: Successful get statefulset nginx {{.spec.replicas}}: 1
apps.sh:482: Successful get statefulset nginx {{.status.observedGeneration}}: 2
E0113 19:45:52.848635   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:45:52.975431   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
statefulset.apps/nginx restarted
E0113 19:45:53.100659   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:490: Successful get statefulset nginx {{.status.observedGeneration}}: 3
(Bstatefulset.apps "nginx" deleted
I0113 19:45:53.211752   54883 stateful_set.go:420] StatefulSet has been deleted namespace-1578944751-15271/nginx
E0113 19:45:53.249135   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ exit code: 0
Recording: run_statefulset_history_tests
Running command: run_statefulset_history_tests

+++ Running case: test-cmd.run_statefulset_history_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_statefulset_history_tests
+++ [0113 19:45:53] Creating namespace namespace-1578944753-8356
namespace/namespace-1578944753-8356 created
Context "test" modified.
+++ [0113 19:45:53] Testing kubectl(v1:statefulsets, v1:controllerrevisions)
apps.sh:418: Successful get statefulset {{range.items}}{{.metadata.name}}:{{end}}: 
E0113 19:45:53.849888   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:45:53.976603   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
statefulset.apps/nginx created
E0113 19:45:54.102113   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:422: Successful get controllerrevisions {{range.items}}{{.metadata.annotations}}:{{end}}: map[kubectl.kubernetes.io/last-applied-configuration:{"apiVersion":"apps/v1","kind":"StatefulSet","metadata":{"annotations":{"kubernetes.io/change-cause":"kubectl apply --filename=hack/testdata/rollingupdate-statefulset.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true"},"labels":{"app":"nginx-statefulset"},"name":"nginx","namespace":"namespace-1578944753-8356"},"spec":{"replicas":0,"selector":{"matchLabels":{"app":"nginx-statefulset"}},"serviceName":"nginx","template":{"metadata":{"labels":{"app":"nginx-statefulset"}},"spec":{"containers":[{"command":["sh","-c","while true; do sleep 1; done"],"image":"k8s.gcr.io/nginx-slim:0.7","name":"nginx","ports":[{"containerPort":80,"name":"web"}]}],"terminationGracePeriodSeconds":5}},"updateStrategy":{"type":"RollingUpdate"}}}
 kubernetes.io/change-cause:kubectl apply --filename=hack/testdata/rollingupdate-statefulset.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true]:
E0113 19:45:54.250516   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
statefulset.apps/nginx skipped rollback (current template already matches revision 1)
apps.sh:425: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.7:
apps.sh:426: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
statefulset.apps/nginx configured
E0113 19:45:54.851277   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:429: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.8:
E0113 19:45:54.978027   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:430: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/pause:2.0:
E0113 19:45:55.103464   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:431: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 2
E0113 19:45:55.252024   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:432: Successful get controllerrevisions {{range.items}}{{.metadata.annotations}}:{{end}}: map[kubectl.kubernetes.io/last-applied-configuration:{"apiVersion":"apps/v1","kind":"StatefulSet","metadata":{"annotations":{"kubernetes.io/change-cause":"kubectl apply --filename=hack/testdata/rollingupdate-statefulset.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true"},"labels":{"app":"nginx-statefulset"},"name":"nginx","namespace":"namespace-1578944753-8356"},"spec":{"replicas":0,"selector":{"matchLabels":{"app":"nginx-statefulset"}},"serviceName":"nginx","template":{"metadata":{"labels":{"app":"nginx-statefulset"}},"spec":{"containers":[{"command":["sh","-c","while true; do sleep 1; done"],"image":"k8s.gcr.io/nginx-slim:0.7","name":"nginx","ports":[{"containerPort":80,"name":"web"}]}],"terminationGracePeriodSeconds":5}},"updateStrategy":{"type":"RollingUpdate"}}}
 kubernetes.io/change-cause:kubectl apply --filename=hack/testdata/rollingupdate-statefulset.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true]:map[kubectl.kubernetes.io/last-applied-configuration:{"apiVersion":"apps/v1","kind":"StatefulSet","metadata":{"annotations":{"kubernetes.io/change-cause":"kubectl apply --filename=hack/testdata/rollingupdate-statefulset-rv2.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true"},"labels":{"app":"nginx-statefulset"},"name":"nginx","namespace":"namespace-1578944753-8356"},"spec":{"replicas":0,"selector":{"matchLabels":{"app":"nginx-statefulset"}},"serviceName":"nginx","template":{"metadata":{"labels":{"app":"nginx-statefulset"}},"spec":{"containers":[{"command":["sh","-c","while true; do sleep 1; done"],"image":"k8s.gcr.io/nginx-slim:0.8","name":"nginx","ports":[{"containerPort":80,"name":"web"}]},{"image":"k8s.gcr.io/pause:2.0","name":"pause","ports":[{"containerPort":81,"name":"web-2"}]}],"terminationGracePeriodSeconds":5}},"updateStrategy":{"type":"RollingUpdate"}}}
 kubernetes.io/change-cause:kubectl apply --filename=hack/testdata/rollingupdate-statefulset-rv2.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true]:
statefulset.apps/nginx will roll back to Pod Template:
  Labels:	app=nginx-statefulset
  Containers:
... skipping 9 lines ...
    Mounts:	<none>
  Volumes:	<none>
 (dry run)
apps.sh:435: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.8:
apps.sh:436: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/pause:2.0:
apps.sh:437: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 2
E0113 19:45:55.852540   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
statefulset.apps/nginx rolled back
E0113 19:45:55.979415   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:440: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.7:
E0113 19:45:56.104753   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:441: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
E0113 19:45:56.253349   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:error: unable to find specified revision 1000000 in history
has:unable to find specified revision
apps.sh:445: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.7:
apps.sh:446: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
(Bstatefulset.apps/nginx rolled back
apps.sh:449: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.8:
E0113 19:45:56.853694   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:450: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/pause:2.0:
E0113 19:45:56.980594   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:451: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 2
E0113 19:45:57.105941   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
statefulset.apps "nginx" deleted
I0113 19:45:57.155086   54883 stateful_set.go:420] StatefulSet has been deleted namespace-1578944753-8356/nginx
E0113 19:45:57.254575   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ exit code: 0
Recording: run_lists_tests
Running command: run_lists_tests

+++ Running case: test-cmd.run_lists_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
... skipping 3 lines ...
Context "test" modified.
+++ [0113 19:45:57] Testing kubectl(v1:lists)
service/list-service-test created
deployment.apps/list-deployment-test created
I0113 19:45:57.822768   54883 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578944757-26971", Name:"list-deployment-test", UID:"246b3478-b3c6-4f92-9453-9ba031ec3de0", APIVersion:"apps/v1", ResourceVersion:"2816", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set list-deployment-test-7cd8c5ff6d to 1
I0113 19:45:57.830537   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944757-26971", Name:"list-deployment-test-7cd8c5ff6d", UID:"d14586b2-e278-4553-ad79-9ba1855c559d", APIVersion:"apps/v1", ResourceVersion:"2817", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: list-deployment-test-7cd8c5ff6d-d8qh4
E0113 19:45:57.855273   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service "list-service-test" deleted
deployment.apps "list-deployment-test" deleted
E0113 19:45:57.981847   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ exit code: 0
Recording: run_multi_resources_tests
Running command: run_multi_resources_tests

+++ Running case: test-cmd.run_multi_resources_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_multi_resources_tests
+++ [0113 19:45:58] Creating namespace namespace-1578944758-13407
E0113 19:45:58.107430   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/namespace-1578944758-13407 created
E0113 19:45:58.256524   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Context "test" modified.
+++ [0113 19:45:58] Testing kubectl(v1:multiple resources)
Testing with file hack/testdata/multi-resource-yaml.yaml and replace with file hack/testdata/multi-resource-yaml-modify.yaml
generic-resources.sh:63: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: 
generic-resources.sh:64: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
service/mock created
replicationcontroller/mock created
I0113 19:45:58.779615   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578944758-13407", Name:"mock", UID:"6ab8eac0-7926-4b6c-80fe-3626c2319980", APIVersion:"v1", ResourceVersion:"2838", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-pzf64
E0113 19:45:58.856452   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:72: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: mock:
E0113 19:45:58.983275   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:80: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: mock:
E0113 19:45:59.108870   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
NAME           TYPE        CLUSTER-IP   EXTERNAL-IP   PORT(S)   AGE
service/mock   ClusterIP   10.0.0.96    <none>        99/TCP    1s

NAME                         DESIRED   CURRENT   READY   AGE
replicationcontroller/mock   1         1         0       1s
E0113 19:45:59.257788   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Name:              mock
Namespace:         namespace-1578944758-13407
Labels:            app=mock
Annotations:       <none>
Selector:          app=mock
Type:              ClusterIP
... skipping 8 lines ...
Name:         mock
Namespace:    namespace-1578944758-13407
Selector:     app=mock
Labels:       app=mock
Annotations:  <none>
Replicas:     1 current / 1 desired
Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=mock
  Containers:
   mock-container:
    Image:        k8s.gcr.io/pause:2.0
    Port:         9949/TCP
... skipping 7 lines ...
  Normal  SuccessfulCreate  1s    replication-controller  Created pod: mock-pzf64
service "mock" deleted
replicationcontroller "mock" deleted
service/mock replaced
replicationcontroller/mock replaced
I0113 19:45:59.721975   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578944758-13407", Name:"mock", UID:"604faea2-7158-4fbb-bf4d-464053308a42", APIVersion:"v1", ResourceVersion:"2852", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-thf28
E0113 19:45:59.857882   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:96: Successful get services mock {{.metadata.labels.status}}: replaced
E0113 19:45:59.984462   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:102: Successful get rc mock {{.metadata.labels.status}}: replaced
E0113 19:46:00.111309   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/mock edited
replicationcontroller/mock edited
E0113 19:46:00.259093   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:114: Successful get services mock {{.metadata.labels.status}}: edited
generic-resources.sh:120: Successful get rc mock {{.metadata.labels.status}}: edited
service/mock labeled
replicationcontroller/mock labeled
generic-resources.sh:134: Successful get services mock {{.metadata.labels.labeled}}: true
E0113 19:46:00.859488   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:140: Successful get rc mock {{.metadata.labels.labeled}}: true
E0113 19:46:00.985667   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/mock annotated
replicationcontroller/mock annotated
E0113 19:46:01.113651   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:153: Successful get services mock {{.metadata.annotations.annotated}}: true
E0113 19:46:01.260336   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:159: Successful get rc mock {{.metadata.annotations.annotated}}: true
(Bservice "mock" deleted
replicationcontroller "mock" deleted
Testing with file hack/testdata/multi-resource-list.json and replace with file hack/testdata/multi-resource-list-modify.json
generic-resources.sh:63: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: 
generic-resources.sh:64: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
E0113 19:46:01.860862   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/mock created
replicationcontroller/mock created
I0113 19:46:01.954087   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578944758-13407", Name:"mock", UID:"d8982345-7e56-4c9f-82c5-95de98c5917d", APIVersion:"v1", ResourceVersion:"2879", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-g6xb9
E0113 19:46:01.987319   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:72: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: mock:
E0113 19:46:02.115086   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:80: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: mock:
E0113 19:46:02.261642   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
NAME           TYPE        CLUSTER-IP   EXTERNAL-IP   PORT(S)   AGE
service/mock   ClusterIP   10.0.0.47    <none>        99/TCP    1s

NAME                         DESIRED   CURRENT   READY   AGE
replicationcontroller/mock   1         1         0       1s
Name:              mock
... skipping 13 lines ...
Name:         mock
Namespace:    namespace-1578944758-13407
Selector:     app=mock
Labels:       app=mock
Annotations:  <none>
Replicas:     1 current / 1 desired
Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=mock
  Containers:
   mock-container:
    Image:        k8s.gcr.io/pause:2.0
    Port:         9949/TCP
... skipping 4 lines ...
Events:
  Type    Reason            Age   From                    Message
  ----    ------            ----  ----                    -------
  Normal  SuccessfulCreate  1s    replication-controller  Created pod: mock-g6xb9
service "mock" deleted
replicationcontroller "mock" deleted
E0113 19:46:02.861847   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/mock replaced
replicationcontroller/mock replaced
I0113 19:46:02.884346   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578944758-13407", Name:"mock", UID:"fa22c812-7e3a-4f5d-bbbd-d476e56ac5ae", APIVersion:"v1", ResourceVersion:"2893", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-b65wv
E0113 19:46:02.988634   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:96: Successful get services mock {{.metadata.labels.status}}: replaced
E0113 19:46:03.116408   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:102: Successful get rc mock {{.metadata.labels.status}}: replaced
E0113 19:46:03.263107   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/mock edited
replicationcontroller/mock edited
generic-resources.sh:114: Successful get services mock {{.metadata.labels.status}}: edited
generic-resources.sh:120: Successful get rc mock {{.metadata.labels.status}}: edited
service/mock labeled
replicationcontroller/mock labeled
E0113 19:46:03.863284   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:134: Successful get services mock {{.metadata.labels.labeled}}: true
E0113 19:46:03.989668   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:140: Successful get rc mock {{.metadata.labels.labeled}}: true
service/mock annotated
E0113 19:46:04.117376   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/mock annotated
E0113 19:46:04.264977   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:153: Successful get services mock {{.metadata.annotations.annotated}}: true
generic-resources.sh:159: Successful get rc mock {{.metadata.annotations.annotated}}: true
service "mock" deleted
replicationcontroller "mock" deleted
Testing with file hack/testdata/multi-resource-json.json and replace with file hack/testdata/multi-resource-json-modify.json
generic-resources.sh:63: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: 
generic-resources.sh:64: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
E0113 19:46:04.864507   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/mock created
replicationcontroller/mock created
I0113 19:46:04.964095   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578944758-13407", Name:"mock", UID:"15dbbe3d-d709-4475-858e-ead239597a0a", APIVersion:"v1", ResourceVersion:"2917", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-pfnr8
E0113 19:46:04.990792   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:72: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: mock:
E0113 19:46:05.118753   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:80: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: mock:
E0113 19:46:05.266252   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
NAME           TYPE        CLUSTER-IP   EXTERNAL-IP   PORT(S)   AGE
service/mock   ClusterIP   10.0.0.241   <none>        99/TCP    1s

NAME                         DESIRED   CURRENT   READY   AGE
replicationcontroller/mock   1         1         0       1s
I0113 19:46:05.520610   54883 horizontal.go:353] Horizontal Pod Autoscaler frontend has been deleted in namespace-1578944739-17454
... skipping 14 lines ...
Name:         mock
Namespace:    namespace-1578944758-13407
Selector:     app=mock
Labels:       app=mock
Annotations:  <none>
Replicas:     1 current / 1 desired
Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=mock
  Containers:
   mock-container:
    Image:        k8s.gcr.io/pause:2.0
    Port:         9949/TCP
... skipping 6 lines ...
  ----    ------            ----  ----                    -------
  Normal  SuccessfulCreate  1s    replication-controller  Created pod: mock-pfnr8
service "mock" deleted
replicationcontroller "mock" deleted
service/mock replaced
replicationcontroller/mock replaced
E0113 19:46:05.865264   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0113 19:46:05.867857   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578944758-13407", Name:"mock", UID:"c1cb6f43-df1f-45b2-b85f-92110dbf1b3d", APIVersion:"v1", ResourceVersion:"2934", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-xszcw
E0113 19:46:05.992283   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:96: Successful get services mock {{.metadata.labels.status}}: replaced
E0113 19:46:06.120234   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:102: Successful get rc mock {{.metadata.labels.status}}: replaced
E0113 19:46:06.267763   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/mock edited
replicationcontroller/mock edited
generic-resources.sh:114: Successful get services mock {{.metadata.labels.status}}: edited
generic-resources.sh:120: Successful get rc mock {{.metadata.labels.status}}: edited
service/mock labeled
replicationcontroller/mock labeled
generic-resources.sh:134: Successful get services mock {{.metadata.labels.labeled}}: true
E0113 19:46:06.866441   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:140: Successful get rc mock {{.metadata.labels.labeled}}: true
E0113 19:46:06.993521   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/mock annotated
replicationcontroller/mock annotated
E0113 19:46:07.121434   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:153: Successful get services mock {{.metadata.annotations.annotated}}: true
E0113 19:46:07.268960   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:159: Successful get rc mock {{.metadata.annotations.annotated}}: true
(Bservice "mock" deleted
replicationcontroller "mock" deleted
Testing with file hack/testdata/multi-resource-rclist.json and replace with file hack/testdata/multi-resource-rclist-modify.json
generic-resources.sh:63: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: 
generic-resources.sh:64: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
E0113 19:46:07.867848   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/mock created
I0113 19:46:07.917999   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578944758-13407", Name:"mock", UID:"71d86c2a-b124-477a-b8c2-c4154906283f", APIVersion:"v1", ResourceVersion:"2955", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-9n77j
replicationcontroller/mock2 created
I0113 19:46:07.921839   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578944758-13407", Name:"mock2", UID:"ae7d6ca4-8dc3-488e-9e1c-e218863471a6", APIVersion:"v1", ResourceVersion:"2957", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock2-q4wh5
E0113 19:46:07.994752   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:78: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: mock:mock2:
E0113 19:46:08.122662   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
NAME    DESIRED   CURRENT   READY   AGE
mock    1         1         0       1s
mock2   1         1         0       1s
E0113 19:46:08.270261   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Name:         mock
Namespace:    namespace-1578944758-13407
Selector:     app=mock
Labels:       app=mock
              status=replaced
Annotations:  <none>
Replicas:     1 current / 1 desired
Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=mock
  Containers:
   mock-container:
    Image:        k8s.gcr.io/pause:2.0
    Port:         9949/TCP
... skipping 11 lines ...
Namespace:    namespace-1578944758-13407
Selector:     app=mock2
Labels:       app=mock2
              status=replaced
Annotations:  <none>
Replicas:     1 current / 1 desired
Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=mock2
  Containers:
   mock-container:
    Image:        k8s.gcr.io/pause:2.0
    Port:         9949/TCP
... skipping 9 lines ...
replicationcontroller "mock2" deleted
replicationcontroller/mock replaced
replicationcontroller/mock2 replaced
I0113 19:46:08.622329   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578944758-13407", Name:"mock", UID:"3a32ae8c-2282-4e12-91b3-34941398747f", APIVersion:"v1", ResourceVersion:"2971", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-9l7jh
I0113 19:46:08.624779   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578944758-13407", Name:"mock2", UID:"408f3a05-7fb7-474f-95ec-f7e6b72f0607", APIVersion:"v1", ResourceVersion:"2972", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock2-gcmlm
generic-resources.sh:102: Successful get rc mock {{.metadata.labels.status}}: replaced
E0113 19:46:08.868930   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:104: Successful get rc mock2 {{.metadata.labels.status}}: replaced
E0113 19:46:08.996019   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:46:09.124324   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/mock edited
replicationcontroller/mock2 edited
E0113 19:46:09.271826   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:120: Successful get rc mock {{.metadata.labels.status}}: edited
generic-resources.sh:122: Successful get rc mock2 {{.metadata.labels.status}}: edited
replicationcontroller/mock labeled
replicationcontroller/mock2 labeled
generic-resources.sh:140: Successful get rc mock {{.metadata.labels.labeled}}: true
(Bgeneric-resources.sh:142: Successful get rc mock2 {{.metadata.labels.labeled}}: true
E0113 19:46:09.870393   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/mock annotated
replicationcontroller/mock2 annotated
E0113 19:46:09.997225   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:159: Successful get rc mock {{.metadata.annotations.annotated}}: true
E0113 19:46:10.125699   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:161: Successful get rc mock2 {{.metadata.annotations.annotated}}: true
E0113 19:46:10.273116   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller "mock" deleted
replicationcontroller "mock2" deleted
Testing with file hack/testdata/multi-resource-svclist.json and replace with file hack/testdata/multi-resource-svclist-modify.json
generic-resources.sh:63: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: 
generic-resources.sh:64: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
E0113 19:46:10.871720   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/mock created
service/mock2 created
E0113 19:46:10.998399   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:70: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: mock:mock2:
E0113 19:46:11.130573   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
NAME    TYPE        CLUSTER-IP   EXTERNAL-IP   PORT(S)   AGE
mock    ClusterIP   10.0.0.224   <none>        99/TCP    1s
mock2   ClusterIP   10.0.0.24    <none>        99/TCP    1s
E0113 19:46:11.274388   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Name:              mock
Namespace:         namespace-1578944758-13407
Labels:            app=mock
Annotations:       <none>
Selector:          app=mock
Type:              ClusterIP
... skipping 19 lines ...
Events:            <none>
service "mock" deleted
service "mock2" deleted
service/mock replaced
service/mock2 replaced
generic-resources.sh:96: Successful get services mock {{.metadata.labels.status}}: replaced
E0113 19:46:11.873012   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:98: Successful get services mock2 {{.metadata.labels.status}}: replaced
E0113 19:46:11.999829   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:46:12.133306   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/mock edited
service/mock2 edited
E0113 19:46:12.275901   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:114: Successful get services mock {{.metadata.labels.status}}: edited
generic-resources.sh:116: Successful get services mock2 {{.metadata.labels.status}}: edited
service/mock labeled
service/mock2 labeled
generic-resources.sh:134: Successful get services mock {{.metadata.labels.labeled}}: true
generic-resources.sh:136: Successful get services mock2 {{.metadata.labels.labeled}}: true
E0113 19:46:12.874150   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/mock annotated
service/mock2 annotated
E0113 19:46:13.001086   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:153: Successful get services mock {{.metadata.annotations.annotated}}: true
E0113 19:46:13.134395   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:155: Successful get services mock2 {{.metadata.annotations.annotated}}: true
E0113 19:46:13.277157   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service "mock" deleted
service "mock2" deleted
generic-resources.sh:173: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: 
generic-resources.sh:174: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
E0113 19:46:13.875551   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:46:14.002160   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:46:14.135778   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/mock created
replicationcontroller/mock created
I0113 19:46:14.191694   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578944758-13407", Name:"mock", UID:"349065f6-f8a2-445d-b090-d9a8529b49db", APIVersion:"v1", ResourceVersion:"3036", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-fnc4j
E0113 19:46:14.278421   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:180: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: mock:
generic-resources.sh:181: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: mock:
service "mock" deleted
replicationcontroller "mock" deleted
generic-resources.sh:187: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: 
E0113 19:46:14.876857   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:188: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
+++ exit code: 0
Recording: run_persistent_volumes_tests
Running command: run_persistent_volumes_tests
E0113 19:46:15.003689   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource

+++ Running case: test-cmd.run_persistent_volumes_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_persistent_volumes_tests
+++ [0113 19:46:15] Creating namespace namespace-1578944775-11924
E0113 19:46:15.137138   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/namespace-1578944775-11924 created
Context "test" modified.
+++ [0113 19:46:15] Testing persistent volumes
E0113 19:46:15.279905   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
storage.sh:30: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: 
persistentvolume/pv0001 created
storage.sh:33: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: pv0001:
(Bpersistentvolume "pv0001" deleted
E0113 19:46:15.878192   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:46:16.004917   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
persistentvolume/pv0002 created
E0113 19:46:16.047044   54883 pv_protection_controller.go:116] PV pv0002 failed with : Operation cannot be fulfilled on persistentvolumes "pv0002": the object has been modified; please apply your changes to the latest version and try again
E0113 19:46:16.138329   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
storage.sh:36: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: pv0002:
E0113 19:46:16.281089   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
persistentvolume "pv0002" deleted
persistentvolume/pv0003 created
E0113 19:46:16.527350   54883 pv_protection_controller.go:116] PV pv0003 failed with : Operation cannot be fulfilled on persistentvolumes "pv0003": the object has been modified; please apply your changes to the latest version and try again
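The two `pv_protection_controller.go:116` errors above are ordinary optimistic-concurrency conflicts: the controller tried to update a PersistentVolume with a stale resourceVersion, and the API server asked it to re-read and retry. A minimal sketch of that retry pattern with client-go's retry helper follows; the function, the finalizer edit, and the names are illustrative (not the controller's actual code), and the Get/Update signatures differ slightly between client-go releases.

package pvprotect

import (
	"context"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/util/retry"
)

// addProtectionFinalizer repeats its read-modify-write whenever the API server
// reports a resourceVersion conflict, which is what the log lines above show.
func addProtectionFinalizer(ctx context.Context, cs kubernetes.Interface, pvName string) error {
	return retry.RetryOnConflict(retry.DefaultRetry, func() error {
		// Re-read the latest object on every attempt so the update carries a
		// fresh resourceVersion.
		pv, err := cs.CoreV1().PersistentVolumes().Get(ctx, pvName, metav1.GetOptions{})
		if err != nil {
			return err
		}
		pv.Finalizers = append(pv.Finalizers, "kubernetes.io/pv-protection")
		_, err = cs.CoreV1().PersistentVolumes().Update(ctx, pv, metav1.UpdateOptions{})
		return err
	})
}

Because the conflict is expected and self-healing, these messages do not indicate a test failure.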
storage.sh:39: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: pv0003:
(Bpersistentvolume "pv0003" deleted
E0113 19:46:16.879837   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
storage.sh:42: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: 
E0113 19:46:17.006540   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
persistentvolume/pv0001 created
E0113 19:46:17.139537   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
storage.sh:45: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: pv0001:
E0113 19:46:17.282386   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:warning: deleting cluster-scoped resources, not scoped to the provided namespace
persistentvolume "pv0001" deleted
has:warning: deleting cluster-scoped resources
Successful
message:warning: deleting cluster-scoped resources, not scoped to the provided namespace
... skipping 8 lines ...
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_persistent_volume_claims_tests
+++ [0113 19:46:17] Creating namespace namespace-1578944777-15737
namespace/namespace-1578944777-15737 created
Context "test" modified.
+++ [0113 19:46:17] Testing persistent volumes claims
E0113 19:46:17.881050   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
storage.sh:64: Successful get pvc {{range.items}}{{.metadata.name}}:{{end}}: 
E0113 19:46:18.007919   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:46:18.140743   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
persistentvolumeclaim/myclaim-1 created
I0113 19:46:18.189962   54883 event.go:278] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1578944777-15737", Name:"myclaim-1", UID:"1d080545-5c5b-4317-89da-ccad632416ee", APIVersion:"v1", ResourceVersion:"3076", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
I0113 19:46:18.192849   54883 event.go:278] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1578944777-15737", Name:"myclaim-1", UID:"1d080545-5c5b-4317-89da-ccad632416ee", APIVersion:"v1", ResourceVersion:"3077", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
E0113 19:46:18.283868   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
storage.sh:67: Successful get pvc {{range.items}}{{.metadata.name}}:{{end}}: myclaim-1:
(Bpersistentvolumeclaim "myclaim-1" deleted
I0113 19:46:18.429091   54883 event.go:278] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1578944777-15737", Name:"myclaim-1", UID:"1d080545-5c5b-4317-89da-ccad632416ee", APIVersion:"v1", ResourceVersion:"3080", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
persistentvolumeclaim/myclaim-2 created
I0113 19:46:18.649600   54883 event.go:278] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1578944777-15737", Name:"myclaim-2", UID:"64b3b977-03c1-4901-a89e-dbd5120462e1", APIVersion:"v1", ResourceVersion:"3083", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
I0113 19:46:18.652639   54883 event.go:278] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1578944777-15737", Name:"myclaim-2", UID:"64b3b977-03c1-4901-a89e-dbd5120462e1", APIVersion:"v1", ResourceVersion:"3085", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
storage.sh:71: Successful get pvc {{range.items}}{{.metadata.name}}:{{end}}: myclaim-2:
E0113 19:46:18.882270   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
persistentvolumeclaim "myclaim-2" deleted
I0113 19:46:18.885767   54883 event.go:278] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1578944777-15737", Name:"myclaim-2", UID:"64b3b977-03c1-4901-a89e-dbd5120462e1", APIVersion:"v1", ResourceVersion:"3087", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
E0113 19:46:19.008987   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
persistentvolumeclaim/myclaim-3 created
I0113 19:46:19.129574   54883 event.go:278] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1578944777-15737", Name:"myclaim-3", UID:"cca51aee-91c4-4129-87d2-752872ce74b9", APIVersion:"v1", ResourceVersion:"3090", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
I0113 19:46:19.135752   54883 event.go:278] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1578944777-15737", Name:"myclaim-3", UID:"cca51aee-91c4-4129-87d2-752872ce74b9", APIVersion:"v1", ResourceVersion:"3092", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
E0113 19:46:19.141774   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:46:19.284949   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
storage.sh:75: Successful get pvc {{range.items}}{{.metadata.name}}:{{end}}: myclaim-3:
(Bpersistentvolumeclaim "myclaim-3" deleted
I0113 19:46:19.400054   54883 event.go:278] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1578944777-15737", Name:"myclaim-3", UID:"cca51aee-91c4-4129-87d2-752872ce74b9", APIVersion:"v1", ResourceVersion:"3096", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
storage.sh:78: Successful get pvc {{range.items}}{{.metadata.name}}:{{end}}: 
+++ exit code: 0
Recording: run_storage_class_tests
Running command: run_storage_class_tests

+++ Running case: test-cmd.run_storage_class_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_storage_class_tests
+++ [0113 19:46:19] Testing storage class
storage.sh:92: Successful get storageclass {{range.items}}{{.metadata.name}}:{{end}}: 
E0113 19:46:19.883828   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:46:20.010599   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
storageclass.storage.k8s.io/storage-class-name created
E0113 19:46:20.143072   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
storage.sh:108: Successful get storageclass {{range.items}}{{.metadata.name}}:{{end}}: storage-class-name:
E0113 19:46:20.286290   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
storage.sh:109: Successful get sc {{range.items}}{{.metadata.name}}:{{end}}: storage-class-name:
storageclass.storage.k8s.io "storage-class-name" deleted
storage.sh:112: Successful get storageclass {{range.items}}{{.metadata.name}}:{{end}}: 
+++ exit code: 0
Recording: run_nodes_tests
Running command: run_nodes_tests

+++ Running case: test-cmd.run_nodes_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_nodes_tests
+++ [0113 19:46:20] Testing kubectl(v1:nodes)
E0113 19:46:20.885092   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1375: Successful get nodes {{range.items}}{{.metadata.name}}:{{end}}: 127.0.0.1:
E0113 19:46:21.011839   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
matched Name:
matched Labels:
matched CreationTimestamp:
matched Conditions:
matched Addresses:
matched Capacity:
... skipping 41 lines ...
  Resource           Requests  Limits
  --------           --------  ------
  cpu                0 (0%)    0 (0%)
  memory             0 (0%)    0 (0%)
  ephemeral-storage  0 (0%)    0 (0%)
Events:              <none>
E0113 19:46:21.144392   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1379: Successful describe
Name:               127.0.0.1
Roles:              <none>
Labels:             <none>
Annotations:        node.alpha.kubernetes.io/ttl: 0
CreationTimestamp:  Mon, 13 Jan 2020 19:40:59 +0000
... skipping 35 lines ...
  --------           --------  ------
  cpu                0 (0%)    0 (0%)
  memory             0 (0%)    0 (0%)
  ephemeral-storage  0 (0%)    0 (0%)
Events:              <none>
E0113 19:46:21.287606   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1381: Successful describe
Name:               127.0.0.1
Roles:              <none>
Labels:             <none>
Annotations:        node.alpha.kubernetes.io/ttl: 0
CreationTimestamp:  Mon, 13 Jan 2020 19:40:59 +0000
... skipping 180 lines ...
  Resource           Requests  Limits
  --------           --------  ------
  cpu                0 (0%)    0 (0%)
  memory             0 (0%)    0 (0%)
  ephemeral-storage  0 (0%)    0 (0%)
Events:              <none>
E0113 19:46:21.889236   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful describe
Name:               127.0.0.1
Roles:              <none>
Labels:             <none>
Annotations:        node.alpha.kubernetes.io/ttl: 0
CreationTimestamp:  Mon, 13 Jan 2020 19:40:59 +0000
... skipping 33 lines ...
  (Total limits may be over 100 percent, i.e., overcommitted.)
  Resource           Requests  Limits
  --------           --------  ------
  cpu                0 (0%)    0 (0%)
  memory             0 (0%)    0 (0%)
  ephemeral-storage  0 (0%)    0 (0%)
E0113 19:46:22.013180   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful describe
Name:               127.0.0.1
Roles:              <none>
Labels:             <none>
Annotations:        node.alpha.kubernetes.io/ttl: 0
CreationTimestamp:  Mon, 13 Jan 2020 19:40:59 +0000
... skipping 34 lines ...
  Resource           Requests  Limits
  --------           --------  ------
  cpu                0 (0%)    0 (0%)
  memory             0 (0%)    0 (0%)
  ephemeral-storage  0 (0%)    0 (0%)
Events:              <none>
E0113 19:46:22.145636   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1395: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
E0113 19:46:22.288995   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node/127.0.0.1 patched
core.sh:1398: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: true
node/127.0.0.1 patched
core.sh:1401: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
tokenreview.authentication.k8s.io/<unknown> created
E0113 19:46:22.890461   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
tokenreview.authentication.k8s.io/<unknown> created
+++ exit code: 0
Recording: run_authorization_tests
Running command: run_authorization_tests
E0113 19:46:23.014428   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource

+++ Running case: test-cmd.run_authorization_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_authorization_tests
+++ [0113 19:46:23] Testing authorization
E0113 19:46:23.146733   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
subjectaccessreview.authorization.k8s.io/<unknown> created
subjectaccessreview.authorization.k8s.io/<unknown> created
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed

  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100   826  100   524  100   302   255k   147k --:--:-- --:--:-- --:--:--  403k
+++ [0113 19:46:23] "authorization.k8s.io/subjectaccessreviews" returns as expected: {
... skipping 16 lines ...
  },
  "status": {
    "allowed": true,
    "reason": "RBAC: allowed by ClusterRoleBinding \"super-group\" of ClusterRole \"admin\" to Group \"the-group\""
  }
}
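The JSON above is the response to a SubjectAccessReview the test POSTs with curl; the authorizer answers `allowed: true` because an RBAC ClusterRoleBinding grants ClusterRole "admin" to Group "the-group". The same review can be submitted through client-go; the sketch below is hedged: the kubeconfig path, user name, and namespace are placeholders (only the group comes from the log), and the Create signature varies by client-go release.

package main

import (
	"context"
	"fmt"

	authorizationv1 "k8s.io/api/authorization/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Kubeconfig path is illustrative.
	config, err := clientcmd.BuildConfigFromFlags("", "/tmp/admin.kubeconfig")
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(config)
	if err != nil {
		panic(err)
	}

	// Ask the authorizer whether a member of "the-group" may create pods,
	// mirroring the review the test script submits with curl.
	sar := &authorizationv1.SubjectAccessReview{
		Spec: authorizationv1.SubjectAccessReviewSpec{
			User:   "user1",
			Groups: []string{"the-group"},
			ResourceAttributes: &authorizationv1.ResourceAttributes{
				Verb:      "create",
				Resource:  "pods",
				Namespace: "default",
			},
		},
	}
	resp, err := cs.AuthorizationV1().SubjectAccessReviews().Create(context.TODO(), sar, metav1.CreateOptions{})
	if err != nil {
		panic(err)
	}
	fmt.Printf("allowed=%v reason=%q\n", resp.Status.Allowed, resp.Status.Reason)
}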
E0113 19:46:23.290334   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed

  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100   818  100   520  100   298   253k   145k --:--:-- --:--:-- --:--:--  399k
+++ [0113 19:46:23] "authorization.k8s.io/subjectaccessreviews" returns as expected: {
  "kind": "SubjectAccessReview",
  "apiVersion": "authorization.k8s.io/v1",
... skipping 21 lines ...
Successful
message:yes
has:yes
Successful
message:yes
has:yes
E0113 19:46:23.892069   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:Warning: the server doesn't have a resource type 'invalid_resource'
yes
has:the server doesn't have a resource type
E0113 19:46:24.015767   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:yes
has:yes
Successful
message:error: --subresource can not be used with NonResourceURL
has:subresource can not be used with NonResourceURL
E0113 19:46:24.148440   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
E0113 19:46:24.291839   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:yes
0
has:0
Successful
message:0
... skipping 6 lines ...
yes
has:Warning: the server doesn't have a resource type 'foo'
Successful
message:Warning: the server doesn't have a resource type 'foo'
yes
has not:Warning: resource 'foo' is not namespace scoped
E0113 19:46:24.893508   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:yes
has not:Warning
E0113 19:46:25.016962   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:Warning: resource 'nodes' is not namespace scoped
yes
has:Warning: resource 'nodes' is not namespace scoped
E0113 19:46:25.149735   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:yes
has not:Warning: resource 'nodes' is not namespace scoped
E0113 19:46:25.293292   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
clusterrole.rbac.authorization.k8s.io/testing-CR reconciled
	reconciliation required create
	missing rules added:
		{Verbs:[create delete deletecollection get list patch update watch] APIGroups:[] Resources:[pods] ResourceNames:[] NonResourceURLs:[]}
clusterrolebinding.rbac.authorization.k8s.io/testing-CRB reconciled
	reconciliation required create
... skipping 8 lines ...
	missing rules added:
		{Verbs:[get list watch] APIGroups:[] Resources:[configmaps] ResourceNames:[] NonResourceURLs:[]}
legacy-script.sh:821: Successful get rolebindings -n some-other-random -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}: testing-RB:
legacy-script.sh:822: Successful get roles -n some-other-random -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}: testing-R:
legacy-script.sh:823: Successful get clusterrolebindings -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}: testing-CRB:
legacy-script.sh:824: Successful get clusterroles -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}: testing-CR:
E0113 19:46:25.894926   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:error: only rbac.authorization.k8s.io/v1 is supported: not *v1beta1.ClusterRole
has:only rbac.authorization.k8s.io/v1 is supported
E0113 19:46:26.018263   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
rolebinding.rbac.authorization.k8s.io "testing-RB" deleted
role.rbac.authorization.k8s.io "testing-R" deleted
warning: deleting cluster-scoped resources, not scoped to the provided namespace
clusterrole.rbac.authorization.k8s.io "testing-CR" deleted
clusterrolebinding.rbac.authorization.k8s.io "testing-CRB" deleted
Recording: run_retrieve_multiple_tests
Running command: run_retrieve_multiple_tests
E0113 19:46:26.151240   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource

+++ Running case: test-cmd.run_retrieve_multiple_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_retrieve_multiple_tests
Context "test" modified.
+++ [0113 19:46:26] Testing kubectl(v1:multiget)
E0113 19:46:26.295148   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
get.sh:242: Successful get nodes/127.0.0.1 service/kubernetes {{range.items}}{{.metadata.name}}:{{end}}: 127.0.0.1:kubernetes:
+++ exit code: 0
Recording: run_resource_aliasing_tests
Running command: run_resource_aliasing_tests

+++ Running case: test-cmd.run_resource_aliasing_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_resource_aliasing_tests
+++ [0113 19:46:26] Creating namespace namespace-1578944786-8686
namespace/namespace-1578944786-8686 created
Context "test" modified.
+++ [0113 19:46:26] Testing resource aliasing
E0113 19:46:26.896301   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/cassandra created
I0113 19:46:26.973599   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578944786-8686", Name:"cassandra", UID:"a8af2a00-f750-4d0a-bf23-405c5cd331d3", APIVersion:"v1", ResourceVersion:"3124", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: cassandra-jt4zx
I0113 19:46:26.976788   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578944786-8686", Name:"cassandra", UID:"a8af2a00-f750-4d0a-bf23-405c5cd331d3", APIVersion:"v1", ResourceVersion:"3124", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: cassandra-xxxhw
E0113 19:46:27.019893   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:46:27.152666   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/cassandra created
E0113 19:46:27.296542   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Waiting for Get all -l'app=cassandra' {{range.items}}{{range .metadata.labels}}{{.}}:{{end}}{{end}} : expected: cassandra:cassandra:cassandra:cassandra::, got: cassandra:cassandra:cassandra:cassandra:

discovery.sh:91: FAIL!
Get all -l'app=cassandra' {{range.items}}{{range .metadata.labels}}{{.}}:{{end}}{{end}}
  Expected: cassandra:cassandra:cassandra:cassandra::
  Got:      cassandra:cassandra:cassandra:cassandra:
55 /home/prow/go/src/k8s.io/kubernetes/hack/lib/test.sh
discovery.sh:92: Successful get all -l'app=cassandra' {{range.items}}{{range .metadata.labels}}{{.}}:{{end}}{{end}}: cassandra:cassandra:cassandra:cassandra:
(Bpod "cassandra-jt4zx" deleted
I0113 19:46:27.679719   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578944786-8686", Name:"cassandra", UID:"a8af2a00-f750-4d0a-bf23-405c5cd331d3", APIVersion:"v1", ResourceVersion:"3130", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: cassandra-7pzpc
pod "cassandra-xxxhw" deleted
I0113 19:46:27.692880   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578944786-8686", Name:"cassandra", UID:"a8af2a00-f750-4d0a-bf23-405c5cd331d3", APIVersion:"v1", ResourceVersion:"3140", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: cassandra-zzbzb
replicationcontroller "cassandra" deleted
E0113 19:46:27.704497   54883 replica_set.go:534] sync "namespace-1578944786-8686/cassandra" failed with replicationcontrollers "cassandra" not found
service "cassandra" deleted
+++ exit code: 0
Recording: run_kubectl_explain_tests
Running command: run_kubectl_explain_tests

+++ Running case: test-cmd.run_kubectl_explain_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_kubectl_explain_tests
+++ [0113 19:46:27] Testing kubectl(v1:explain)
E0113 19:46:27.897500   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:46:28.021149   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
KIND:     Pod
VERSION:  v1

DESCRIPTION:
     Pod is a collection of containers that can run on a host. This resource is
     created by clients and scheduled onto hosts.
... skipping 21 lines ...

   status	<Object>
     Most recently observed status of the pod. This data may not be up to date.
     Populated by the system. Read-only. More info:
     https://git.k8s.io/community/contributors/devel/sig-architecture/api-conventions.md#spec-and-status

E0113 19:46:28.153760   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
KIND:     Pod
VERSION:  v1

DESCRIPTION:
     Pod is a collection of containers that can run on a host. This resource is
     created by clients and scheduled onto hosts.
... skipping 21 lines ...

   status	<Object>
     Most recently observed status of the pod. This data may not be up to date.
     Populated by the system. Read-only. More info:
     https://git.k8s.io/community/contributors/devel/sig-architecture/api-conventions.md#spec-and-status

E0113 19:46:28.297909   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
KIND:     Pod
VERSION:  v1

FIELD:    message <string>

DESCRIPTION:
... skipping 35 lines ...
Recording: run_swagger_tests
Running command: run_swagger_tests

+++ Running case: test-cmd.run_swagger_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_swagger_tests
E0113 19:46:28.898687   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ [0113 19:46:28] Testing swagger
E0113 19:46:29.022493   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ exit code: 0
Recording: run_kubectl_sort_by_tests
Running command: run_kubectl_sort_by_tests
E0113 19:46:29.155216   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource

+++ Running case: test-cmd.run_kubectl_sort_by_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_kubectl_sort_by_tests
+++ [0113 19:46:29] Testing kubectl --sort-by
E0113 19:46:29.299100   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
get.sh:256: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
No resources found in namespace-1578944786-8686 namespace.
No resources found in namespace-1578944786-8686 namespace.
get.sh:264: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
E0113 19:46:29.900014   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
pod/valid-pod created
E0113 19:46:30.023901   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
get.sh:268: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
E0113 19:46:30.159795   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:NAME        READY   STATUS    RESTARTS   AGE
valid-pod   0/1     Pending   0          1s
has:valid-pod
E0113 19:46:30.300438   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:I0113 19:46:30.288332   86759 loader.go:375] Config loaded from file:  /tmp/tmp.clEhSV5l6N/.kube/config
I0113 19:46:30.301808   86759 round_trippers.go:420] GET http://localhost:8080/api/v1/namespaces/namespace-1578944786-8686/pods?includeObject=Object
I0113 19:46:30.301836   86759 round_trippers.go:427] Request Headers:
I0113 19:46:30.301842   86759 round_trippers.go:431]     User-Agent: kubectl/v1.18.0 (linux/amd64) kubernetes/e265afa
I0113 19:46:30.301846   86759 round_trippers.go:431]     Accept: application/json;as=Table;v=v1;g=meta.k8s.io,application/json;as=Table;v=v1beta1;g=meta.k8s.io,application/json
... skipping 23 lines ...
has:includeObject=Object
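The request headers captured above show how `kubectl get` obtains its human-readable table: it negotiates a server-side Table rendering via the Accept header and the `includeObject` query parameter. A rough equivalent with client-go's REST client is sketched below; the kubeconfig path and namespace are placeholders, the header and parameter values are copied from the log, and the Do(ctx) call uses the newer client-go signature.

package main

import (
	"context"
	"fmt"

	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Kubeconfig path is illustrative.
	config, err := clientcmd.BuildConfigFromFlags("", "/tmp/admin.kubeconfig")
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(config)
	if err != nil {
		panic(err)
	}

	// Ask for the Table representation, the same content negotiation kubectl
	// performs for `kubectl get`.
	raw, err := cs.CoreV1().RESTClient().Get().
		Namespace("default").
		Resource("pods").
		SetHeader("Accept", "application/json;as=Table;v=v1;g=meta.k8s.io,application/json").
		Param("includeObject", "Object").
		Do(context.TODO()).
		Raw()
	if err != nil {
		panic(err)
	}
	fmt.Println(string(raw))
}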
get.sh:279: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
pod "valid-pod" force deleted
get.sh:283: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
get.sh:288: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
E0113 19:46:30.901336   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:46:31.025324   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:46:31.162200   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
pod/sorted-pod1 created
E0113 19:46:31.301958   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
get.sh:292: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: sorted-pod1:
(Bpod/sorted-pod2 created
get.sh:296: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: sorted-pod1:sorted-pod2:
E0113 19:46:31.902626   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:46:32.026584   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
pod/sorted-pod3 created
E0113 19:46:32.163763   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
get.sh:300: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: sorted-pod1:sorted-pod2:sorted-pod3:
E0113 19:46:32.303636   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:sorted-pod1:sorted-pod2:sorted-pod3:
has:sorted-pod1:sorted-pod2:sorted-pod3:
Successful
message:sorted-pod3:sorted-pod2:sorted-pod1:
has:sorted-pod3:sorted-pod2:sorted-pod1:
... skipping 20 lines ...
I0113 19:46:32.792880   87020 request.go:1022] Response Body: {"kind":"PodList","apiVersion":"v1","metadata":{"selfLink":"/api/v1/namespaces/namespace-1578944786-8686/pods","resourceVersion":"3163"},"items":[{"metadata":{"name":"sorted-pod1","namespace":"namespace-1578944786-8686","selfLink":"/api/v1/namespaces/namespace-1578944786-8686/pods/sorted-pod1","uid":"4f7287b4-6370-43a3-a3e2-ce8e815b02a7","resourceVersion":"3159","creationTimestamp":"2020-01-13T19:46:31Z","labels":{"name":"sorted-pod3-label"}},"spec":{"containers":[{"name":"kubernetes-pause2","image":"k8s.gcr.io/pause:2.0","resources":{},"terminationMessagePath":"/dev/termination-log","terminationMessagePolicy":"File","imagePullPolicy":"IfNotPresent"}],"restartPolicy":"Always","terminationGracePeriodSeconds":30,"dnsPolicy":"ClusterFirst","securityContext":{},"schedulerName":"default-scheduler","priority":0,"enableServiceLinks":true},"status":{"phase":"Pending","qosClass":"BestEffort"}},{"metadata":{"name":"sorted-pod2","namespace":"namespace-1578944786-8686","selfLink":"/api/v1/namespaces/namespace-1578944786- [truncated 1380 chars]
NAME          AGE
sorted-pod2   1s
sorted-pod1   1s
sorted-pod3   0s
has not:Table
E0113 19:46:32.903985   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
get.sh:325: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: sorted-pod1:sorted-pod2:sorted-pod3:
E0113 19:46:33.028094   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
pod "sorted-pod1" force deleted
pod "sorted-pod2" force deleted
pod "sorted-pod3" force deleted
E0113 19:46:33.167610   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
get.sh:329: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
+++ exit code: 0
Recording: run_kubectl_all_namespace_tests
Running command: run_kubectl_all_namespace_tests

E0113 19:46:33.304972   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ Running case: test-cmd.run_kubectl_all_namespace_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_kubectl_all_namespace_tests
+++ [0113 19:46:33] Testing kubectl --all-namespace
get.sh:342: Successful get namespaces {{range.items}}{{if eq .metadata.name \"default\"}}{{.metadata.name}}:{{end}}{{end}}: default:
get.sh:346: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
(Bpod/valid-pod created
E0113 19:46:33.905212   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
get.sh:350: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
E0113 19:46:34.029335   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
NAMESPACE                   NAME        READY   STATUS    RESTARTS   AGE
namespace-1578944786-8686   valid-pod   0/1     Pending   0          1s
namespace/all-ns-test-1 created
E0113 19:46:34.168969   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
serviceaccount/test created
E0113 19:46:34.306414   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/all-ns-test-2 created
serviceaccount/test created
Successful
message:NAMESPACE                    NAME      SECRETS   AGE
all-ns-test-1                default   0         0s
all-ns-test-1                test      0         0s
... skipping 115 lines ...
namespace-1578944775-11924   default   0         19s
namespace-1578944777-15737   default   0         17s
namespace-1578944786-8686    default   0         8s
some-other-random            default   0         9s
has:all-ns-test-2
namespace "all-ns-test-1" deleted
E0113 19:46:34.906538   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:46:35.030524   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:46:35.170428   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:46:35.307935   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:46:35.908088   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:46:36.031972   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:46:36.171881   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:46:36.309212   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:46:36.909451   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:46:37.033483   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:46:37.173113   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:46:37.310206   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:46:37.910747   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:46:38.034962   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:46:38.174240   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:46:38.311664   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:46:38.911947   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:46:39.036341   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:46:39.175732   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:46:39.312709   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:46:39.912734   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:46:40.037574   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace "all-ns-test-2" deleted
E0113 19:46:40.176941   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:46:40.313878   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:46:40.913885   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:46:41.038754   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:46:41.178282   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:46:41.315300   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:46:41.915147   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:46:42.040381   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:46:42.179713   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:46:42.316833   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:46:42.916273   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:46:43.041598   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:46:43.181130   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:46:43.318045   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:46:43.917516   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:46:44.043180   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:46:44.182550   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:46:44.319590   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:46:44.918766   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0113 19:46:44.940370   54883 namespace_controller.go:185] Namespace has been deleted all-ns-test-1
E0113 19:46:45.044342   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:46:45.183612   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:46:45.324213   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
get.sh:376: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
pod "valid-pod" force deleted
get.sh:380: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
get.sh:384: Successful get nodes {{range.items}}{{.metadata.name}}:{{end}}: 127.0.0.1:
Successful
... skipping 2 lines ...
has not:NAMESPACE
+++ exit code: 0
Recording: run_template_output_tests
Running command: run_template_output_tests

+++ Running case: test-cmd.run_template_output_tests 
E0113 19:46:45.920055   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_template_output_tests
+++ [0113 19:46:45] Testing --template support on commands
+++ [0113 19:46:45] Creating namespace namespace-1578944805-19615
namespace/namespace-1578944805-19615 created
E0113 19:46:46.045258   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Context "test" modified.
E0113 19:46:46.184816   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
template-output.sh:29: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
E0113 19:46:46.325429   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
pod/valid-pod created
{
    "apiVersion": "v1",
    "items": [
        {
            "apiVersion": "v1",
... skipping 99 lines ...
    }
}
template-output.sh:35: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
Successful
message:valid-pod:
has:valid-pod:
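The `--template` assertions in this block use the same text/template syntax kubectl accepts for `-o go-template`, for example `{{range .items}}{{.metadata.name}}:{{end}}`. The tiny standalone program below (with made-up data) shows how such a template flattens a list into the colon-separated name string the tests compare against.

package main

import (
	"os"
	"text/template"
)

func main() {
	// A stand-in for the object list kubectl would feed the template.
	podList := map[string]interface{}{
		"items": []map[string]interface{}{
			{"metadata": map[string]interface{}{"name": "valid-pod"}},
			{"metadata": map[string]interface{}{"name": "sorted-pod1"}},
		},
	}

	// Same shape of template the test harness passes to kubectl.
	tmpl := template.Must(template.New("names").Parse(
		`{{range .items}}{{.metadata.name}}:{{end}}`))

	// Prints "valid-pod:sorted-pod1:", the form the assertions above match on.
	if err := tmpl.Execute(os.Stdout, podList); err != nil {
		panic(err)
	}
}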
E0113 19:46:46.921286   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:valid-pod:
has:valid-pod:
E0113 19:46:47.046338   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:valid-pod:
has:valid-pod:
E0113 19:46:47.186216   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:valid-pod:
has:valid-pod:
E0113 19:46:47.326999   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:valid-pod:
has:valid-pod:
Successful
message:scale-1:
has:scale-1:
... skipping 2 lines ...
has:redis-slave:
kubectl convert is DEPRECATED and will be removed in a future version.
In order to convert, kubectl apply the object to the cluster, then kubectl get at the desired version.
Successful
message:nginx:
has:nginx:
E0113 19:46:47.922493   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
kubectl run --generator=job/v1 is DEPRECATED and will be removed in a future version. Use kubectl run --generator=run-pod/v1 or kubectl create instead.
Successful
message:pi:
has:pi:
E0113 19:46:48.047699   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:127.0.0.1:
has:127.0.0.1:
E0113 19:46:48.187535   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node/127.0.0.1 untainted
E0113 19:46:48.328266   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/cassandra created
I0113 19:46:48.459871   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578944805-19615", Name:"cassandra", UID:"5cc07f24-4072-472a-a0c2-4aee5a1247a1", APIVersion:"v1", ResourceVersion:"3212", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: cassandra-58w5p
I0113 19:46:48.468427   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578944805-19615", Name:"cassandra", UID:"5cc07f24-4072-472a-a0c2-4aee5a1247a1", APIVersion:"v1", ResourceVersion:"3212", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: cassandra-hxfks
Successful
message:cassandra:
has:cassandra:
... skipping 12 lines ...
Successful
message:testing-CR:testing-CRB:testing-RB:testing-R:
has:testing-CR:testing-CRB:testing-RB:testing-R:
Successful
message:myclusterrole:
has:myclusterrole:
E0113 19:46:48.924019   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:foo:
has:foo:
E0113 19:46:49.049182   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:cm:
has:cm:
E0113 19:46:49.188825   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0113 19:46:49.212762   54883 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578944805-19615", Name:"deploy", UID:"f1272ec3-88f9-4e58-9968-f51a61df673a", APIVersion:"apps/v1", ResourceVersion:"3221", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set deploy-74bcc58696 to 1
I0113 19:46:49.218537   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944805-19615", Name:"deploy-74bcc58696", UID:"eb715dad-a0c9-4da0-a70e-7851b0ffbcea", APIVersion:"apps/v1", ResourceVersion:"3222", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: deploy-74bcc58696-ks72v
Successful
message:deploy:
has:deploy:
E0113 19:46:49.329724   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
cronjob.batch/pi created
Successful
message:foo:
has:foo:
Successful
message:bar:
has:bar:
Successful
message:foo:
has:foo:
Successful
message:myrole:
has:myrole:
E0113 19:46:49.925429   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:foo:
has:foo:
E0113 19:46:50.050520   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:foo:
has:foo:
E0113 19:46:50.189986   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0113 19:46:50.193590   54883 namespace_controller.go:185] Namespace has been deleted all-ns-test-2
Successful
message:foo:
has:foo:
E0113 19:46:50.331248   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:foo:
has:foo:
Successful
message:valid-pod:
has:valid-pod:
... skipping 6 lines ...
Successful
message:kubernetes:
has:kubernetes:
Successful
message:valid-pod:
has:valid-pod:
E0113 19:46:50.926784   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:foo:
has:foo:
E0113 19:46:51.051920   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:foo:
has:foo:
E0113 19:46:51.191556   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:foo:
has:foo:
Successful
message:foo:
has:foo:
E0113 19:46:51.332549   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:foo:
has:foo:
Successful
message:foo:
has:foo:
... skipping 21 lines ...
preferences: {}
users: null
has:kind: Config
Successful
message:deploy:
has:deploy:
E0113 19:46:51.928270   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:deploy:
has:deploy:
E0113 19:46:52.053184   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:deploy:
has:deploy:
E0113 19:46:52.192785   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:deploy:
has:deploy:
E0113 19:46:52.333776   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:Config:
has:Config
Successful
message:apiVersion: v1
kind: ConfigMap
... skipping 7 lines ...
pod "cassandra-hxfks" deleted
I0113 19:46:52.695686   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578944805-19615", Name:"cassandra", UID:"5cc07f24-4072-472a-a0c2-4aee5a1247a1", APIVersion:"v1", ResourceVersion:"3218", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: cassandra-llvqv
pod "deploy-74bcc58696-ks72v" deleted
I0113 19:46:52.702277   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944805-19615", Name:"deploy-74bcc58696", UID:"eb715dad-a0c9-4da0-a70e-7851b0ffbcea", APIVersion:"apps/v1", ResourceVersion:"3228", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: deploy-74bcc58696-lrmkq
pod "valid-pod" deleted
replicationcontroller "cassandra" deleted
E0113 19:46:52.929354   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
clusterrole.rbac.authorization.k8s.io "myclusterrole" deleted
clusterrolebinding.rbac.authorization.k8s.io "foo" deleted
E0113 19:46:53.054498   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps "deploy" deleted
+++ exit code: 0
E0113 19:46:53.194098   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Recording: run_certificates_tests
Running command: run_certificates_tests

+++ Running case: test-cmd.run_certificates_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_certificates_tests
+++ [0113 19:46:53] Testing certificates
E0113 19:46:53.335107   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificatesigningrequest.certificates.k8s.io/foo created
certificate.sh:29: Successful get csr/foo {{range.status.conditions}}{{.type}}{{end}}: 
certificatesigningrequest.certificates.k8s.io/foo approved
{
    "apiVersion": "v1",
    "items": [
... skipping 34 lines ...
    "kind": "List",
    "metadata": {
        "resourceVersion": "",
        "selfLink": ""
    }
}
E0113 19:46:53.930642   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificate.sh:32: Successful get csr/foo {{range.status.conditions}}{{.type}}{{end}}: Approved
E0113 19:46:54.056208   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificatesigningrequest.certificates.k8s.io "foo" deleted
E0113 19:46:54.195565   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificate.sh:34: Successful get csr {{range.items}}{{.metadata.name}}{{end}}: 
E0113 19:46:54.336250   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificatesigningrequest.certificates.k8s.io/foo created
certificate.sh:37: Successful get csr/foo {{range.status.conditions}}{{.type}}{{end}}: 
certificatesigningrequest.certificates.k8s.io/foo approved
{
    "apiVersion": "v1",
    "items": [
... skipping 53 lines ...
    "metadata": {
        "resourceVersion": "",
        "selfLink": ""
    }
}
certificate.sh:40: Successful get csr/foo {{range.status.conditions}}{{.type}}{{end}}: Approved
E0113 19:46:54.931843   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificatesigningrequest.certificates.k8s.io "foo" deleted
E0113 19:46:55.057443   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificate.sh:42: Successful get csr {{range.items}}{{.metadata.name}}{{end}}: 
E0113 19:46:55.196829   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:46:55.337549   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificatesigningrequest.certificates.k8s.io/foo created
certificate.sh:46: Successful get csr/foo {{range.status.conditions}}{{.type}}{{end}}: 
certificatesigningrequest.certificates.k8s.io/foo denied
{
    "apiVersion": "v1",
    "items": [
... skipping 54 lines ...
        "resourceVersion": "",
        "selfLink": ""
    }
}
certificate.sh:49: Successful get csr/foo {{range.status.conditions}}{{.type}}{{end}}: Denied
certificatesigningrequest.certificates.k8s.io "foo" deleted
E0113 19:46:55.933501   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificate.sh:51: Successful get csr {{range.items}}{{.metadata.name}}{{end}}: 
E0113 19:46:56.058641   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:46:56.198167   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificatesigningrequest.certificates.k8s.io/foo created
E0113 19:46:56.339326   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificate.sh:54: Successful get csr/foo {{range.status.conditions}}{{.type}}{{end}}: 
certificatesigningrequest.certificates.k8s.io/foo denied
{
    "apiVersion": "v1",
    "items": [
        {
... skipping 54 lines ...
        "selfLink": ""
    }
}
certificate.sh:57: Successful get csr/foo {{range.status.conditions}}{{.type}}{{end}}: Denied
certificatesigningrequest.certificates.k8s.io "foo" deleted
certificate.sh:59: Successful get csr {{range.items}}{{.metadata.name}}{{end}}: 
E0113 19:46:56.934803   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ exit code: 0
Recording: run_cluster_management_tests
Running command: run_cluster_management_tests

+++ Running case: test-cmd.run_cluster_management_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_cluster_management_tests
E0113 19:46:57.060083   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ [0113 19:46:57] Testing cluster-management commands
node-management.sh:27: Successful get nodes {{range.items}}{{.metadata.name}}:{{end}}: 127.0.0.1:
E0113 19:46:57.199593   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:46:57.340640   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
pod/test-pod-1 created
pod/test-pod-2 created
node-management.sh:76: Successful get nodes 127.0.0.1 {{range .spec.taints}}{{if eq .key \"dedicated\"}}{{.key}}={{.value}}:{{.effect}}{{end}}{{end}}: 
node/127.0.0.1 tainted
E0113 19:46:57.936092   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node-management.sh:79: Successful get nodes 127.0.0.1 {{range .spec.taints}}{{if eq .key \"dedicated\"}}{{.key}}={{.value}}:{{.effect}}{{end}}{{end}}: dedicated=foo:PreferNoSchedule
E0113 19:46:58.061395   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node/127.0.0.1 untainted
E0113 19:46:58.200799   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node-management.sh:83: Successful get nodes 127.0.0.1 {{range .spec.taints}}{{if eq .key \"dedicated\"}}{{.key}}={{.value}}:{{.effect}}{{end}}{{end}}: 
E0113 19:46:58.342067   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node-management.sh:87: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
node/127.0.0.1 cordoned (dry run)
node-management.sh:89: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
node-management.sh:93: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
node/127.0.0.1 cordoned (dry run)
node/127.0.0.1 drained (dry run)
node-management.sh:96: Successful get nodes {{range.items}}{{.metadata.name}}:{{end}}: 127.0.0.1:
E0113 19:46:58.937311   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node-management.sh:97: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
E0113 19:46:59.062778   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node-management.sh:101: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
E0113 19:46:59.202194   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node-management.sh:103: Successful get pods {{range .items}}{{.metadata.name}},{{end}}: test-pod-1,test-pod-2,
E0113 19:46:59.343502   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node/127.0.0.1 cordoned
node/127.0.0.1 drained
node-management.sh:106: Successful get pods/test-pod-2 {{.metadata.name}}: test-pod-2
(Bpod "test-pod-2" deleted
node/127.0.0.1 uncordoned
E0113 19:46:59.938974   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node-management.sh:111: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
E0113 19:47:00.064144   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node-management.sh:115: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
E0113 19:47:00.203739   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:node/127.0.0.1 already uncordoned (dry run)
has:already uncordoned
E0113 19:47:00.344947   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node-management.sh:119: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
(Bnode/127.0.0.1 labeled
node-management.sh:124: Successful get nodes 127.0.0.1 {{.metadata.labels.test}}: label
Successful
message:error: cannot specify both a node name and a --selector option
See 'kubectl drain -h' for help and examples
has:cannot specify both a node name
Successful
message:error: USAGE: cordon NODE [flags]
See 'kubectl cordon -h' for help and examples
has:error\: USAGE\: cordon NODE
node/127.0.0.1 already uncordoned
E0113 19:47:00.940290   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:error: You must provide one or more resources by argument or filename.
Example resource specifications include:
   '-f rsrc.yaml'
   '--filename=rsrc.json'
   '<resource> <name>'
   '<resource>'
has:must provide one or more resources
E0113 19:47:01.065410   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:node/127.0.0.1 cordoned
has:node/127.0.0.1 cordoned
E0113 19:47:01.205154   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:
has not:cordoned
E0113 19:47:01.346311   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node-management.sh:145: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: true
+++ exit code: 0
Recording: run_plugins_tests
Running command: run_plugins_tests

+++ Running case: test-cmd.run_plugins_tests 
... skipping 3 lines ...
Successful
message:The following compatible plugins are available:

test/fixtures/pkg/kubectl/plugins/version/kubectl-version
  - warning: kubectl-version overwrites existing command: "kubectl version"

error: one plugin warning was found
has:kubectl-version overwrites existing command: "kubectl version"
Successful
message:The following compatible plugins are available:

test/fixtures/pkg/kubectl/plugins/kubectl-foo
test/fixtures/pkg/kubectl/plugins/foo/kubectl-foo
  - warning: test/fixtures/pkg/kubectl/plugins/foo/kubectl-foo is overshadowed by a similarly named plugin: test/fixtures/pkg/kubectl/plugins/kubectl-foo

error: one plugin warning was found
has:test/fixtures/pkg/kubectl/plugins/foo/kubectl-foo is overshadowed by a similarly named plugin
Successful
message:The following compatible plugins are available:

test/fixtures/pkg/kubectl/plugins/kubectl-foo
has:plugins are available
E0113 19:47:01.941494   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:Unable read directory "test/fixtures/pkg/kubectl/plugins/empty" from your PATH: open test/fixtures/pkg/kubectl/plugins/empty: no such file or directory. Skipping...
error: unable to find any kubectl plugins in your PATH
has:unable to find any kubectl plugins in your PATH
Successful
message:I am plugin foo
has:plugin foo
E0113 19:47:02.067579   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:I am plugin bar called with args test/fixtures/pkg/kubectl/plugins/bar/kubectl-bar arg1
has:test/fixtures/pkg/kubectl/plugins/bar/kubectl-bar arg1
E0113 19:47:02.206581   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:Client Version: version.Info{Major:"1", Minor:"18+", GitVersion:"v1.18.0-alpha.1.638+e265afa2cdfb2b", GitCommit:"e265afa2cdfb2b08c05aa3aeddaacdd26f22746e", GitTreeState:"clean", BuildDate:"2020-01-13T17:37:38Z", GoVersion:"go1.13.5", Compiler:"gc", Platform:"linux/amd64"}
has:Client Version
Successful
message:Client Version: version.Info{Major:"1", Minor:"18+", GitVersion:"v1.18.0-alpha.1.638+e265afa2cdfb2b", GitCommit:"e265afa2cdfb2b08c05aa3aeddaacdd26f22746e", GitTreeState:"clean", BuildDate:"2020-01-13T17:37:38Z", GoVersion:"go1.13.5", Compiler:"gc", Platform:"linux/amd64"}
has not:overshadows an existing plugin
+++ exit code: 0
Recording: run_impersonation_tests
Running command: run_impersonation_tests
E0113 19:47:02.347743   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource

+++ Running case: test-cmd.run_impersonation_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_impersonation_tests
+++ [0113 19:47:02] Testing impersonation
Successful
message:error: requesting groups or user-extra for  without impersonating a user
has:without impersonating a user
certificatesigningrequest.certificates.k8s.io/foo created
E0113 19:47:02.942687   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
authorization.sh:68: Successful get csr/foo {{.spec.username}}: user1
E0113 19:47:03.068841   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
authorization.sh:69: Successful get csr/foo {{range .spec.groups}}{{.}}{{end}}: system:authenticated
certificatesigningrequest.certificates.k8s.io "foo" deleted
E0113 19:47:03.208206   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:47:03.349183   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificatesigningrequest.certificates.k8s.io/foo created
authorization.sh:74: Successful get csr/foo {{len .spec.groups}}: 3
authorization.sh:75: Successful get csr/foo {{range .spec.groups}}{{.}} {{end}}: group2 group1 ,,,chameleon 
certificatesigningrequest.certificates.k8s.io "foo" deleted
+++ exit code: 0
Recording: run_wait_tests
Running command: run_wait_tests

+++ Running case: test-cmd.run_wait_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_wait_tests
+++ [0113 19:47:03] Testing kubectl wait
+++ [0113 19:47:03] Creating namespace namespace-1578944823-10999
E0113 19:47:03.944052   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/namespace-1578944823-10999 created
E0113 19:47:04.070202   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Context "test" modified.
deployment.apps/test-1 created
I0113 19:47:04.184978   54883 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578944823-10999", Name:"test-1", UID:"43a600db-3c29-4095-b05a-e40f77cff7a1", APIVersion:"apps/v1", ResourceVersion:"3318", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set test-1-6d98955cc9 to 1
I0113 19:47:04.194698   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944823-10999", Name:"test-1-6d98955cc9", UID:"687f31f8-3b1f-4901-95e6-128c1c8b00c8", APIVersion:"apps/v1", ResourceVersion:"3319", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test-1-6d98955cc9-xfpjd
E0113 19:47:04.209798   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/test-2 created
I0113 19:47:04.289809   54883 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578944823-10999", Name:"test-2", UID:"0fed3f09-0cf0-49a8-b95e-a047f6ea432c", APIVersion:"apps/v1", ResourceVersion:"3328", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set test-2-65897ff84d to 1
I0113 19:47:04.292923   54883 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578944823-10999", Name:"test-2-65897ff84d", UID:"cd6f5045-d96d-4c49-9b16-31aaa2ad7f15", APIVersion:"apps/v1", ResourceVersion:"3329", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test-2-65897ff84d-bftv9
E0113 19:47:04.350533   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
wait.sh:36: Successful get deployments {{range .items}}{{.metadata.name}},{{end}}: test-1,test-2,
E0113 19:47:04.945371   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:47:05.071849   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:47:05.210998   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:47:05.351949   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:47:05.946733   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:47:06.073174   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:47:06.212321   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 19:47:06.353293   54883 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps "test-1" deleted
deployment.apps "test-2" deleted
Successful
message:deployment.apps/test-1 condition met
deployment.apps/test-2 condition met
has:test-1 condition met
... skipping 28 lines ...
I0113 19:47:06.916600   51369 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
I0113 19:47:06.916625   51369 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
I0113 19:47:06.916727   51369 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
I0113 19:47:06.916956   51369 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
I0113 19:47:06.916991   51369 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
I0113 19:47:06.917034   51369 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
W0113 19:47:06.917244   51369 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
I0113 19:47:06.917262   51369 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
I0113 19:47:06.917397   51369 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
I0113 19:47:06.917421   51369 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
W0113 19:47:06.917450   51369 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
I0113 19:47:06.917529   51369 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
I0113 19:47:06.917545   51369 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
W0113 19:47:06.917611   51369 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
I0113 19:47:06.917624   51369 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
W0113 19:47:06.917674   51369 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0113 19:47:06.917676   51369 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
I0113 19:47:06.917726   51369 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
I0113 19:47:06.917798   51369 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
I0113 19:47:06.917807   51369 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
W0113 19:47:06.917850   51369 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
I0113 19:47:06.917861   51369 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
W0113 19:47:06.917906   51369 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
I0113 19:47:06.918141   51369 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
I0113 19:47:06.918157   51369 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
I0113 19:47:06.918261   51369 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
I0113 19:47:06.918285   51369 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
I0113 19:47:06.918356   51369 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
I0113 19:47:06.918401   51369 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
... skipping 4 lines ...
I0113 19:47:06.918811   51369 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
I0113 19:47:06.918978   51369 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
I0113 19:47:06.919107   51369 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
I0113 19:47:06.919188   51369 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
I0113 19:47:06.919190   51369 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
I0113 19:47:06.919350   51369 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
W0113 19:47:06.919395   51369 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0113 19:47:06.919414   51369 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0113 19:47:06.919453   51369 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0113 19:47:06.919480   51369 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0113 19:47:06.919500   51369 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0113 19:47:06.919534   51369 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0113 19:47:06.919556   51369 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0113 19:47:06.919593   51369 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0113 19:47:06.919606   51369 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0113 19:47:06.919633   51369 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0113 19:47:06.919639   51369 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0113 19:47:06.919651   51369 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0113 19:47:06.919672   51369 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0113 19:47:06.919697   51369 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0113 19:47:06.919701   51369 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0113 19:47:06.919709   51369 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0113 19:47:06.919721   51369 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0113 19:47:06.919748   51369 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0113 19:47:06.919773   51369 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0113 19:47:06.919642   51369 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0113 19:47:06.919799   51369 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0113 19:47:06.919847   51369 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
I0113 19:47:06.919856   51369 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
I0113 19:47:06.919865   51369 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
W0113 19:47:06.919892   51369 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
I0113 19:47:06.919935   51369 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
I0113 19:47:06.919940   51369 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
I0113 19:47:06.919963   51369 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
I0113 19:47:06.920012   51369 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
I0113 19:47:06.920015   51369 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
I0113 19:47:06.920036   51369 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
I0113 19:47:06.920044   51369 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
I0113 19:47:06.920062   51369 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
I0113 19:47:06.920109   51369 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
E0113 19:47:06.920128   51369 controller.go:183] rpc error: code = Unavailable desc = transport is closing
I0113 19:47:06.920156   51369 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
I0113 19:47:06.920167   51369 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
junit report dir: /logs/artifacts
+++ [0113 19:47:06] Clean up complete
+ make test-integration
+++ [0113 19:47:12] Checking etcd is on PATH
... skipping 313 lines ...
    synthetic_master_test.go:721: UPDATE_NODE_APISERVER is not set

=== SKIP: test/integration/scheduler_perf TestSchedule100Node3KPods (0.00s)
    scheduler_test.go:73: Skipping because we want to run short tests


=== Failed
=== FAIL: test/integration/client TestDynamicClient (8.33s)
I0113 19:50:03.088236  106636 controller.go:123] Shutting down OpenAPI controller
I0113 19:50:03.088257  106636 crdregistration_controller.go:142] Shutting down crd-autoregister controller
I0113 19:50:03.088285  106636 cluster_authentication_trust_controller.go:463] Shutting down cluster_authentication_trust_controller controller
I0113 19:50:03.088301  106636 autoregister_controller.go:164] Shutting down autoregister controller
I0113 19:50:03.088317  106636 naming_controller.go:300] Shutting down NamingConditionController
I0113 19:50:03.088330  106636 nonstructuralschema_controller.go:197] Shutting down NonStructuralSchemaConditionController
... skipping 7 lines ...
I0113 19:50:03.088994  106636 secure_serving.go:222] Stopped listening on 127.0.0.1:37887
I0113 19:50:03.089012  106636 tlsconfig.go:256] Shutting down DynamicServingCertificateController
I0113 19:50:03.089033  106636 dynamic_serving_content.go:144] Shutting down serving-cert::/tmp/kubernetes-kube-apiserver874119105/apiserver.crt::/tmp/kubernetes-kube-apiserver874119105/apiserver.key
I0113 19:50:03.089064  106636 dynamic_cafile_content.go:181] Shutting down client-ca-bundle::/tmp/kubernetes-kube-apiserver874119105/client-ca.crt
I0113 19:50:03.088410  106636 available_controller.go:398] Shutting down AvailableConditionController
I0113 19:50:03.088417  106636 crd_finalizer.go:276] Shutting down CRDFinalizer
E0113 19:50:03.963295  106636 controller.go:183] an error on the server ("") has prevented the request from succeeding (get endpoints kubernetes)
I0113 19:50:04.602664  106636 serving.go:307] Generated self-signed cert (/tmp/kubernetes-kube-apiserver931190316/apiserver.crt, /tmp/kubernetes-kube-apiserver931190316/apiserver.key)
I0113 19:50:04.602709  106636 server.go:596] external host was not specified, using 127.0.0.1
W0113 19:50:04.602720  106636 authentication.go:439] AnonymousAuth is not allowed with the AlwaysAllow authorizer. Resetting AnonymousAuth to false. You should use a different authorizer
W0113 19:50:05.100880  106636 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0113 19:50:05.100915  106636 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0113 19:50:05.100929  106636 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
... skipping 205 lines ...
I0113 19:50:10.241102  106636 controller.go:86] Starting OpenAPI controller
I0113 19:50:10.242015  106636 naming_controller.go:289] Starting NamingConditionController
I0113 19:50:10.242051  106636 establishing_controller.go:74] Starting EstablishingController
I0113 19:50:10.242074  106636 nonstructuralschema_controller.go:185] Starting NonStructuralSchemaConditionController
I0113 19:50:10.242098  106636 apiapproval_controller.go:184] Starting KubernetesAPIApprovalPolicyConformantConditionController
E0113 19:50:10.251437  106636 controller.go:151] Unable to remove old endpoints from kubernetes service: StorageError: key not found, Code: 1, Key: /d8e1f1b4-2fb9-4b9c-b5f5-cff0a0ecb6cf/registry/masterleases/127.0.0.1, ResourceVersion: 0, AdditionalErrorMsg: 
E0113 19:50:10.261945  106636 structuredmerge.go:102] [SHOULD NOT HAPPEN] failed to create typed new object: .spec.rules: element 0: associative list without keys has an element that's a map type
E0113 19:50:10.272255  106636 structuredmerge.go:102] [SHOULD NOT HAPPEN] failed to create typed new object: .spec.rules: element 0: associative list without keys has an element that's a map type
E0113 19:50:10.283335  106636 structuredmerge.go:102] [SHOULD NOT HAPPEN] failed to create typed new object: .spec.rules: element 0: associative list without keys has an element that's a map type
E0113 19:50:10.286734  106636 structuredmerge.go:102] [SHOULD NOT HAPPEN] failed to create typed new object: .spec.rules: element 0: associative list without keys has an element that's a map type
E0113 19:50:10.307695  106636 structuredmerge.go:102] [SHOULD NOT HAPPEN] failed to create typed new object: .spec.rules: element 0: associative list without keys has an element that's a map type
I0113 19:50:10.340834  106636 cache.go:39] Caches are synced for AvailableConditionController controller
I0113 19:50:10.340999  106636 cache.go:39] Caches are synced for APIServiceRegistrationController controller
I0113 19:50:10.341097  106636 cache.go:39] Caches are synced for autoregister controller
I0113 19:50:10.342021  106636 shared_informer.go:213] Caches are synced for cluster_authentication_trust_controller 
I0113 19:50:10.343244  106636 shared_informer.go:213] Caches are synced for crd-autoregister 
I0113 19:50:11.236684  106636 controller.go:107] OpenAPI AggregationController: Processing item 
... skipping 46 lines ...
    testserver.go:198: Waiting for /healthz to be ok...
    dynamic_client_test.go:88: unexpected pod in list. wanted &v1.Pod{TypeMeta:v1.TypeMeta{Kind:"", APIVersion:""}, ObjectMeta:v1.ObjectMeta{Name:"test47h2w", GenerateName:"test", Namespace:"default", SelfLink:"/api/v1/namespaces/default/pods/test47h2w", UID:"0cb6d7a8-f2f0-4326-a336-3ec9c27332ad", ResourceVersion:"8632", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63714541811, loc:(*time.Location)(0x7541d00)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string(nil), Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry{v1.ManagedFieldsEntry{Manager:"client.test", Operation:"Update", APIVersion:"v1", Time:(*v1.Time)(0xc044a52fa0), FieldsType:"FieldsV1", FieldsV1:(*v1.FieldsV1)(0xc044a52fc0)}}}, Spec:v1.PodSpec{Volumes:[]v1.Volume(nil), InitContainers:[]v1.Container(nil), Containers:[]v1.Container{v1.Container{Name:"test", Image:"test-image", Command:[]string(nil), Args:[]string(nil), WorkingDir:"", Ports:[]v1.ContainerPort(nil), EnvFrom:[]v1.EnvFromSource(nil), Env:[]v1.EnvVar(nil), Resources:v1.ResourceRequirements{Limits:v1.ResourceList(nil), Requests:v1.ResourceList(nil)}, VolumeMounts:[]v1.VolumeMount(nil), VolumeDevices:[]v1.VolumeDevice(nil), LivenessProbe:(*v1.Probe)(nil), ReadinessProbe:(*v1.Probe)(nil), StartupProbe:(*v1.Probe)(nil), Lifecycle:(*v1.Lifecycle)(nil), TerminationMessagePath:"/dev/termination-log", TerminationMessagePolicy:"File", ImagePullPolicy:"Always", SecurityContext:(*v1.SecurityContext)(nil), Stdin:false, StdinOnce:false, TTY:false}}, EphemeralContainers:[]v1.EphemeralContainer(nil), RestartPolicy:"Always", TerminationGracePeriodSeconds:(*int64)(0xc048e5b008), ActiveDeadlineSeconds:(*int64)(nil), DNSPolicy:"ClusterFirst", NodeSelector:map[string]string(nil), ServiceAccountName:"", DeprecatedServiceAccount:"", AutomountServiceAccountToken:(*bool)(nil), NodeName:"", HostNetwork:false, HostPID:false, HostIPC:false, ShareProcessNamespace:(*bool)(nil), SecurityContext:(*v1.PodSecurityContext)(0xc0442b8a20), ImagePullSecrets:[]v1.LocalObjectReference(nil), Hostname:"", Subdomain:"", Affinity:(*v1.Affinity)(nil), SchedulerName:"default-scheduler", Tolerations:[]v1.Toleration{v1.Toleration{Key:"node.kubernetes.io/not-ready", Operator:"Exists", Value:"", Effect:"NoExecute", TolerationSeconds:(*int64)(0xc048e5b030)}, v1.Toleration{Key:"node.kubernetes.io/unreachable", Operator:"Exists", Value:"", Effect:"NoExecute", TolerationSeconds:(*int64)(0xc048e5b050)}}, HostAliases:[]v1.HostAlias(nil), PriorityClassName:"", Priority:(*int32)(0xc048e5b058), DNSConfig:(*v1.PodDNSConfig)(nil), ReadinessGates:[]v1.PodReadinessGate(nil), RuntimeClassName:(*string)(nil), EnableServiceLinks:(*bool)(0xc048e5b05c), PreemptionPolicy:(*v1.PreemptionPolicy)(nil), Overhead:v1.ResourceList(nil), TopologySpreadConstraints:[]v1.TopologySpreadConstraint(nil)}, Status:v1.PodStatus{Phase:"Pending", Conditions:[]v1.PodCondition(nil), Message:"", Reason:"", NominatedNodeName:"", HostIP:"", PodIP:"", PodIPs:[]v1.PodIP(nil), StartTime:(*v1.Time)(nil), InitContainerStatuses:[]v1.ContainerStatus(nil), ContainerStatuses:[]v1.ContainerStatus(nil), QOSClass:"BestEffort", EphemeralContainerStatuses:[]v1.ContainerStatus(nil)}}, got &v1.Pod{TypeMeta:v1.TypeMeta{Kind:"", APIVersion:""}, ObjectMeta:v1.ObjectMeta{Name:"test47h2w", GenerateName:"test", Namespace:"default", SelfLink:"/api/v1/namespaces/default/pods/test47h2w", 
UID:"0cb6d7a8-f2f0-4326-a336-3ec9c27332ad", ResourceVersion:"8632", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63714541811, loc:(*time.Location)(0x7541d00)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string(nil), Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry{v1.ManagedFieldsEntry{Manager:"client.test", Operation:"Update", APIVersion:"v1", Time:(*v1.Time)(0xc044b05f20), FieldsType:"FieldsV1", FieldsV1:(*v1.FieldsV1)(0xc044b05f00)}}}, Spec:v1.PodSpec{Volumes:[]v1.Volume(nil), InitContainers:[]v1.Container(nil), Containers:[]v1.Container{v1.Container{Name:"test", Image:"test-image", Command:[]string(nil), Args:[]string(nil), WorkingDir:"", Ports:[]v1.ContainerPort(nil), EnvFrom:[]v1.EnvFromSource(nil), Env:[]v1.EnvVar(nil), Resources:v1.ResourceRequirements{Limits:v1.ResourceList(nil), Requests:v1.ResourceList(nil)}, VolumeMounts:[]v1.VolumeMount(nil), VolumeDevices:[]v1.VolumeDevice(nil), LivenessProbe:(*v1.Probe)(nil), ReadinessProbe:(*v1.Probe)(nil), StartupProbe:(*v1.Probe)(nil), Lifecycle:(*v1.Lifecycle)(nil), TerminationMessagePath:"/dev/termination-log", TerminationMessagePolicy:"File", ImagePullPolicy:"Always", SecurityContext:(*v1.SecurityContext)(nil), Stdin:false, StdinOnce:false, TTY:false}}, EphemeralContainers:[]v1.EphemeralContainer(nil), RestartPolicy:"Always", TerminationGracePeriodSeconds:(*int64)(0xc04912f208), ActiveDeadlineSeconds:(*int64)(nil), DNSPolicy:"ClusterFirst", NodeSelector:map[string]string(nil), ServiceAccountName:"", DeprecatedServiceAccount:"", AutomountServiceAccountToken:(*bool)(nil), NodeName:"", HostNetwork:false, HostPID:false, HostIPC:false, ShareProcessNamespace:(*bool)(nil), SecurityContext:(*v1.PodSecurityContext)(0xc0443a3500), ImagePullSecrets:[]v1.LocalObjectReference(nil), Hostname:"", Subdomain:"", Affinity:(*v1.Affinity)(nil), SchedulerName:"default-scheduler", Tolerations:[]v1.Toleration{v1.Toleration{Key:"node.kubernetes.io/not-ready", Operator:"Exists", Value:"", Effect:"NoExecute", TolerationSeconds:(*int64)(0xc04912f250)}, v1.Toleration{Key:"node.kubernetes.io/unreachable", Operator:"Exists", Value:"", Effect:"NoExecute", TolerationSeconds:(*int64)(0xc04912f270)}}, HostAliases:[]v1.HostAlias(nil), PriorityClassName:"", Priority:(*int32)(0xc04912f1e8), DNSConfig:(*v1.PodDNSConfig)(nil), ReadinessGates:[]v1.PodReadinessGate(nil), RuntimeClassName:(*string)(nil), EnableServiceLinks:(*bool)(0xc04912f1c9), PreemptionPolicy:(*v1.PreemptionPolicy)(nil), Overhead:v1.ResourceList(nil), TopologySpreadConstraints:[]v1.TopologySpreadConstraint(nil)}, Status:v1.PodStatus{Phase:"Pending", Conditions:[]v1.PodCondition(nil), Message:"", Reason:"", NominatedNodeName:"", HostIP:"", PodIP:"", PodIPs:[]v1.PodIP(nil), StartTime:(*v1.Time)(nil), InitContainerStatuses:[]v1.ContainerStatus(nil), ContainerStatuses:[]v1.ContainerStatus(nil), QOSClass:"BestEffort", EphemeralContainerStatuses:[]v1.ContainerStatus(nil)}}


DONE 2486 tests, 4 skipped, 1 failure in 5.335s
+++ [0113 19:59:18] Saved JUnit XML test report to /logs/artifacts/junit_da39a3ee5e6b4b0d3255bfef95601890afd80709_20200113-194718.xml
make[1]: *** [Makefile:185: test] Error 1
!!! [0113 19:59:18] Call tree:
!!! [0113 19:59:18]  1: hack/make-rules/test-integration.sh:97 runTests(...)
+++ [0113 19:59:18] Cleaning up etcd
+++ [0113 19:59:19] Integration test cleanup complete
make: *** [Makefile:204: test-integration] Error 1
+ EXIT_VALUE=2
+ set +o xtrace
Cleaning up after docker in docker.
================================================================================
[Barnacle] 2020/01/13 19:59:19 Cleaning up Docker data root...
[Barnacle] 2020/01/13 19:59:19 Removing all containers.
... skipping 12 lines ...