PR robscott: Promoting EndpointSlices to beta
Result FAILURE
Tests 1 failed / 2898 succeeded
Started 2019-11-09 01:33
Elapsed 24m16s
Revision 0dc12965ccdbbf063ecde5e5a033f57da82d1952
Refs 84390

Test Failures


k8s.io/kubernetes/test/integration/etcd TestEtcdStoragePath 12s

go test -v k8s.io/kubernetes/test/integration/etcd -run TestEtcdStoragePath$
=== RUN   TestEtcdStoragePath
E1109 01:50:32.625962  107409 controller.go:183] Get https://127.0.0.1:35291/api/v1/namespaces/default/endpoints/kubernetes: dial tcp 127.0.0.1:35291: connect: connection refused
I1109 01:50:33.144486  107409 serving.go:306] Generated self-signed cert (/tmp/TestEtcdStoragePath937205931/apiserver.crt, /tmp/TestEtcdStoragePath937205931/apiserver.key)
I1109 01:50:33.144531  107409 server.go:622] external host was not specified, using 10.61.24.125
I1109 01:50:33.144932  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:33.144980  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
W1109 01:50:33.740660  107409 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W1109 01:50:33.740706  107409 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W1109 01:50:33.740788  107409 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W1109 01:50:33.741149  107409 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W1109 01:50:33.741177  107409 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W1109 01:50:33.741187  107409 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W1109 01:50:33.741198  107409 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W1109 01:50:33.741290  107409 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W1109 01:50:33.742695  107409 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W1109 01:50:33.742749  107409 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W1109 01:50:33.742778  107409 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W1109 01:50:33.742885  107409 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W1109 01:50:33.743212  107409 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W1109 01:50:33.743517  107409 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W1109 01:50:33.743606  107409 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W1109 01:50:33.743764  107409 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
I1109 01:50:33.743785  107409 plugins.go:158] Loaded 9 mutating admission controller(s) successfully in the following order: NamespaceLifecycle,LimitRanger,TaintNodesByCondition,Priority,DefaultTolerationSeconds,DefaultStorageClass,StorageObjectInUseProtection,MutatingAdmissionWebhook,RuntimeClass.
I1109 01:50:33.743796  107409 plugins.go:161] Loaded 6 validating admission controller(s) successfully in the following order: LimitRanger,Priority,PersistentVolumeClaimResize,ValidatingAdmissionWebhook,RuntimeClass,ResourceQuota.
I1109 01:50:33.745303  107409 plugins.go:158] Loaded 9 mutating admission controller(s) successfully in the following order: NamespaceLifecycle,LimitRanger,TaintNodesByCondition,Priority,DefaultTolerationSeconds,DefaultStorageClass,StorageObjectInUseProtection,MutatingAdmissionWebhook,RuntimeClass.
I1109 01:50:33.745330  107409 plugins.go:161] Loaded 6 validating admission controller(s) successfully in the following order: LimitRanger,Priority,PersistentVolumeClaimResize,ValidatingAdmissionWebhook,RuntimeClass,ResourceQuota.
I1109 01:50:33.747054  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:33.747102  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:33.748477  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:33.748513  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
W1109 01:50:33.789193  107409 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
I1109 01:50:33.790603  107409 master.go:265] Using reconciler: lease
I1109 01:50:33.791044  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:33.791138  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:33.794626  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:33.794700  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:33.796773  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:33.796810  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:33.798475  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:33.798802  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:33.806206  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:33.806248  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:33.807498  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:33.807527  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:33.808784  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:33.808815  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:33.810265  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:33.810297  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:33.812305  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:33.812329  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:33.813861  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:33.813892  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:33.814766  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:33.814810  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:33.816082  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:33.816104  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:33.817680  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:33.817703  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:33.818779  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:33.818816  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:33.819878  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:33.819903  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:33.821414  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:33.821535  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:33.822779  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:33.822809  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:33.823628  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:33.823656  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:33.824160  107409 rest.go:115] the default service ipfamily for this cluster is: IPv4
I1109 01:50:34.005494  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:34.005539  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:34.006929  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:34.006971  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:34.008950  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:34.008986  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:34.010354  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:34.010409  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:34.012223  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:34.012249  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:34.013274  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:34.013296  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:34.014311  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:34.014341  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:34.016004  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:34.016033  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:34.017415  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:34.017600  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:34.018678  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:34.018708  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:34.021682  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:34.023612  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:34.024995  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:34.025020  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:34.027151  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:34.027181  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:34.028861  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:34.028890  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:34.029972  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:34.030001  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:34.031582  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:34.031605  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:34.033003  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:34.033029  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:34.034150  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:34.034179  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:34.035036  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:34.035060  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:34.036115  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:34.036543  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:34.037486  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:34.037605  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:34.039387  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:34.039611  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:34.040670  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:34.040700  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:34.042027  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:34.042144  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:34.043512  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:34.043534  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:34.044741  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:34.044778  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:34.045768  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:34.045796  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:34.046877  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:34.046898  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:34.048001  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:34.048027  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:34.052202  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:34.060465  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:34.062201  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:34.062299  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:34.063856  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:34.063888  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:34.065120  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:34.065150  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:34.066055  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:34.066077  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:34.069497  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:34.069522  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:34.072330  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:34.072364  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:34.074868  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:34.074905  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:34.075903  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:34.075933  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:34.077221  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:34.077261  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:34.079228  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:34.079305  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:34.082372  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:34.082682  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:34.085555  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:34.086667  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:34.087803  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:34.087858  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:34.089889  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:34.089918  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:34.092391  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:34.092488  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:34.093720  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:34.093742  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:34.094821  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:34.095049  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:34.096030  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:34.096051  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:34.097562  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:34.097654  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:34.098914  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:34.099069  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:34.100625  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:34.100724  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:34.101810  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:34.101909  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:34.105229  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:34.105269  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:34.106534  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:34.106562  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:34.112420  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:34.112471  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:34.114804  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:34.114839  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:34.116476  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:34.116574  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:34.118018  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:34.118062  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:34.119469  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:34.119614  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:34.120949  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:34.120982  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:34.122488  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:34.122517  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:34.123394  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:34.123465  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:34.124647  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:34.124679  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:34.125772  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:34.125800  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:34.126790  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:34.126833  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
W1109 01:50:34.429880  107409 genericapiserver.go:404] Skipping API discovery.k8s.io/v1alpha1 because it has no resources.
I1109 01:50:34.740191  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:34.740285  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:34.795004  107409 plugins.go:158] Loaded 9 mutating admission controller(s) successfully in the following order: NamespaceLifecycle,LimitRanger,TaintNodesByCondition,Priority,DefaultTolerationSeconds,DefaultStorageClass,StorageObjectInUseProtection,MutatingAdmissionWebhook,RuntimeClass.
I1109 01:50:34.795036  107409 plugins.go:161] Loaded 6 validating admission controller(s) successfully in the following order: LimitRanger,Priority,PersistentVolumeClaimResize,ValidatingAdmissionWebhook,RuntimeClass,ResourceQuota.
W1109 01:50:34.796597  107409 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
I1109 01:50:34.796754  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:34.796796  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:50:34.797888  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:34.797927  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
W1109 01:50:34.801240  107409 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
I1109 01:50:39.013553  107409 dynamic_serving_content.go:129] Starting serving-cert::/tmp/TestEtcdStoragePath937205931/apiserver.crt::/tmp/TestEtcdStoragePath937205931/apiserver.key
I1109 01:50:39.014055  107409 secure_serving.go:174] Serving securely on 127.0.0.1:38885
I1109 01:50:39.014130  107409 controller.go:81] Starting OpenAPI AggregationController
I1109 01:50:39.014196  107409 tlsconfig.go:220] Starting DynamicServingCertificateController
I1109 01:50:39.014601  107409 crd_finalizer.go:263] Starting CRDFinalizer
I1109 01:50:39.014954  107409 controller.go:85] Starting OpenAPI controller
I1109 01:50:39.014973  107409 nonstructuralschema_controller.go:191] Starting NonStructuralSchemaConditionController
I1109 01:50:39.014980  107409 customresource_discovery_controller.go:208] Starting DiscoveryController
I1109 01:50:39.015001  107409 apiapproval_controller.go:185] Starting KubernetesAPIApprovalPolicyConformantConditionController
I1109 01:50:39.015019  107409 naming_controller.go:288] Starting NamingConditionController
I1109 01:50:39.014934  107409 establishing_controller.go:73] Starting EstablishingController
I1109 01:50:39.015325  107409 autoregister_controller.go:140] Starting autoregister controller
I1109 01:50:39.015337  107409 cache.go:32] Waiting for caches to sync for autoregister controller
I1109 01:50:39.015359  107409 apiservice_controller.go:94] Starting APIServiceRegistrationController
I1109 01:50:39.015363  107409 cache.go:32] Waiting for caches to sync for APIServiceRegistrationController controller
I1109 01:50:39.015393  107409 available_controller.go:386] Starting AvailableConditionController
I1109 01:50:39.015398  107409 cache.go:32] Waiting for caches to sync for AvailableConditionController controller
I1109 01:50:39.015789  107409 crdregistration_controller.go:111] Starting crd-autoregister controller
I1109 01:50:39.015801  107409 shared_informer.go:197] Waiting for caches to sync for crd-autoregister
W1109 01:50:39.016662  107409 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
I1109 01:50:39.016754  107409 cluster_authentication_trust_controller.go:440] Starting cluster_authentication_trust_controller controller
I1109 01:50:39.016762  107409 shared_informer.go:197] Waiting for caches to sync for cluster_authentication_trust_controller
E1109 01:50:39.017984  107409 controller.go:151] Unable to remove old endpoints from kubernetes service: StorageError: key not found, Code: 1, Key: /registry/masterleases/10.61.24.125, ResourceVersion: 0, AdditionalErrorMsg: 
I1109 01:50:39.115523  107409 cache.go:39] Caches are synced for autoregister controller
I1109 01:50:39.116841  107409 cache.go:39] Caches are synced for AvailableConditionController controller
I1109 01:50:39.116885  107409 cache.go:39] Caches are synced for APIServiceRegistrationController controller
I1109 01:50:39.118027  107409 shared_informer.go:204] Caches are synced for cluster_authentication_trust_controller 
I1109 01:50:39.118245  107409 shared_informer.go:204] Caches are synced for crd-autoregister 
I1109 01:50:40.013195  107409 controller.go:107] OpenAPI AggregationController: Processing item 
I1109 01:50:40.013234  107409 controller.go:130] OpenAPI AggregationController: action for item : Nothing (removed from the queue).
I1109 01:50:40.013252  107409 controller.go:130] OpenAPI AggregationController: action for item k8s_internal_local_delegation_chain_0000000000: Nothing (removed from the queue).
I1109 01:50:40.022137  107409 storage_scheduling.go:133] created PriorityClass system-node-critical with value 2000001000
I1109 01:50:40.031180  107409 storage_scheduling.go:133] created PriorityClass system-cluster-critical with value 2000000000
I1109 01:50:40.031208  107409 storage_scheduling.go:142] all system priority classes are created successfully or already exist.
I1109 01:50:40.577086  107409 controller.go:606] quota admission added evaluator for: roles.rbac.authorization.k8s.io
I1109 01:50:40.638949  107409 controller.go:606] quota admission added evaluator for: rolebindings.rbac.authorization.k8s.io
W1109 01:50:40.885673  107409 lease.go:222] Resetting endpoints for master service "kubernetes" to [10.61.24.125]
I1109 01:50:40.886981  107409 controller.go:606] quota admission added evaluator for: endpoints
I1109 01:50:40.891560  107409 controller.go:606] quota admission added evaluator for: endpointslices.discovery.k8s.io
--- FAIL: TestEtcdStoragePath (12.54s)
    server.go:155: waiting for server to be healthy
    server.go:155: waiting for server to be healthy

				from junit_304dbea7698c16157bb4586f231ea1f94495b046_20191109-014650.xml
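
To reproduce the failure locally, the repro command listed above can be run directly. A minimal sketch, assuming a kubernetes source checkout and a local etcd reachable on http://127.0.0.1:2379 (the endpoint this job uses); hack/install-etcd.sh is one way to obtain a suitable etcd binary:

    # from the root of the kubernetes repository
    ./hack/install-etcd.sh && export PATH="$(pwd)/third_party/etcd:${PATH}"
    etcd &    # serves on the default 127.0.0.1:2379
    go test -v k8s.io/kubernetes/test/integration/etcd -run TestEtcdStoragePath$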



2898 Passed Tests

4 Skipped Tests

Error lines from build-log.txt

... skipping 56 lines ...
Recording: record_command_canary
Running command: record_command_canary

+++ Running case: test-cmd.record_command_canary 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: record_command_canary
/home/prow/go/src/k8s.io/kubernetes/test/cmd/legacy-script.sh: line 155: bogus-expected-to-fail: command not found
!!! [1109 01:37:43] Call tree:
!!! [1109 01:37:43]  1: /home/prow/go/src/k8s.io/kubernetes/test/cmd/../../third_party/forked/shell2junit/sh2ju.sh:47 record_command_canary(...)
!!! [1109 01:37:43]  2: /home/prow/go/src/k8s.io/kubernetes/test/cmd/../../third_party/forked/shell2junit/sh2ju.sh:112 eVal(...)
!!! [1109 01:37:43]  3: /home/prow/go/src/k8s.io/kubernetes/test/cmd/legacy-script.sh:131 juLog(...)
!!! [1109 01:37:43]  4: /home/prow/go/src/k8s.io/kubernetes/test/cmd/legacy-script.sh:159 record_command(...)
!!! [1109 01:37:43]  5: hack/make-rules/test-cmd.sh:27 source(...)
+++ exit code: 1
+++ error: 1
+++ [1109 01:37:43] Running kubeadm tests
+++ [1109 01:37:47] Building go targets for linux/amd64:
    cmd/kubeadm
Running tests for APIVersion: v1,admissionregistration.k8s.io/v1,admissionregistration.k8s.io/v1beta1,admission.k8s.io/v1,admission.k8s.io/v1beta1,apps/v1,apps/v1beta1,apps/v1beta2,auditregistration.k8s.io/v1alpha1,authentication.k8s.io/v1,authentication.k8s.io/v1beta1,authorization.k8s.io/v1,authorization.k8s.io/v1beta1,autoscaling/v1,autoscaling/v2beta1,autoscaling/v2beta2,batch/v1,batch/v1beta1,batch/v2alpha1,certificates.k8s.io/v1beta1,coordination.k8s.io/v1beta1,coordination.k8s.io/v1,discovery.k8s.io/v1alpha1,discovery.k8s.io/v1beta1,extensions/v1beta1,events.k8s.io/v1beta1,imagepolicy.k8s.io/v1alpha1,networking.k8s.io/v1,networking.k8s.io/v1beta1,node.k8s.io/v1alpha1,node.k8s.io/v1beta1,policy/v1beta1,rbac.authorization.k8s.io/v1,rbac.authorization.k8s.io/v1beta1,rbac.authorization.k8s.io/v1alpha1,scheduling.k8s.io/v1alpha1,scheduling.k8s.io/v1beta1,scheduling.k8s.io/v1,settings.k8s.io/v1alpha1,storage.k8s.io/v1beta1,storage.k8s.io/v1,storage.k8s.io/v1alpha1,flowcontrol.apiserver.k8s.io/v1alpha1,
+++ [1109 01:38:29] Running tests without code coverage
{"Time":"2019-11-09T01:39:43.643473112Z","Action":"output","Package":"k8s.io/kubernetes/cmd/kubeadm/test/cmd","Output":"ok  \tk8s.io/kubernetes/cmd/kubeadm/test/cmd\t37.578s\n"}
... skipping 286 lines ...
+++ [1109 01:41:21] Building kube-controller-manager
+++ [1109 01:41:25] Building go targets for linux/amd64:
    cmd/kube-controller-manager
+++ [1109 01:41:51] Starting controller-manager
Flag --port has been deprecated, see --secure-port instead.
I1109 01:41:52.491479   54890 serving.go:312] Generated self-signed cert in-memory
W1109 01:41:53.360497   54890 authentication.go:457] failed to read in-cluster kubeconfig for delegated authentication: open /var/run/secrets/kubernetes.io/serviceaccount/token: no such file or directory
W1109 01:41:53.360538   54890 authentication.go:319] No authentication-kubeconfig provided in order to lookup client-ca-file in configmap/extension-apiserver-authentication in kube-system, so client certificate authentication won't work.
W1109 01:41:53.360544   54890 authentication.go:322] No authentication-kubeconfig provided in order to lookup requestheader-client-ca-file in configmap/extension-apiserver-authentication in kube-system, so request-header client certificate authentication won't work.
W1109 01:41:53.360560   54890 authorization.go:177] failed to read in-cluster kubeconfig for delegated authorization: open /var/run/secrets/kubernetes.io/serviceaccount/token: no such file or directory
W1109 01:41:53.360588   54890 authorization.go:146] No authorization-kubeconfig provided, so SubjectAccessReview of authorization tokens won't work.
I1109 01:41:53.360609   54890 controllermanager.go:161] Version: v1.18.0-alpha.0.557+ff7867612869b2
I1109 01:41:53.361672   54890 secure_serving.go:174] Serving securely on [::]:10257
I1109 01:41:53.361788   54890 tlsconfig.go:220] Starting DynamicServingCertificateController
I1109 01:41:53.362146   54890 deprecated_insecure_serving.go:53] Serving insecurely on [::]:10252
I1109 01:41:53.362224   54890 leaderelection.go:242] attempting to acquire leader lease  kube-system/kube-controller-manager...
... skipping 7 lines ...
I1109 01:41:53.631757   54890 controllermanager.go:533] Started "podgc"
I1109 01:41:53.631790   54890 core.go:213] Will not configure cloud provider routes for allocate-node-cidrs: false, configure-cloud-routes: true.
W1109 01:41:53.631800   54890 controllermanager.go:525] Skipping "route"
I1109 01:41:53.631985   54890 gc_controller.go:88] Starting GC controller
I1109 01:41:53.632017   54890 shared_informer.go:197] Waiting for caches to sync for GC
I1109 01:41:53.632188   54890 node_lifecycle_controller.go:77] Sending events to api server
E1109 01:41:53.632225   54890 core.go:203] failed to start cloud node lifecycle controller: no cloud provider provided
W1109 01:41:53.632236   54890 controllermanager.go:525] Skipping "cloud-node-lifecycle"
I1109 01:41:53.632766   54890 controllermanager.go:533] Started "ttl"
I1109 01:41:53.632954   54890 ttl_controller.go:116] Starting TTL controller
I1109 01:41:53.632972   54890 shared_informer.go:197] Waiting for caches to sync for TTL
I1109 01:41:53.633217   54890 node_lifecycle_controller.go:388] Sending events to api server.
I1109 01:41:53.633376   54890 node_lifecycle_controller.go:423] Controller is using taint based evictions.
... skipping 111 lines ...
}I1109 01:41:54.411065   54890 controllermanager.go:533] Started "garbagecollector"
I1109 01:41:54.411354   54890 garbagecollector.go:129] Starting garbage collector controller
I1109 01:41:54.411380   54890 shared_informer.go:197] Waiting for caches to sync for garbage collector
I1109 01:41:54.411422   54890 graph_builder.go:282] GraphBuilder running
I1109 01:41:54.411873   54890 controllermanager.go:533] Started "cronjob"
I1109 01:41:54.412100   54890 cronjob_controller.go:97] Starting CronJob Manager
E1109 01:41:54.413978   54890 core.go:81] Failed to start service controller: WARNING: no cloud provider provided, services of type LoadBalancer will fail
W1109 01:41:54.414004   54890 controllermanager.go:525] Skipping "service"
I1109 01:41:54.414471   54890 controllermanager.go:533] Started "csrapproving"
I1109 01:41:54.415117   54890 certificate_controller.go:118] Starting certificate controller "csrapproving"
I1109 01:41:54.415138   54890 shared_informer.go:197] Waiting for caches to sync for certificate-csrapproving
I1109 01:41:54.415322   54890 controllermanager.go:533] Started "clusterrole-aggregation"
I1109 01:41:54.415565   54890 clusterroleaggregation_controller.go:148] Starting ClusterRoleAggregator
... skipping 2 lines ...
I1109 01:41:54.416494   54890 pv_protection_controller.go:81] Starting PV protection controller
I1109 01:41:54.416509   54890 shared_informer.go:197] Waiting for caches to sync for PV protection
I1109 01:41:54.497946   54890 shared_informer.go:204] Caches are synced for expand 
I1109 01:41:54.515413   54890 shared_informer.go:204] Caches are synced for certificate-csrapproving 
I1109 01:41:54.515758   54890 shared_informer.go:204] Caches are synced for ClusterRoleAggregator 
I1109 01:41:54.516669   54890 shared_informer.go:204] Caches are synced for PV protection 
W1109 01:41:54.520354   54890 actual_state_of_world.go:506] Failed to update statusUpdateNeeded field in actual state of world: Failed to set statusUpdateNeeded to needed true, because nodeName="127.0.0.1" does not exist
E1109 01:41:54.524995   54890 clusterroleaggregation_controller.go:180] edit failed with : Operation cannot be fulfilled on clusterroles.rbac.authorization.k8s.io "edit": the object has been modified; please apply your changes to the latest version and try again
E1109 01:41:54.531184   54890 clusterroleaggregation_controller.go:180] edit failed with : Operation cannot be fulfilled on clusterroles.rbac.authorization.k8s.io "edit": the object has been modified; please apply your changes to the latest version and try again
I1109 01:41:54.533150   54890 shared_informer.go:204] Caches are synced for TTL 
+++ [1109 01:41:54] Testing kubectl version: check client only output matches expected output
Successful: the flag '--client' shows correct client info
Successful: the flag '--client' correctly has no server version info
+++ [1109 01:41:54] Testing kubectl version: verify json output
Successful: --output json has correct client info
... skipping 81 lines ...
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_RESTMapper_evaluation_tests
+++ [1109 01:41:58] Creating namespace namespace-1573263718-5402
namespace/namespace-1573263718-5402 created
Context "test" modified.
+++ [1109 01:41:58] Testing RESTMapper
+++ [1109 01:41:58] "kubectl get unknownresourcetype" returns error as expected: error: the server doesn't have a resource type "unknownresourcetype"
+++ exit code: 0
NAME                              SHORTNAMES   APIGROUP                       NAMESPACED   KIND
bindings                                                                      true         Binding
componentstatuses                 cs                                          false        ComponentStatus
configmaps                        cm                                          true         ConfigMap
endpoints                         ep                                          true         Endpoints
... skipping 602 lines ...
has:valid-pod
Successful
message:NAME        READY   STATUS    RESTARTS   AGE
valid-pod   0/1     Pending   0          0s
has:valid-pod
core.sh:186: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
error: resource(s) were provided, but no name, label selector, or --all flag specified
core.sh:190: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
core.sh:194: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
error: setting 'all' parameter but found a non empty selector. 
core.sh:198: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
core.sh:202: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
pod "valid-pod" force deleted
core.sh:206: Successful get pods -l'name in (valid-pod)' {{range.items}}{{.metadata.name}}:{{end}}: 
core.sh:211: Successful get namespaces {{range.items}}{{ if eq .metadata.name \"test-kubectl-describe-pod\" }}found{{end}}{{end}}:: :
... skipping 12 lines ...
poddisruptionbudget.policy/test-pdb-2 created
core.sh:245: Successful get pdb/test-pdb-2 --namespace=test-kubectl-describe-pod {{.spec.minAvailable}}: 50%
poddisruptionbudget.policy/test-pdb-3 created
core.sh:251: Successful get pdb/test-pdb-3 --namespace=test-kubectl-describe-pod {{.spec.maxUnavailable}}: 2
poddisruptionbudget.policy/test-pdb-4 created
core.sh:255: Successful get pdb/test-pdb-4 --namespace=test-kubectl-describe-pod {{.spec.maxUnavailable}}: 50%
error: min-available and max-unavailable cannot be both specified
core.sh:261: Successful get pods --namespace=test-kubectl-describe-pod {{range.items}}{{.metadata.name}}:{{end}}: 
pod/env-test-pod created
matched TEST_CMD_1
matched <set to the key 'key-1' in secret 'test-secret'>
matched TEST_CMD_2
matched <set to the key 'key-2' of config map 'test-configmap'>
... skipping 188 lines ...
pod/valid-pod patched
core.sh:470: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: changed-with-yaml:
pod/valid-pod patched
core.sh:475: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:3.1:
pod/valid-pod patched
core.sh:491: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: nginx:
+++ [1109 01:42:35] "kubectl patch with resourceVersion 531" returns error as expected: Error from server (Conflict): Operation cannot be fulfilled on pods "valid-pod": the object has been modified; please apply your changes to the latest version and try again
pod "valid-pod" deleted
pod/valid-pod replaced
core.sh:515: Successful get pod valid-pod {{(index .spec.containers 0).name}}: replaced-k8s-serve-hostname
Successful
message:error: --grace-period must have --force specified
has:\-\-grace-period must have \-\-force specified
Successful
message:error: --timeout must have --force specified
has:\-\-timeout must have \-\-force specified
W1109 01:42:36.219407   54890 actual_state_of_world.go:506] Failed to update statusUpdateNeeded field in actual state of world: Failed to set statusUpdateNeeded to needed true, because nodeName="node-v1-test" does not exist
node/node-v1-test created
node/node-v1-test replaced
core.sh:552: Successful get node node-v1-test {{.metadata.annotations.a}}: b
node "node-v1-test" deleted
core.sh:559: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: nginx:
core.sh:562: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: k8s.gcr.io/serve_hostname:
... skipping 23 lines ...
spec:
  containers:
  - image: k8s.gcr.io/pause:2.0
    name: kubernetes-pause
has:localonlyvalue
core.sh:585: Successful get pod valid-pod {{.metadata.labels.name}}: valid-pod
error: 'name' already has a value (valid-pod), and --overwrite is false
core.sh:589: Successful get pod valid-pod {{.metadata.labels.name}}: valid-pod
core.sh:593: Successful get pod valid-pod {{.metadata.labels.name}}: valid-pod
pod/valid-pod labeled
core.sh:597: Successful get pod valid-pod {{.metadata.labels.name}}: valid-pod-super-sayan
core.sh:601: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
... skipping 85 lines ...
+++ Running case: test-cmd.run_kubectl_create_error_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_kubectl_create_error_tests
+++ [1109 01:42:45] Creating namespace namespace-1573263765-11776
namespace/namespace-1573263765-11776 created
Context "test" modified.
+++ [1109 01:42:45] Testing kubectl create with error
Error: must specify one of -f and -k

Create a resource from a file or from stdin.

 JSON and YAML formats are accepted.

Examples:
... skipping 41 lines ...

Usage:
  kubectl create -f FILENAME [options]

Use "kubectl <command> --help" for more information about a given command.
Use "kubectl options" for a list of global command-line options (applies to all commands).
+++ [1109 01:42:46] "kubectl create with empty string list returns error as expected: error: error validating "hack/testdata/invalid-rc-with-empty-args.yaml": error validating data: ValidationError(ReplicationController.spec.template.spec.containers[0].args): unknown object type "nil" in ReplicationController.spec.template.spec.containers[0].args[0]; if you choose to ignore these errors, turn validation off with --validate=false
kubectl convert is DEPRECATED and will be removed in a future version.
In order to convert, kubectl apply the object to the cluster, then kubectl get at the desired version.
+++ exit code: 0
Recording: run_kubectl_apply_tests
Running command: run_kubectl_apply_tests

... skipping 17 lines ...
(Bpod "test-pod" deleted
customresourcedefinition.apiextensions.k8s.io/resources.mygroup.example.com created
I1109 01:42:48.687797   51457 client.go:361] parsed scheme: "endpoint"
I1109 01:42:48.687849   51457 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1109 01:42:48.692026   51457 controller.go:606] quota admission added evaluator for: resources.mygroup.example.com
kind.mygroup.example.com/myobj serverside-applied (server dry run)
Error from server (NotFound): resources.mygroup.example.com "myobj" not found
customresourcedefinition.apiextensions.k8s.io "resources.mygroup.example.com" deleted
+++ exit code: 0
Recording: run_kubectl_run_tests
Running command: run_kubectl_run_tests

+++ Running case: test-cmd.run_kubectl_run_tests 
... skipping 102 lines ...
Context "test" modified.
+++ [1109 01:42:51] Testing kubectl create filter
create.sh:30: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
pod/selector-test-pod created
create.sh:34: Successful get pods selector-test-pod {{.metadata.labels.name}}: selector-test-pod
Successful
message:Error from server (NotFound): pods "selector-test-pod-dont-apply" not found
has:pods "selector-test-pod-dont-apply" not found
pod "selector-test-pod" deleted
+++ exit code: 0
Recording: run_kubectl_apply_deployments_tests
Running command: run_kubectl_apply_deployments_tests

... skipping 29 lines ...
I1109 01:42:54.198585   54890 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1573263771-7115", Name:"nginx", UID:"921116b1-eae8-4ddd-850c-60f917404d66", APIVersion:"apps/v1", ResourceVersion:"622", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-8484dd655 to 3
I1109 01:42:54.201774   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263771-7115", Name:"nginx-8484dd655", UID:"6f74f703-e647-473c-b006-80a32a5e6eb2", APIVersion:"apps/v1", ResourceVersion:"623", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-8484dd655-57rs2
I1109 01:42:54.205546   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263771-7115", Name:"nginx-8484dd655", UID:"6f74f703-e647-473c-b006-80a32a5e6eb2", APIVersion:"apps/v1", ResourceVersion:"623", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-8484dd655-zk5l5
I1109 01:42:54.206568   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263771-7115", Name:"nginx-8484dd655", UID:"6f74f703-e647-473c-b006-80a32a5e6eb2", APIVersion:"apps/v1", ResourceVersion:"623", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-8484dd655-xm8v4
apps.sh:148: Successful get deployment nginx {{.metadata.name}}: nginx
Successful
message:Error from server (Conflict): error when applying patch:
{"metadata":{"annotations":{"kubectl.kubernetes.io/last-applied-configuration":"{\"apiVersion\":\"apps/v1\",\"kind\":\"Deployment\",\"metadata\":{\"annotations\":{},\"labels\":{\"name\":\"nginx\"},\"name\":\"nginx\",\"namespace\":\"namespace-1573263771-7115\",\"resourceVersion\":\"99\"},\"spec\":{\"replicas\":3,\"selector\":{\"matchLabels\":{\"name\":\"nginx2\"}},\"template\":{\"metadata\":{\"labels\":{\"name\":\"nginx2\"}},\"spec\":{\"containers\":[{\"image\":\"k8s.gcr.io/nginx:test-cmd\",\"name\":\"nginx\",\"ports\":[{\"containerPort\":80}]}]}}}}\n"},"resourceVersion":"99"},"spec":{"selector":{"matchLabels":{"name":"nginx2"}},"template":{"metadata":{"labels":{"name":"nginx2"}}}}}
to:
Resource: "apps/v1, Resource=deployments", GroupVersionKind: "apps/v1, Kind=Deployment"
Name: "nginx", Namespace: "namespace-1573263771-7115"
Object: &{map["apiVersion":"apps/v1" "kind":"Deployment" "metadata":map["annotations":map["deployment.kubernetes.io/revision":"1" "kubectl.kubernetes.io/last-applied-configuration":"{\"apiVersion\":\"apps/v1\",\"kind\":\"Deployment\",\"metadata\":{\"annotations\":{},\"labels\":{\"name\":\"nginx\"},\"name\":\"nginx\",\"namespace\":\"namespace-1573263771-7115\"},\"spec\":{\"replicas\":3,\"selector\":{\"matchLabels\":{\"name\":\"nginx1\"}},\"template\":{\"metadata\":{\"labels\":{\"name\":\"nginx1\"}},\"spec\":{\"containers\":[{\"image\":\"k8s.gcr.io/nginx:test-cmd\",\"name\":\"nginx\",\"ports\":[{\"containerPort\":80}]}]}}}}\n"] "creationTimestamp":"2019-11-09T01:42:54Z" "generation":'\x01' "labels":map["name":"nginx"] "name":"nginx" "namespace":"namespace-1573263771-7115" "resourceVersion":"635" "selfLink":"/apis/apps/v1/namespaces/namespace-1573263771-7115/deployments/nginx" "uid":"921116b1-eae8-4ddd-850c-60f917404d66"] "spec":map["progressDeadlineSeconds":'\u0258' "replicas":'\x03' "revisionHistoryLimit":'\n' "selector":map["matchLabels":map["name":"nginx1"]] "strategy":map["rollingUpdate":map["maxSurge":"25%" "maxUnavailable":"25%"] "type":"RollingUpdate"] "template":map["metadata":map["creationTimestamp":<nil> "labels":map["name":"nginx1"]] "spec":map["containers":[map["image":"k8s.gcr.io/nginx:test-cmd" "imagePullPolicy":"IfNotPresent" "name":"nginx" "ports":[map["containerPort":'P' "protocol":"TCP"]] "resources":map[] "terminationMessagePath":"/dev/termination-log" "terminationMessagePolicy":"File"]] "dnsPolicy":"ClusterFirst" "restartPolicy":"Always" "schedulerName":"default-scheduler" "securityContext":map[] "terminationGracePeriodSeconds":'\x1e']]] "status":map["conditions":[map["lastTransitionTime":"2019-11-09T01:42:54Z" "lastUpdateTime":"2019-11-09T01:42:54Z" "message":"Deployment does not have minimum availability." "reason":"MinimumReplicasUnavailable" "status":"False" "type":"Available"] map["lastTransitionTime":"2019-11-09T01:42:54Z" "lastUpdateTime":"2019-11-09T01:42:54Z" "message":"ReplicaSet \"nginx-8484dd655\" is progressing." "reason":"ReplicaSetUpdated" "status":"True" "type":"Progressing"]] "observedGeneration":'\x01' "replicas":'\x03' "unavailableReplicas":'\x03' "updatedReplicas":'\x03']]}
for: "hack/testdata/deployment-label-change2.yaml": Operation cannot be fulfilled on deployments.apps "nginx": the object has been modified; please apply your changes to the latest version and try again
has:Error from server (Conflict)
I1109 01:43:00.069112   54890 horizontal.go:341] Horizontal Pod Autoscaler frontend has been deleted in namespace-1573263763-23248
deployment.apps/nginx configured
I1109 01:43:03.713924   54890 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1573263771-7115", Name:"nginx", UID:"dfaf0870-7574-4eca-ab8f-6f03b27ae4b0", APIVersion:"apps/v1", ResourceVersion:"664", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-668b6c7744 to 3
I1109 01:43:03.717081   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263771-7115", Name:"nginx-668b6c7744", UID:"a2f300b2-85d9-4dd6-83dc-cbab76f73aef", APIVersion:"apps/v1", ResourceVersion:"665", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-668b6c7744-tjp8c
I1109 01:43:03.720823   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263771-7115", Name:"nginx-668b6c7744", UID:"a2f300b2-85d9-4dd6-83dc-cbab76f73aef", APIVersion:"apps/v1", ResourceVersion:"665", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-668b6c7744-cgdbj
I1109 01:43:03.721638   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263771-7115", Name:"nginx-668b6c7744", UID:"a2f300b2-85d9-4dd6-83dc-cbab76f73aef", APIVersion:"apps/v1", ResourceVersion:"665", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-668b6c7744-tqx7x
... skipping 142 lines ...
+++ [1109 01:43:10] Creating namespace namespace-1573263790-10806
namespace/namespace-1573263790-10806 created
Context "test" modified.
+++ [1109 01:43:11] Testing kubectl get
get.sh:29: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:Error from server (NotFound): pods "abc" not found
has:pods "abc" not found
get.sh:37: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:Error from server (NotFound): pods "abc" not found
has:pods "abc" not found
get.sh:45: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:{
    "apiVersion": "v1",
    "items": [],
... skipping 23 lines ...
has not:No resources found
Successful
message:NAME
has not:No resources found
get.sh:73: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:error: the server doesn't have a resource type "foobar"
has not:No resources found
Successful
message:No resources found in namespace-1573263790-10806 namespace.
has:No resources found
Successful
message:
has not:No resources found
Successful
message:No resources found in namespace-1573263790-10806 namespace.
has:No resources found
get.sh:93: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:Error from server (NotFound): pods "abc" not found
has:pods "abc" not found
Successful
message:Error from server (NotFound): pods "abc" not found
has not:List
Successful
message:I1109 01:43:12.941197   65270 loader.go:375] Config loaded from file:  /tmp/tmp.JlPNjhJbR0/.kube/config
I1109 01:43:12.943031   65270 round_trippers.go:443] GET http://127.0.0.1:8080/version?timeout=32s 200 OK in 1 milliseconds
I1109 01:43:12.972998   65270 round_trippers.go:443] GET http://127.0.0.1:8080/api/v1/namespaces/default/pods 200 OK in 2 milliseconds
I1109 01:43:12.974744   65270 round_trippers.go:443] GET http://127.0.0.1:8080/api/v1/namespaces/default/replicationcontrollers 200 OK in 1 milliseconds
... skipping 653 lines ...
Successful
message:NAME    DATA   AGE
one     0      0s
three   0      0s
two     0      0s
STATUS    REASON          MESSAGE
Failure   InternalError   an error on the server ("unable to decode an event from the watch stream: net/http: request canceled (Client.Timeout exceeded while reading body)") has prevented the request from succeeding
has not:watch is only supported on individual resources
Successful
message:STATUS    REASON          MESSAGE
Failure   InternalError   an error on the server ("unable to decode an event from the watch stream: net/http: request canceled (Client.Timeout exceeded while reading body)") has prevented the request from succeeding
has not:watch is only supported on individual resources
+++ [1109 01:43:19] Creating namespace namespace-1573263799-24208
namespace/namespace-1573263799-24208 created
Context "test" modified.
get.sh:153: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
pod/valid-pod created
... skipping 56 lines ...
}
get.sh:158: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
(B<no value>Successful
message:valid-pod:
has:valid-pod:
Successful
message:error: error executing jsonpath "{.missing}": Error executing template: missing is not found. Printing more information for debugging the template:
	template was:
		{.missing}
	object given to jsonpath engine was:
		map[string]interface {}{"apiVersion":"v1", "kind":"Pod", "metadata":map[string]interface {}{"creationTimestamp":"2019-11-09T01:43:20Z", "labels":map[string]interface {}{"name":"valid-pod"}, "name":"valid-pod", "namespace":"namespace-1573263799-24208", "resourceVersion":"752", "selfLink":"/api/v1/namespaces/namespace-1573263799-24208/pods/valid-pod", "uid":"f8b2a00a-63db-432f-ba41-6ca390387394"}, "spec":map[string]interface {}{"containers":[]interface {}{map[string]interface {}{"image":"k8s.gcr.io/serve_hostname", "imagePullPolicy":"Always", "name":"kubernetes-serve-hostname", "resources":map[string]interface {}{"limits":map[string]interface {}{"cpu":"1", "memory":"512Mi"}, "requests":map[string]interface {}{"cpu":"1", "memory":"512Mi"}}, "terminationMessagePath":"/dev/termination-log", "terminationMessagePolicy":"File"}}, "dnsPolicy":"ClusterFirst", "enableServiceLinks":true, "priority":0, "restartPolicy":"Always", "schedulerName":"default-scheduler", "securityContext":map[string]interface {}{}, "terminationGracePeriodSeconds":30}, "status":map[string]interface {}{"phase":"Pending", "qosClass":"Guaranteed"}}
has:missing is not found
error: error executing template "{{.missing}}": template: output:1:2: executing "output" at <.missing>: map has no entry for key "missing"
Successful
message:Error executing template: template: output:1:2: executing "output" at <.missing>: map has no entry for key "missing". Printing more information for debugging the template:
	template was:
		{{.missing}}
	raw data was:
		{"apiVersion":"v1","kind":"Pod","metadata":{"creationTimestamp":"2019-11-09T01:43:20Z","labels":{"name":"valid-pod"},"name":"valid-pod","namespace":"namespace-1573263799-24208","resourceVersion":"752","selfLink":"/api/v1/namespaces/namespace-1573263799-24208/pods/valid-pod","uid":"f8b2a00a-63db-432f-ba41-6ca390387394"},"spec":{"containers":[{"image":"k8s.gcr.io/serve_hostname","imagePullPolicy":"Always","name":"kubernetes-serve-hostname","resources":{"limits":{"cpu":"1","memory":"512Mi"},"requests":{"cpu":"1","memory":"512Mi"}},"terminationMessagePath":"/dev/termination-log","terminationMessagePolicy":"File"}],"dnsPolicy":"ClusterFirst","enableServiceLinks":true,"priority":0,"restartPolicy":"Always","schedulerName":"default-scheduler","securityContext":{},"terminationGracePeriodSeconds":30},"status":{"phase":"Pending","qosClass":"Guaranteed"}}
	object given to template engine was:
		map[apiVersion:v1 kind:Pod metadata:map[creationTimestamp:2019-11-09T01:43:20Z labels:map[name:valid-pod] name:valid-pod namespace:namespace-1573263799-24208 resourceVersion:752 selfLink:/api/v1/namespaces/namespace-1573263799-24208/pods/valid-pod uid:f8b2a00a-63db-432f-ba41-6ca390387394] spec:map[containers:[map[image:k8s.gcr.io/serve_hostname imagePullPolicy:Always name:kubernetes-serve-hostname resources:map[limits:map[cpu:1 memory:512Mi] requests:map[cpu:1 memory:512Mi]] terminationMessagePath:/dev/termination-log terminationMessagePolicy:File]] dnsPolicy:ClusterFirst enableServiceLinks:true priority:0 restartPolicy:Always schedulerName:default-scheduler securityContext:map[] terminationGracePeriodSeconds:30] status:map[phase:Pending qosClass:Guaranteed]]
has:map has no entry for key "missing"
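(Same negative test, but through the Go template printer: a missing map key makes text/template fail with "map has no entry for key". Presumably produced by something like the following; the flag value is an assumption.)
  kubectl get pod valid-pod -o go-template='{{.missing}}'   # fails: map has no entry for key "missing"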
Successful
message:NAME        READY   STATUS    RESTARTS   AGE
valid-pod   0/1     Pending   0          0s
STATUS      REASON          MESSAGE
Failure     InternalError   an error on the server ("unable to decode an event from the watch stream: net/http: request canceled (Client.Timeout exceeded while reading body)") has prevented the request from succeeding
has:STATUS
Successful
message:NAME        READY   STATUS    RESTARTS   AGE
valid-pod   0/1     Pending   0          0s
STATUS      REASON          MESSAGE
Failure     InternalError   an error on the server ("unable to decode an event from the watch stream: net/http: request canceled (Client.Timeout exceeded while reading body)") has prevented the request from succeeding
has:valid-pod
Successful
message:pod/valid-pod
status/<unknown>
has not:STATUS
Successful
... skipping 45 lines ...
      (Client.Timeout exceeded while reading body)'
    reason: UnexpectedServerResponse
  - message: 'unable to decode an event from the watch stream: net/http: request canceled
      (Client.Timeout exceeded while reading body)'
    reason: ClientWatchDecoding
kind: Status
message: 'an error on the server ("unable to decode an event from the watch stream:
  net/http: request canceled (Client.Timeout exceeded while reading body)") has prevented
  the request from succeeding'
metadata: {}
reason: InternalError
status: Failure
has not:STATUS
... skipping 42 lines ...
      (Client.Timeout exceeded while reading body)'
    reason: UnexpectedServerResponse
  - message: 'unable to decode an event from the watch stream: net/http: request canceled
      (Client.Timeout exceeded while reading body)'
    reason: ClientWatchDecoding
kind: Status
message: 'an error on the server ("unable to decode an event from the watch stream:
  net/http: request canceled (Client.Timeout exceeded while reading body)") has prevented
  the request from succeeding'
metadata: {}
reason: InternalError
status: Failure
has:name: valid-pod
Successful
message:Error from server (NotFound): pods "invalid-pod" not found
has:"invalid-pod" not found
pod "valid-pod" deleted
get.sh:196: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
pod/redis-master created
pod/valid-pod created
Successful
... skipping 35 lines ...
+++ command: run_kubectl_exec_pod_tests
+++ [1109 01:43:25] Creating namespace namespace-1573263805-16762
namespace/namespace-1573263805-16762 created
Context "test" modified.
+++ [1109 01:43:25] Testing kubectl exec POD COMMAND
Successful
message:Error from server (NotFound): pods "abc" not found
has:pods "abc" not found
pod/test-pod created
Successful
message:Error from server (BadRequest): pod test-pod does not have a host assigned
has not:pods "test-pod" not found
Successful
message:Error from server (BadRequest): pod test-pod does not have a host assigned
has not:pod or type/name must be specified
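(The BadRequest responses in this exec test come from a pod that was never scheduled: test-cmd runs against a bare API server with no kubelet, so spec.nodeName stays empty and the server refuses the exec. A sketch of the calls being exercised, with the exact arguments assumed:)
  kubectl exec abc -- date        # NotFound: pods "abc" not found
  kubectl exec test-pod -- date   # BadRequest: pod test-pod does not have a host assigned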
pod "test-pod" deleted
+++ exit code: 0
Recording: run_kubectl_exec_resource_name_tests
Running command: run_kubectl_exec_resource_name_tests

... skipping 2 lines ...
+++ command: run_kubectl_exec_resource_name_tests
+++ [1109 01:43:26] Creating namespace namespace-1573263806-6456
namespace/namespace-1573263806-6456 created
Context "test" modified.
+++ [1109 01:43:26] Testing kubectl exec TYPE/NAME COMMAND
Successful
message:error: the server doesn't have a resource type "foo"
has:error:
Successful
message:Error from server (NotFound): deployments.apps "bar" not found
has:"bar" not found
pod/test-pod created
replicaset.apps/frontend created
I1109 01:43:27.566401   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263806-6456", Name:"frontend", UID:"9df903f0-a570-4ec3-9993-ead97cd429bb", APIVersion:"apps/v1", ResourceVersion:"813", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-m6pvw
I1109 01:43:27.570835   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263806-6456", Name:"frontend", UID:"9df903f0-a570-4ec3-9993-ead97cd429bb", APIVersion:"apps/v1", ResourceVersion:"813", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-8f4kj
I1109 01:43:27.571004   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263806-6456", Name:"frontend", UID:"9df903f0-a570-4ec3-9993-ead97cd429bb", APIVersion:"apps/v1", ResourceVersion:"813", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-kcnmx
configmap/test-set-env-config created
Successful
message:error: cannot attach to *v1.ConfigMap: selector for *v1.ConfigMap not implemented
has:not implemented
Successful
message:Error from server (BadRequest): pod test-pod does not have a host assigned
has not:not found
Successful
message:Error from server (BadRequest): pod test-pod does not have a host assigned
has not:pod or type/name must be specified
Successful
message:Error from server (BadRequest): pod frontend-8f4kj does not have a host assigned
has not:not found
Successful
message:Error from server (BadRequest): pod frontend-8f4kj does not have a host assigned
has not:pod or type/name must be specified
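(kubectl exec TYPE/NAME resolves the target pod through the named resource, which is why exec'ing the ReplicaSet lands on one of its pods (frontend-8f4kj) while a ConfigMap is rejected for having no pod selector. Roughly, with the invocations assumed:)
  kubectl exec rs/frontend -- date                    # picks a frontend-* pod; then fails only because it is unscheduled
  kubectl exec configmap/test-set-env-config -- date  # fails: selector for *v1.ConfigMap not implemented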
pod "test-pod" deleted
replicaset.apps "frontend" deleted
configmap "test-set-env-config" deleted
+++ exit code: 0
Recording: run_create_secret_tests
Running command: run_create_secret_tests

+++ Running case: test-cmd.run_create_secret_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_create_secret_tests
Successful
message:Error from server (NotFound): secrets "mysecret" not found
has:secrets "mysecret" not found
Successful
message:Error from server (NotFound): secrets "mysecret" not found
has:secrets "mysecret" not found
Successful
message:user-specified
has:user-specified
Successful
{"kind":"ConfigMap","apiVersion":"v1","metadata":{"name":"tester-update-cm","namespace":"default","selfLink":"/api/v1/namespaces/default/configmaps/tester-update-cm","uid":"10ba7b76-a596-4171-bd1f-e8e36f65f4e7","resourceVersion":"835","creationTimestamp":"2019-11-09T01:43:29Z"}}
... skipping 2 lines ...
has:uid
Successful
message:{"kind":"ConfigMap","apiVersion":"v1","metadata":{"name":"tester-update-cm","namespace":"default","selfLink":"/api/v1/namespaces/default/configmaps/tester-update-cm","uid":"10ba7b76-a596-4171-bd1f-e8e36f65f4e7","resourceVersion":"836","creationTimestamp":"2019-11-09T01:43:29Z"},"data":{"key1":"config1"}}
has:config1
{"kind":"Status","apiVersion":"v1","metadata":{},"status":"Success","details":{"name":"tester-update-cm","kind":"configmaps","uid":"10ba7b76-a596-4171-bd1f-e8e36f65f4e7"}}
Successful
message:Error from server (NotFound): configmaps "tester-update-cm" not found
has:configmaps "tester-update-cm" not found
+++ exit code: 0
Recording: run_kubectl_create_kustomization_directory_tests
Running command: run_kubectl_create_kustomization_directory_tests

+++ Running case: test-cmd.run_kubectl_create_kustomization_directory_tests 
... skipping 110 lines ...
valid-pod   0/1     Pending   0          1s
has:valid-pod
Successful
message:NAME        READY   STATUS    RESTARTS   AGE
valid-pod   0/1     Pending   0          1s
STATUS      REASON          MESSAGE
Failure     InternalError   an error on the server ("unable to decode an event from the watch stream: net/http: request canceled (Client.Timeout exceeded while reading body)") has prevented the request from succeeding
has:Timeout exceeded while reading body
Successful
message:NAME        READY   STATUS    RESTARTS   AGE
valid-pod   0/1     Pending   0          2s
has:valid-pod
Successful
message:error: Invalid timeout value. Timeout must be a single integer in seconds, or an integer followed by a corresponding time unit (e.g. 1s | 2m | 3h)
has:Invalid timeout value
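(The block above exercises client-side request timeouts on kubectl get: a very short timeout on a watch surfaces the "Client.Timeout exceeded while reading body" status, valid integer or unit-suffixed values are accepted, and a malformed value is rejected before any request is sent. Assumed shape of the failing call:)
  kubectl get pod valid-pod --request-timeout=invalid   # error: Invalid timeout value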
pod "valid-pod" deleted
+++ exit code: 0
Recording: run_crd_tests
Running command: run_crd_tests

... skipping 158 lines ...
foo.company.com/test patched
crd.sh:236: Successful get foos/test {{.patched}}: value1
foo.company.com/test patched
crd.sh:238: Successful get foos/test {{.patched}}: value2
foo.company.com/test patched
crd.sh:240: Successful get foos/test {{.patched}}: <no value>
+++ [1109 01:43:38] "kubectl patch --local" returns error as expected for CustomResource: error: cannot apply strategic merge patch for company.com/v1, Kind=Foo locally, try --type merge
{
    "apiVersion": "company.com/v1",
    "kind": "Foo",
    "metadata": {
        "annotations": {
            "kubernetes.io/change-cause": "kubectl patch foos/test --server=http://127.0.0.1:8080 --match-server-version=true --patch={\"patched\":null} --type=merge --record=true"
... skipping 190 lines ...
crd.sh:450: Successful get bars {{range.items}}{{.metadata.name}}:{{end}}: 
namespace/non-native-resources created
bar.company.com/test created
crd.sh:455: Successful get bars {{len .items}}: 1
namespace "non-native-resources" deleted
crd.sh:458: Successful get bars {{len .items}}: 0
Error from server (NotFound): namespaces "non-native-resources" not found
customresourcedefinition.apiextensions.k8s.io "foos.company.com" deleted
customresourcedefinition.apiextensions.k8s.io "bars.company.com" deleted
customresourcedefinition.apiextensions.k8s.io "resources.mygroup.example.com" deleted
customresourcedefinition.apiextensions.k8s.io "validfoos.company.com" deleted
+++ exit code: 0
Recording: run_cmd_with_img_tests
... skipping 11 lines ...
I1109 01:44:09.344588   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263849-26674", Name:"test1-6cdffdb5b8", UID:"27e6585d-52f6-48d4-bafd-4dabe722f21c", APIVersion:"apps/v1", ResourceVersion:"1016", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test1-6cdffdb5b8-d4xsx
Successful
message:deployment.apps/test1 created
has:deployment.apps/test1 created
deployment.apps "test1" deleted
Successful
message:error: Invalid image name "InvalidImageName": invalid reference format
has:error: Invalid image name "InvalidImageName": invalid reference format
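(This case covers kubectl's client-side validation of image references: a parseable image name goes through and produces deployment.apps/test1, while one that cannot be parsed as a container image reference is rejected locally. The exact commands are assumptions; something like:)
  kubectl run test1 --image=validname          # accepted
  kubectl run test2 --image=InvalidImageName   # error: Invalid image name "InvalidImageName": invalid reference format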
+++ exit code: 0
+++ [1109 01:44:09] Testing recursive resources
+++ [1109 01:44:09] Creating namespace namespace-1573263849-3277
namespace/namespace-1573263849-3277 created
Context "test" modified.
W1109 01:44:09.757935   51457 cacher.go:162] Terminating all watchers from cacher *unstructured.Unstructured
E1109 01:44:09.759271   54890 reflector.go:320] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to watch *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:202: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
W1109 01:44:09.854868   51457 cacher.go:162] Terminating all watchers from cacher *unstructured.Unstructured
E1109 01:44:09.856519   54890 reflector.go:320] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to watch *v1.PartialObjectMetadata: the server could not find the requested resource
W1109 01:44:09.945195   51457 cacher.go:162] Terminating all watchers from cacher *unstructured.Unstructured
E1109 01:44:09.946414   54890 reflector.go:320] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to watch *v1.PartialObjectMetadata: the server could not find the requested resource
W1109 01:44:10.042213   51457 cacher.go:162] Terminating all watchers from cacher *unstructured.Unstructured
E1109 01:44:10.043619   54890 reflector.go:320] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to watch *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:206: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
Successful
message:pod/busybox0 created
pod/busybox1 created
error: error validating "hack/testdata/recursive/pod/pod/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
has:error validating data: kind not set
generic-resources.sh:211: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
generic-resources.sh:220: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: busybox:busybox:
Successful
message:error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
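(Every "kind not set" / "Object 'Kind' is missing" failure in this recursive-resources suite comes from the same fixture: the busybox-broken.yaml echoed in the errors spells the kind field as "ind", so the decoder cannot tell what type the object is, while the other manifests in the directory are valid. Assumed shape of the command under test:)
  kubectl create -f hack/testdata/recursive/pod --recursive=true   # busybox0/busybox1 created; busybox-broken.yaml rejected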
generic-resources.sh:227: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
generic-resources.sh:231: Successful get pods {{range.items}}{{.metadata.labels.status}}:{{end}}: replaced:replaced:
Successful
message:pod/busybox0 replaced
pod/busybox1 replaced
error: error validating "hack/testdata/recursive/pod-modify/pod/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
has:error validating data: kind not set
E1109 01:44:10.760620   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:236: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
E1109 01:44:10.857715   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:Name:         busybox0
Namespace:    namespace-1573263849-3277
Priority:     0
Node:         <none>
Labels:       app=busybox0
... skipping 153 lines ...
QoS Class:        BestEffort
Node-Selectors:   <none>
Tolerations:      <none>
Events:           <none>
unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
E1109 01:44:10.947490   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:246: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
E1109 01:44:11.044910   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:250: Successful get pods {{range.items}}{{.metadata.annotations.annotatekey}}:{{end}}: annotatevalue:annotatevalue:
Successful
message:pod/busybox0 annotated
pod/busybox1 annotated
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
generic-resources.sh:255: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
generic-resources.sh:259: Successful get pods {{range.items}}{{.metadata.labels.status}}:{{end}}: replaced:replaced:
Successful
message:Warning: kubectl apply should be used on resource created by either kubectl create --save-config or kubectl apply
pod/busybox0 configured
Warning: kubectl apply should be used on resource created by either kubectl create --save-config or kubectl apply
pod/busybox1 configured
error: error validating "hack/testdata/recursive/pod-modify/pod/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
has:error validating data: kind not set
generic-resources.sh:265: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
E1109 01:44:11.761870   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx created
I1109 01:44:11.850179   54890 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1573263849-3277", Name:"nginx", UID:"ec7d4c05-2a7f-4334-a956-e0e8b164cf37", APIVersion:"apps/v1", ResourceVersion:"1042", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-f87d999f7 to 3
I1109 01:44:11.856218   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263849-3277", Name:"nginx-f87d999f7", UID:"0cb252e4-9f0d-4188-9c29-ffc71f15fe8a", APIVersion:"apps/v1", ResourceVersion:"1043", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-f87d999f7-9p6c9
E1109 01:44:11.859043   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I1109 01:44:11.859902   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263849-3277", Name:"nginx-f87d999f7", UID:"0cb252e4-9f0d-4188-9c29-ffc71f15fe8a", APIVersion:"apps/v1", ResourceVersion:"1043", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-f87d999f7-77gjk
I1109 01:44:11.860810   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263849-3277", Name:"nginx-f87d999f7", UID:"0cb252e4-9f0d-4188-9c29-ffc71f15fe8a", APIVersion:"apps/v1", ResourceVersion:"1043", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-f87d999f7-fffxz
E1109 01:44:11.948634   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:269: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx:
generic-resources.sh:270: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
E1109 01:44:12.046082   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
kubectl convert is DEPRECATED and will be removed in a future version.
In order to convert, kubectl apply the object to the cluster, then kubectl get at the desired version.
generic-resources.sh:274: Successful get deployment nginx {{ .apiVersion }}: apps/v1
Successful
message:apiVersion: extensions/v1beta1
kind: Deployment
... skipping 40 lines ...
deployment.apps "nginx" deleted
generic-resources.sh:281: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
generic-resources.sh:285: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
Successful
message:kubectl convert is DEPRECATED and will be removed in a future version.
In order to convert, kubectl apply the object to the cluster, then kubectl get at the desired version.
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
generic-resources.sh:290: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
Successful
message:busybox0:busybox1:error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:busybox0:busybox1:
Successful
message:busybox0:busybox1:error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
E1109 01:44:12.763304   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:299: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
E1109 01:44:12.860397   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
pod/busybox0 labeled
pod/busybox1 labeled
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
E1109 01:44:12.949867   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:304: Successful get pods {{range.items}}{{.metadata.labels.mylabel}}:{{end}}: myvalue:myvalue:
Successful
message:pod/busybox0 labeled
pod/busybox1 labeled
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
E1109 01:44:13.047070   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:309: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
pod/busybox0 patched
pod/busybox1 patched
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
generic-resources.sh:314: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: prom/busybox:prom/busybox:
Successful
message:pod/busybox0 patched
pod/busybox1 patched
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
generic-resources.sh:319: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
generic-resources.sh:323: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
pod "busybox0" force deleted
pod "busybox1" force deleted
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
I1109 01:44:13.510002   54890 namespace_controller.go:185] Namespace has been deleted non-native-resources
generic-resources.sh:328: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
replicationcontroller/busybox0 created
I1109 01:44:13.728977   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1573263849-3277", Name:"busybox0", UID:"7b9ee07d-c46c-4d75-bd0a-23dde4753836", APIVersion:"v1", ResourceVersion:"1074", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox0-trhfw
replicationcontroller/busybox1 created
error: error validating "hack/testdata/recursive/rc/rc/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
I1109 01:44:13.734231   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1573263849-3277", Name:"busybox1", UID:"92fbf2e8-7923-42a2-acdc-d133d62e4201", APIVersion:"v1", ResourceVersion:"1076", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox1-gpz9z
E1109 01:44:13.764635   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:332: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
E1109 01:44:13.861490   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:337: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
E1109 01:44:13.950942   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:338: Successful get rc busybox0 {{.spec.replicas}}: 1
E1109 01:44:14.048395   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:339: Successful get rc busybox1 {{.spec.replicas}}: 1
generic-resources.sh:344: Successful get hpa busybox0 {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 1 2 80
generic-resources.sh:345: Successful get hpa busybox1 {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 1 2 80
Successful
message:horizontalpodautoscaler.autoscaling/busybox0 autoscaled
horizontalpodautoscaler.autoscaling/busybox1 autoscaled
error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
has:Object 'Kind' is missing
horizontalpodautoscaler.autoscaling "busybox0" deleted
horizontalpodautoscaler.autoscaling "busybox1" deleted
generic-resources.sh:353: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
generic-resources.sh:354: Successful get rc busybox0 {{.spec.replicas}}: 1
E1109 01:44:14.765683   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:355: Successful get rc busybox1 {{.spec.replicas}}: 1
E1109 01:44:14.862825   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:14.952149   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:359: Successful get service busybox0 {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: <no value> 80
generic-resources.sh:360: Successful get service busybox1 {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: <no value> 80
Successful
message:service/busybox0 exposed
service/busybox1 exposed
error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
has:Object 'Kind' is missing
E1109 01:44:15.049448   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:366: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
generic-resources.sh:367: Successful get rc busybox0 {{.spec.replicas}}: 1
generic-resources.sh:368: Successful get rc busybox1 {{.spec.replicas}}: 1
I1109 01:44:15.389211   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1573263849-3277", Name:"busybox0", UID:"7b9ee07d-c46c-4d75-bd0a-23dde4753836", APIVersion:"v1", ResourceVersion:"1098", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox0-kpdwn
I1109 01:44:15.399290   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1573263849-3277", Name:"busybox1", UID:"92fbf2e8-7923-42a2-acdc-d133d62e4201", APIVersion:"v1", ResourceVersion:"1104", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox1-7wjvn
generic-resources.sh:372: Successful get rc busybox0 {{.spec.replicas}}: 2
generic-resources.sh:373: Successful get rc busybox1 {{.spec.replicas}}: 2
Successful
message:replicationcontroller/busybox0 scaled
replicationcontroller/busybox1 scaled
error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
has:Object 'Kind' is missing
generic-resources.sh:378: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
E1109 01:44:15.766627   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:382: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
replicationcontroller "busybox0" force deleted
replicationcontroller "busybox1" force deleted
error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
has:Object 'Kind' is missing
E1109 01:44:15.863964   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:387: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
E1109 01:44:15.953383   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:16.050665   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx1-deployment created
deployment.apps/nginx0-deployment created
error: error validating "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
I1109 01:44:16.078646   54890 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1573263849-3277", Name:"nginx1-deployment", UID:"b5ceaac6-99b1-419c-986c-86141b3eeb57", APIVersion:"apps/v1", ResourceVersion:"1120", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx1-deployment-7bdbbfb5cf to 2
I1109 01:44:16.084632   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263849-3277", Name:"nginx1-deployment-7bdbbfb5cf", UID:"2f825d2b-8a6e-4049-94f1-c1a9ab0e687e", APIVersion:"apps/v1", ResourceVersion:"1121", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx1-deployment-7bdbbfb5cf-wrlbc
I1109 01:44:16.084635   54890 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1573263849-3277", Name:"nginx0-deployment", UID:"945901cd-3ae0-4f4f-a7a8-acecb79d50fd", APIVersion:"apps/v1", ResourceVersion:"1122", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx0-deployment-57c6bff7f6 to 2
I1109 01:44:16.091279   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263849-3277", Name:"nginx0-deployment-57c6bff7f6", UID:"679f199d-7468-4970-9690-0b9e009788eb", APIVersion:"apps/v1", ResourceVersion:"1125", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx0-deployment-57c6bff7f6-mnfjz
I1109 01:44:16.091324   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263849-3277", Name:"nginx1-deployment-7bdbbfb5cf", UID:"2f825d2b-8a6e-4049-94f1-c1a9ab0e687e", APIVersion:"apps/v1", ResourceVersion:"1121", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx1-deployment-7bdbbfb5cf-6vh8f
I1109 01:44:16.094144   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263849-3277", Name:"nginx0-deployment-57c6bff7f6", UID:"679f199d-7468-4970-9690-0b9e009788eb", APIVersion:"apps/v1", ResourceVersion:"1125", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx0-deployment-57c6bff7f6-7nkm7
generic-resources.sh:391: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx0-deployment:nginx1-deployment:
generic-resources.sh:392: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:k8s.gcr.io/nginx:1.7.9:
generic-resources.sh:396: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:k8s.gcr.io/nginx:1.7.9:
Successful
message:deployment.apps/nginx1-deployment skipped rollback (current template already matches revision 1)
deployment.apps/nginx0-deployment skipped rollback (current template already matches revision 1)
error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
has:Object 'Kind' is missing
deployment.apps/nginx1-deployment paused
deployment.apps/nginx0-deployment paused
generic-resources.sh:404: Successful get deployment {{range.items}}{{.spec.paused}}:{{end}}: true:true:
Successful
message:unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
has:Object 'Kind' is missing
deployment.apps/nginx1-deployment resumed
deployment.apps/nginx0-deployment resumed
E1109 01:44:16.767931   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:410: Successful get deployment {{range.items}}{{.spec.paused}}:{{end}}: <no value>:<no value>:
Successful
message:unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
has:Object 'Kind' is missing
E1109 01:44:16.865242   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:deployment.apps/nginx1-deployment 
REVISION  CHANGE-CAUSE
1         <none>

deployment.apps/nginx0-deployment 
REVISION  CHANGE-CAUSE
1         <none>

error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
has:nginx0-deployment
Successful
message:deployment.apps/nginx1-deployment 
REVISION  CHANGE-CAUSE
1         <none>

deployment.apps/nginx0-deployment 
REVISION  CHANGE-CAUSE
1         <none>

error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
has:nginx1-deployment
Successful
message:deployment.apps/nginx1-deployment 
REVISION  CHANGE-CAUSE
1         <none>

deployment.apps/nginx0-deployment 
REVISION  CHANGE-CAUSE
1         <none>

error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
has:Object 'Kind' is missing
E1109 01:44:16.954478   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
deployment.apps "nginx1-deployment" force deleted
deployment.apps "nginx0-deployment" force deleted
error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
E1109 01:44:17.051862   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:17.769258   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:17.866512   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:17.955886   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:18.053030   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:426: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
replicationcontroller/busybox0 created
I1109 01:44:18.276139   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1573263849-3277", Name:"busybox0", UID:"8960429b-5c76-49d3-9eca-96d190af7287", APIVersion:"v1", ResourceVersion:"1170", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox0-mxn5c
replicationcontroller/busybox1 created
error: error validating "hack/testdata/recursive/rc/rc/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
I1109 01:44:18.283118   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1573263849-3277", Name:"busybox1", UID:"36e176da-54e1-4c43-9855-6def8390fd42", APIVersion:"v1", ResourceVersion:"1172", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox1-wv8wt
generic-resources.sh:430: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
Successful
message:no rollbacker has been implemented for "ReplicationController"
no rollbacker has been implemented for "ReplicationController"
unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
... skipping 2 lines ...
message:no rollbacker has been implemented for "ReplicationController"
no rollbacker has been implemented for "ReplicationController"
unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
has:Object 'Kind' is missing
Successful
message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
error: replicationcontrollers "busybox0" pausing is not supported
error: replicationcontrollers "busybox1" pausing is not supported
has:Object 'Kind' is missing
Successful
message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
error: replicationcontrollers "busybox0" pausing is not supported
error: replicationcontrollers "busybox1" pausing is not supported
has:replicationcontrollers "busybox0" pausing is not supported
Successful
message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
error: replicationcontrollers "busybox0" pausing is not supported
error: replicationcontrollers "busybox1" pausing is not supported
has:replicationcontrollers "busybox1" pausing is not supported
Successful
message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
error: replicationcontrollers "busybox0" resuming is not supported
error: replicationcontrollers "busybox1" resuming is not supported
has:Object 'Kind' is missing
Successful
message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
error: replicationcontrollers "busybox0" resuming is not supported
error: replicationcontrollers "busybox1" resuming is not supported
has:replicationcontrollers "busybox0" resuming is not supported
Successful
message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
error: replicationcontrollers "busybox0" resuming is not supported
error: replicationcontrollers "busybox1" resuming is not supported
has:replicationcontrollers "busybox1" resuming is not supported
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
replicationcontroller "busybox0" force deleted
replicationcontroller "busybox1" force deleted
error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
E1109 01:44:18.770562   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:18.867738   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:18.957072   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:19.054116   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Recording: run_namespace_tests
Running command: run_namespace_tests

+++ Running case: test-cmd.run_namespace_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_namespace_tests
E1109 01:44:19.772113   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ [1109 01:44:19] Testing kubectl(v1:namespaces)
namespace/my-namespace created
E1109 01:44:19.869011   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1308: Successful get namespaces/my-namespace {{.metadata.name}}: my-namespace
E1109 01:44:19.958291   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace "my-namespace" deleted
E1109 01:44:20.055347   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:20.773372   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:20.870212   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:20.959573   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:21.056691   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:21.774525   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:21.871369   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:21.960957   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:22.057869   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:22.775798   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:22.872547   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:22.962393   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:23.059198   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:23.777261   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:23.873873   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:23.963857   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:24.060598   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:24.778662   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:24.875125   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:24.965080   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:25.061319   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/my-namespace condition met
Successful
message:Error from server (NotFound): namespaces "my-namespace" not found
has: not found
namespace/my-namespace created
core.sh:1317: Successful get namespaces/my-namespace {{.metadata.name}}: my-namespace
(BSuccessful
message:warning: deleting cluster-scoped resources, not scoped to the provided namespace
namespace "kube-node-lease" deleted
... skipping 29 lines ...
namespace "namespace-1573263810-30492" deleted
namespace "namespace-1573263811-12379" deleted
namespace "namespace-1573263813-28823" deleted
namespace "namespace-1573263814-29760" deleted
namespace "namespace-1573263849-26674" deleted
namespace "namespace-1573263849-3277" deleted
Error from server (Forbidden): namespaces "default" is forbidden: this namespace may not be deleted
Error from server (Forbidden): namespaces "kube-public" is forbidden: this namespace may not be deleted
Error from server (Forbidden): namespaces "kube-system" is forbidden: this namespace may not be deleted
has:warning: deleting cluster-scoped resources
Successful
message:warning: deleting cluster-scoped resources, not scoped to the provided namespace
namespace "kube-node-lease" deleted
namespace "my-namespace" deleted
namespace "namespace-1573263715-16568" deleted
... skipping 27 lines ...
namespace "namespace-1573263810-30492" deleted
namespace "namespace-1573263811-12379" deleted
namespace "namespace-1573263813-28823" deleted
namespace "namespace-1573263814-29760" deleted
namespace "namespace-1573263849-26674" deleted
namespace "namespace-1573263849-3277" deleted
Error from server (Forbidden): namespaces "default" is forbidden: this namespace may not be deleted
Error from server (Forbidden): namespaces "kube-public" is forbidden: this namespace may not be deleted
Error from server (Forbidden): namespaces "kube-system" is forbidden: this namespace may not be deleted
has:namespace "my-namespace" deleted
core.sh:1329: Successful get namespaces {{range.items}}{{ if eq .metadata.name \"other\" }}found{{end}}{{end}}:: :
namespace/other created
core.sh:1333: Successful get namespaces/other {{.metadata.name}}: other
E1109 01:44:25.779974   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1337: Successful get pods --namespace=other {{range.items}}{{.metadata.name}}:{{end}}: 
E1109 01:44:25.876384   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:25.966319   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
pod/valid-pod created
E1109 01:44:26.062583   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1341: Successful get pods --namespace=other {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
core.sh:1343: Successful get pods -n other {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
Successful
message:error: a resource cannot be retrieved by name across all namespaces
has:a resource cannot be retrieved by name across all namespaces
core.sh:1350: Successful get pods --namespace=other {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
pod "valid-pod" force deleted
core.sh:1354: Successful get pods --namespace=other {{range.items}}{{.metadata.name}}:{{end}}: 
(Bnamespace "other" deleted
E1109 01:44:26.781218   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:26.877633   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:26.967372   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:27.063760   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I1109 01:44:27.101848   54890 shared_informer.go:197] Waiting for caches to sync for resource quota
I1109 01:44:27.101910   54890 shared_informer.go:204] Caches are synced for resource quota 
I1109 01:44:27.619676   54890 shared_informer.go:197] Waiting for caches to sync for garbage collector
I1109 01:44:27.619747   54890 shared_informer.go:204] Caches are synced for garbage collector 
E1109 01:44:27.782615   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:27.879019   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:27.968647   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:28.065029   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:28.783838   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:28.880339   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:28.969876   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:29.066398   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I1109 01:44:29.171623   54890 horizontal.go:341] Horizontal Pod Autoscaler busybox0 has been deleted in namespace-1573263849-3277
I1109 01:44:29.174588   54890 horizontal.go:341] Horizontal Pod Autoscaler busybox1 has been deleted in namespace-1573263849-3277
E1109 01:44:29.785186   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:29.881711   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:29.971231   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:30.067699   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:30.786880   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:30.884072   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:30.974688   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:31.068633   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ exit code: 0
Recording: run_secrets_test
Running command: run_secrets_test

+++ Running case: test-cmd.run_secrets_test 
E1109 01:44:31.788261   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_secrets_test
+++ [1109 01:44:31] Creating namespace namespace-1573263871-2284
namespace/namespace-1573263871-2284 created
E1109 01:44:31.885185   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Context "test" modified.
+++ [1109 01:44:31] Testing secrets
E1109 01:44:31.976058   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I1109 01:44:32.021278   71695 loader.go:375] Config loaded from file:  /tmp/tmp.JlPNjhJbR0/.kube/config
Successful
message:apiVersion: v1
data:
  key1: dmFsdWUx
kind: Secret
... skipping 25 lines ...
  key1: dmFsdWUx
kind: Secret
metadata:
  creationTimestamp: null
  name: test
has not:example.com
E1109 01:44:32.069927   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:725: Successful get namespaces {{range.items}}{{ if eq .metadata.name \"test-secrets\" }}found{{end}}{{end}}:: :
namespace/test-secrets created
core.sh:729: Successful get namespaces/test-secrets {{.metadata.name}}: test-secrets
core.sh:733: Successful get secrets --namespace=test-secrets {{range.items}}{{.metadata.name}}:{{end}}: 
secret/test-secret created
core.sh:737: Successful get secret/test-secret --namespace=test-secrets {{.metadata.name}}: test-secret
core.sh:738: Successful get secret/test-secret --namespace=test-secrets {{.type}}: test-type
secret "test-secret" deleted
E1109 01:44:32.789510   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:748: Successful get secrets --namespace=test-secrets {{range.items}}{{.metadata.name}}:{{end}}: 
E1109 01:44:32.886525   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
secret/test-secret created
E1109 01:44:32.977201   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:752: Successful get secret/test-secret --namespace=test-secrets {{.metadata.name}}: test-secret
E1109 01:44:33.071138   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:753: Successful get secret/test-secret --namespace=test-secrets {{.type}}: kubernetes.io/dockerconfigjson
(Bsecret "test-secret" deleted
core.sh:763: Successful get secrets --namespace=test-secrets {{range.items}}{{.metadata.name}}:{{end}}: 
secret/test-secret created
core.sh:766: Successful get secret/test-secret --namespace=test-secrets {{.metadata.name}}: test-secret
core.sh:767: Successful get secret/test-secret --namespace=test-secrets {{.type}}: kubernetes.io/tls
secret "test-secret" deleted
secret/test-secret created
E1109 01:44:33.790777   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:773: Successful get secret/test-secret --namespace=test-secrets {{.metadata.name}}: test-secret
E1109 01:44:33.887847   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:774: Successful get secret/test-secret --namespace=test-secrets {{.type}}: kubernetes.io/tls
E1109 01:44:33.978484   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
secret "test-secret" deleted
E1109 01:44:34.072483   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
secret/secret-string-data created
core.sh:796: Successful get secret/secret-string-data --namespace=test-secrets  {{.data}}: map[k1:djE= k2:djI=]
core.sh:797: Successful get secret/secret-string-data --namespace=test-secrets  {{.data}}: map[k1:djE= k2:djI=]
core.sh:798: Successful get secret/secret-string-data --namespace=test-secrets  {{.stringData}}: <no value>
secret "secret-string-data" deleted
core.sh:807: Successful get secrets --namespace=test-secrets {{range.items}}{{.metadata.name}}:{{end}}: 
(Bsecret "test-secret" deleted
namespace "test-secrets" deleted
E1109 01:44:34.791938   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:34.889038   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:34.979702   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:35.073741   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I1109 01:44:35.174710   54890 namespace_controller.go:185] Namespace has been deleted my-namespace
I1109 01:44:35.622720   54890 namespace_controller.go:185] Namespace has been deleted kube-node-lease
I1109 01:44:35.651947   54890 namespace_controller.go:185] Namespace has been deleted namespace-1573263715-16568
I1109 01:44:35.652971   54890 namespace_controller.go:185] Namespace has been deleted namespace-1573263737-26290
I1109 01:44:35.655401   54890 namespace_controller.go:185] Namespace has been deleted namespace-1573263718-5402
I1109 01:44:35.659979   54890 namespace_controller.go:185] Namespace has been deleted namespace-1573263733-20389
I1109 01:44:35.660022   54890 namespace_controller.go:185] Namespace has been deleted namespace-1573263723-24542
I1109 01:44:35.671208   54890 namespace_controller.go:185] Namespace has been deleted namespace-1573263732-16694
I1109 01:44:35.675504   54890 namespace_controller.go:185] Namespace has been deleted namespace-1573263737-28838
I1109 01:44:35.684872   54890 namespace_controller.go:185] Namespace has been deleted namespace-1573263729-5729
I1109 01:44:35.686279   54890 namespace_controller.go:185] Namespace has been deleted namespace-1573263736-6909
E1109 01:44:35.793312   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I1109 01:44:35.832215   54890 namespace_controller.go:185] Namespace has been deleted namespace-1573263747-12697
I1109 01:44:35.863248   54890 namespace_controller.go:185] Namespace has been deleted namespace-1573263759-15691
I1109 01:44:35.863414   54890 namespace_controller.go:185] Namespace has been deleted namespace-1573263758-20248
I1109 01:44:35.874809   54890 namespace_controller.go:185] Namespace has been deleted namespace-1573263747-23672
I1109 01:44:35.880257   54890 namespace_controller.go:185] Namespace has been deleted namespace-1573263766-7206
I1109 01:44:35.884848   54890 namespace_controller.go:185] Namespace has been deleted namespace-1573263761-22500
I1109 01:44:35.889524   54890 namespace_controller.go:185] Namespace has been deleted namespace-1573263762-18985
E1109 01:44:35.890332   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I1109 01:44:35.891679   54890 namespace_controller.go:185] Namespace has been deleted namespace-1573263763-23248
I1109 01:44:35.899684   54890 namespace_controller.go:185] Namespace has been deleted namespace-1573263765-11776
I1109 01:44:35.949212   54890 namespace_controller.go:185] Namespace has been deleted namespace-1573263768-7376
E1109 01:44:35.981152   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I1109 01:44:36.063554   54890 namespace_controller.go:185] Namespace has been deleted namespace-1573263771-25656
E1109 01:44:36.075054   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I1109 01:44:36.081122   54890 namespace_controller.go:185] Namespace has been deleted namespace-1573263789-20918
I1109 01:44:36.114171   54890 namespace_controller.go:185] Namespace has been deleted namespace-1573263790-26152
I1109 01:44:36.115798   54890 namespace_controller.go:185] Namespace has been deleted namespace-1573263805-16762
I1109 01:44:36.124871   54890 namespace_controller.go:185] Namespace has been deleted namespace-1573263790-10806
I1109 01:44:36.124887   54890 namespace_controller.go:185] Namespace has been deleted namespace-1573263799-24208
I1109 01:44:36.124907   54890 namespace_controller.go:185] Namespace has been deleted namespace-1573263810-20592
... skipping 3 lines ...
I1109 01:44:36.253196   54890 namespace_controller.go:185] Namespace has been deleted namespace-1573263811-12379
I1109 01:44:36.263723   54890 namespace_controller.go:185] Namespace has been deleted namespace-1573263813-28823
I1109 01:44:36.291304   54890 namespace_controller.go:185] Namespace has been deleted namespace-1573263814-29760
I1109 01:44:36.294865   54890 namespace_controller.go:185] Namespace has been deleted namespace-1573263849-26674
I1109 01:44:36.355169   54890 namespace_controller.go:185] Namespace has been deleted namespace-1573263849-3277
I1109 01:44:36.702563   54890 namespace_controller.go:185] Namespace has been deleted other
E1109 01:44:36.794635   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:36.891684   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:36.982497   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:37.076421   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:37.795880   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:37.893052   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:37.983785   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:38.077754   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:38.797353   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:38.894277   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:38.985097   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:39.079041   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:39.798063   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ exit code: 0
E1109 01:44:39.895581   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Recording: run_configmap_tests
Running command: run_configmap_tests

+++ Running case: test-cmd.run_configmap_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_configmap_tests
+++ [1109 01:44:39] Creating namespace namespace-1573263879-28153
E1109 01:44:39.986401   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/namespace-1573263879-28153 created
E1109 01:44:40.080489   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Context "test" modified.
+++ [1109 01:44:40] Testing configmaps
configmap/test-configmap created
core.sh:28: Successful get configmap/test-configmap {{.metadata.name}}: test-configmap
(Bconfigmap "test-configmap" deleted
core.sh:33: Successful get namespaces {{range.items}}{{ if eq .metadata.name \"test-configmaps\" }}found{{end}}{{end}}:: :
namespace/test-configmaps created
core.sh:37: Successful get namespaces/test-configmaps {{.metadata.name}}: test-configmaps
E1109 01:44:40.799101   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:41: Successful get configmaps {{range.items}}{{ if eq .metadata.name \"test-configmap\" }}found{{end}}{{end}}:: :
core.sh:42: Successful get configmaps {{range.items}}{{ if eq .metadata.name \"test-binary-configmap\" }}found{{end}}{{end}}:: :
E1109 01:44:40.896837   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
configmap/test-configmap created
E1109 01:44:40.987712   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
configmap/test-binary-configmap created
E1109 01:44:41.081786   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:48: Successful get configmap/test-configmap --namespace=test-configmaps {{.metadata.name}}: test-configmap
core.sh:49: Successful get configmap/test-binary-configmap --namespace=test-configmaps {{.metadata.name}}: test-binary-configmap
configmap "test-configmap" deleted
configmap "test-binary-configmap" deleted
namespace "test-configmaps" deleted
E1109 01:44:41.800487   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:41.898066   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:41.988952   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:42.083104   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:42.801897   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:42.899331   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:42.990322   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:43.084664   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:43.803076   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:43.900688   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:43.991727   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:44.086154   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:44.804683   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I1109 01:44:44.857989   54890 namespace_controller.go:185] Namespace has been deleted test-secrets
E1109 01:44:44.901998   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:44.993235   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:45.087833   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:45.805921   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:45.903463   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:45.994610   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:46.089502   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ exit code: 0
Recording: run_client_config_tests
Running command: run_client_config_tests
E1109 01:44:46.807139   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource

+++ Running case: test-cmd.run_client_config_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_client_config_tests
+++ [1109 01:44:46] Creating namespace namespace-1573263886-26454
E1109 01:44:46.904954   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/namespace-1573263886-26454 created
Context "test" modified.
+++ [1109 01:44:46] Testing client config
E1109 01:44:46.995944   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:error: stat missing: no such file or directory
has:missing: no such file or directory
E1109 01:44:47.090919   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:error: stat missing: no such file or directory
has:missing: no such file or directory
Successful
message:error: stat missing: no such file or directory
has:missing: no such file or directory
Successful
message:Error in configuration: context was not found for specified context: missing-context
has:context was not found for specified context: missing-context
Successful
message:error: no server found for cluster "missing-cluster"
has:no server found for cluster "missing-cluster"
Successful
message:error: auth info "missing-user" does not exist
has:auth info "missing-user" does not exist
Successful
message:error: error loading config file "/tmp/newconfig.yaml": no kind "Config" is registered for version "v-1" in scheme "k8s.io/client-go/tools/clientcmd/api/latest/latest.go:50"
has:error loading config file
Successful
message:error: stat missing-config: no such file or directory
has:no such file or directory
+++ exit code: 0
Recording: run_service_accounts_tests
Running command: run_service_accounts_tests

+++ Running case: test-cmd.run_service_accounts_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_service_accounts_tests
+++ [1109 01:44:47] Creating namespace namespace-1573263887-9129
namespace/namespace-1573263887-9129 created
E1109 01:44:47.808404   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Context "test" modified.
+++ [1109 01:44:47] Testing service accounts
E1109 01:44:47.906241   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:828: Successful get namespaces {{range.items}}{{ if eq .metadata.name \"test-service-accounts\" }}found{{end}}{{end}}:: :
namespace/test-service-accounts created
E1109 01:44:47.996993   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:832: Successful get namespaces/test-service-accounts {{.metadata.name}}: test-service-accounts
E1109 01:44:48.092137   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
serviceaccount/test-service-account created
core.sh:838: Successful get serviceaccount/test-service-account --namespace=test-service-accounts {{.metadata.name}}: test-service-account
serviceaccount "test-service-account" deleted
namespace "test-service-accounts" deleted
E1109 01:44:48.810002   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:48.907567   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:48.998554   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:49.093680   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:49.811251   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:49.909005   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:50.000157   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:50.095131   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:50.812811   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:50.910394   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:51.001570   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:51.096709   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I1109 01:44:51.747042   54890 namespace_controller.go:185] Namespace has been deleted test-configmaps
E1109 01:44:51.814044   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:51.911761   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:52.002993   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:52.098471   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:52.815606   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:52.913192   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:53.004357   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:53.099835   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ exit code: 0
Recording: run_job_tests
Running command: run_job_tests

+++ Running case: test-cmd.run_job_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_job_tests
+++ [1109 01:44:53] Creating namespace namespace-1573263893-16329
namespace/namespace-1573263893-16329 created
Context "test" modified.
+++ [1109 01:44:53] Testing job
batch.sh:30: Successful get namespaces {{range.items}}{{ if eq .metadata.name \"test-jobs\" }}found{{end}}{{end}}:: :
E1109 01:44:53.816881   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/test-jobs created
E1109 01:44:53.914468   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
batch.sh:34: Successful get namespaces/test-jobs {{.metadata.name}}: test-jobs
E1109 01:44:54.005541   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
kubectl run --generator=cronjob/v1beta1 is DEPRECATED and will be removed in a future version. Use kubectl run --generator=run-pod/v1 or kubectl create instead.
cronjob.batch/pi created
E1109 01:44:54.101233   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
batch.sh:39: Successful get cronjob/pi --namespace=test-jobs {{.metadata.name}}: pi
NAME   SCHEDULE       SUSPEND   ACTIVE   LAST SCHEDULE   AGE
pi     59 23 31 2 *   False     0        <none>          0s
Name:                          pi
Namespace:                     test-jobs
Labels:                        run=pi
Annotations:                   <none>
Schedule:                      59 23 31 2 *
Concurrency Policy:            Allow
Suspend:                       False
Successful Job History Limit:  3
Failed Job History Limit:      1
Starting Deadline Seconds:     <unset>
Selector:                      <unset>
Parallelism:                   <unset>
Completions:                   <unset>
Pod Template:
  Labels:  run=pi
... skipping 36 lines ...
                run=pi
Annotations:    cronjob.kubernetes.io/instantiate: manual
Controlled By:  CronJob/pi
Parallelism:    1
Completions:    1
Start Time:     Sat, 09 Nov 2019 01:44:54 +0000
Pods Statuses:  1 Running / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  controller-uid=9476d936-29ca-4ae4-9869-a41eb371fbc2
           job-name=test-job
           run=pi
  Containers:
   pi:
... skipping 12 lines ...
    Mounts:       <none>
  Volumes:        <none>
Events:
  Type    Reason            Age   From            Message
  ----    ------            ----  ----            -------
  Normal  SuccessfulCreate  0s    job-controller  Created pod: test-job-b74gn
E1109 01:44:54.818131   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
job.batch "test-job" deleted
E1109 01:44:54.915581   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
cronjob.batch "pi" deleted
namespace "test-jobs" deleted
E1109 01:44:55.006467   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:55.102466   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:55.819529   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:55.916815   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:56.007792   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:56.103660   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:56.820753   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:56.918037   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:57.009168   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:57.105011   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:57.821768   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:57.919605   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:58.010470   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:58.106368   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I1109 01:44:58.482759   54890 namespace_controller.go:185] Namespace has been deleted test-service-accounts
E1109 01:44:58.823064   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:58.921064   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:59.011930   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:59.107811   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:59.824391   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:44:59.922467   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:45:00.012790   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:45:00.109206   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ exit code: 0
Recording: run_create_job_tests
Running command: run_create_job_tests

+++ Running case: test-cmd.run_create_job_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
... skipping 6 lines ...
create.sh:86: Successful get job test-job {{(index .spec.template.spec.containers 0).image}}: k8s.gcr.io/nginx:test-cmd
(Bjob.batch "test-job" deleted
I1109 01:45:00.650080   54890 event.go:281] Event(v1.ObjectReference{Kind:"Job", Namespace:"namespace-1573263900-4665", Name:"test-job-pi", UID:"a0aebcae-4cbb-40f6-bd62-bc24980df09c", APIVersion:"batch/v1", ResourceVersion:"1547", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test-job-pi-n5wsn
job.batch/test-job-pi created
create.sh:92: Successful get job test-job-pi {{(index .spec.template.spec.containers 0).image}}: k8s.gcr.io/perl
(Bjob.batch "test-job-pi" deleted
E1109 01:45:00.825164   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
kubectl run --generator=cronjob/v1beta1 is DEPRECATED and will be removed in a future version. Use kubectl run --generator=run-pod/v1 or kubectl create instead.
cronjob.batch/test-pi created
E1109 01:45:00.924171   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I1109 01:45:00.981890   54890 event.go:281] Event(v1.ObjectReference{Kind:"Job", Namespace:"namespace-1573263900-4665", Name:"my-pi", UID:"c182635e-2923-46e3-8f11-aa0816bdc63f", APIVersion:"batch/v1", ResourceVersion:"1555", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: my-pi-92fng
job.batch/my-pi created
E1109 01:45:01.019583   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:[perl -Mbignum=bpi -wle print bpi(10)]
has:perl -Mbignum=bpi -wle print bpi(10)
E1109 01:45:01.110542   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
job.batch "my-pi" deleted
cronjob.batch "test-pi" deleted
+++ exit code: 0
Recording: run_pod_templates_tests
Running command: run_pod_templates_tests

... skipping 4 lines ...
namespace/namespace-1573263901-18292 created
Context "test" modified.
+++ [1109 01:45:01] Testing pod templates
core.sh:1415: Successful get podtemplates {{range.items}}{{.metadata.name}}:{{end}}: 
I1109 01:45:01.772641   51457 controller.go:606] quota admission added evaluator for: podtemplates
podtemplate/nginx created
E1109 01:45:01.826523   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1419: Successful get podtemplates {{range.items}}{{.metadata.name}}:{{end}}: nginx:
E1109 01:45:01.925378   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
NAME    CONTAINERS   IMAGES   POD LABELS
nginx   nginx        nginx    name=nginx
E1109 01:45:02.020809   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:45:02.111835   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1427: Successful get podtemplates {{range.items}}{{.metadata.name}}:{{end}}: nginx:
podtemplate "nginx" deleted
core.sh:1431: Successful get podtemplate {{range.items}}{{.metadata.name}}:{{end}}: 
+++ exit code: 0
Recording: run_service_tests
Running command: run_service_tests
... skipping 3 lines ...
+++ command: run_service_tests
Context "test" modified.
+++ [1109 01:45:02] Testing kubectl(v1:services)
core.sh:858: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
service/redis-master created
core.sh:862: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:
E1109 01:45:02.827745   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
matched Name:
matched Labels:
matched Selector:
matched IP:
E1109 01:45:02.926631   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
matched Port:
matched Endpoints:
matched Session Affinity:
core.sh:864: Successful describe services redis-master:
Name:              redis-master
Namespace:         default
... skipping 6 lines ...
IP:                10.0.0.196
Port:              <unset>  6379/TCP
TargetPort:        6379/TCP
Endpoints:         <none>
Session Affinity:  None
Events:            <none>
E1109 01:45:03.021819   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:866: Successful describe
Name:              redis-master
Namespace:         default
Labels:            app=redis
                   role=master
                   tier=backend
... skipping 4 lines ...
Port:              <unset>  6379/TCP
TargetPort:        6379/TCP
Endpoints:         <none>
Session Affinity:  None
Events:            <none>
E1109 01:45:03.113378   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:868: Successful describe
Name:              redis-master
Namespace:         default
Labels:            app=redis
                   role=master
                   tier=backend
... skipping 162 lines ...
  - port: 6379
    targetPort: 6379
  selector:
    role: padawan
status:
  loadBalancer: {}
E1109 01:45:03.828927   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apiVersion: v1
kind: Service
metadata:
  creationTimestamp: "2019-11-09T01:45:02Z"
  labels:
    app: redis
... skipping 14 lines ...
    role: padawan
  sessionAffinity: None
  type: ClusterIP
status:
  loadBalancer: {}
service/redis-master selector updated
E1109 01:45:03.927603   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:890: Successful get services redis-master {{range.spec.selector}}{{.}}:{{end}}: padawan:
E1109 01:45:04.023009   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/redis-master selector updated
E1109 01:45:04.114697   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:894: Successful get services redis-master {{range.spec.selector}}{{.}}:{{end}}: redis:master:backend:
apiVersion: v1
kind: Service
metadata:
  creationTimestamp: "2019-11-09T01:45:02Z"
  labels:
... skipping 14 lines ...
  selector:
    role: padawan
  sessionAffinity: None
  type: ClusterIP
status:
  loadBalancer: {}
error: you must specify resources by --filename when --local is set.
Example resource specifications include:
   '-f rsrc.yaml'
   '--filename=rsrc.json'
core.sh:898: Successful get services redis-master {{range.spec.selector}}{{.}}:{{end}}: redis:master:backend:
core.sh:905: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:
(Bservice "redis-master" deleted
core.sh:912: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
E1109 01:45:04.830236   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:916: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
E1109 01:45:04.928811   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/redis-master created
E1109 01:45:05.024272   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I1109 01:45:05.093438   54890 namespace_controller.go:185] Namespace has been deleted test-jobs
core.sh:920: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:
E1109 01:45:05.115962   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:924: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:
service/service-v1-test created
core.sh:945: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:service-v1-test:
service/service-v1-test replaced
core.sh:952: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:service-v1-test:
(Bservice "redis-master" deleted
E1109 01:45:05.831467   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service "service-v1-test" deleted
E1109 01:45:05.929923   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:960: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
E1109 01:45:06.025447   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:964: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
E1109 01:45:06.116974   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/redis-master created
service/redis-slave created
core.sh:969: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:redis-slave:
Successful
message:NAME           RSRC
kubernetes     146
redis-master   1595
redis-slave    1599
has:redis-master
core.sh:979: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:redis-slave:
(Bservice "redis-master" deleted
service "redis-slave" deleted
core.sh:986: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
E1109 01:45:06.832726   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:990: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
E1109 01:45:06.931066   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/beep-boop created
E1109 01:45:07.027177   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:994: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: beep-boop:kubernetes:
E1109 01:45:07.118278   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:998: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: beep-boop:kubernetes:
(Bservice "beep-boop" deleted
core.sh:1005: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
core.sh:1009: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
kubectl run --generator=deployment/apps.v1 is DEPRECATED and will be removed in a future version. Use kubectl run --generator=run-pod/v1 or kubectl create instead.
I1109 01:45:07.487307   54890 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"default", Name:"testmetadata", UID:"3b3f44bb-623b-427d-be5e-a08d8f87b4f9", APIVersion:"apps/v1", ResourceVersion:"1618", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set testmetadata-bd968f46 to 2
I1109 01:45:07.495110   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"default", Name:"testmetadata-bd968f46", UID:"ca16b63a-9dbb-48ae-b529-e369c9985502", APIVersion:"apps/v1", ResourceVersion:"1619", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: testmetadata-bd968f46-xbckk
I1109 01:45:07.498337   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"default", Name:"testmetadata-bd968f46", UID:"ca16b63a-9dbb-48ae-b529-e369c9985502", APIVersion:"apps/v1", ResourceVersion:"1619", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: testmetadata-bd968f46-kpqf5
service/testmetadata created
deployment.apps/testmetadata created
core.sh:1013: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: testmetadata:
(Bcore.sh:1014: Successful get service testmetadata {{.metadata.annotations}}: map[zone-context:home]
(Bservice/exposemetadata exposed
E1109 01:45:07.834066   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1020: Successful get service exposemetadata {{.metadata.annotations}}: map[zone-context:work]
(BE1109 01:45:07.932295   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service "exposemetadata" deleted
service "testmetadata" deleted
E1109 01:45:08.028237   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps "testmetadata" deleted
+++ exit code: 0
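For context on the assertion lines above: each "core.sh:NNN: Successful get ..." entry renders a go-template against the live object and compares the result to the expected value printed after the last colon. A rough by-hand equivalent, assuming the same local API server this run points at (http://127.0.0.1:8080) and the resource names used above:

  kubectl --server=http://127.0.0.1:8080 get services \
    -o go-template='{{range.items}}{{.metadata.name}}:{{end}}'
  # prints "kubernetes:" while only the default service exists

  kubectl --server=http://127.0.0.1:8080 get service testmetadata \
    -o go-template='{{.metadata.annotations}}'
  # printed map[zone-context:home] while the annotated testmetadata service created above still existed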
Recording: run_daemonset_tests
Running command: run_daemonset_tests

+++ Running case: test-cmd.run_daemonset_tests 
E1109 01:45:08.119459   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_daemonset_tests
+++ [1109 01:45:08] Creating namespace namespace-1573263908-29927
namespace/namespace-1573263908-29927 created
Context "test" modified.
+++ [1109 01:45:08] Testing kubectl(v1:daemonsets)
apps.sh:30: Successful get daemonsets {{range.items}}{{.metadata.name}}:{{end}}: 
(BI1109 01:45:08.520585   51457 controller.go:606] quota admission added evaluator for: daemonsets.apps
daemonset.apps/bind created
I1109 01:45:08.531619   51457 controller.go:606] quota admission added evaluator for: controllerrevisions.apps
apps.sh:34: Successful get daemonsets bind {{.metadata.generation}}: 1
(Bdaemonset.apps/bind configured
E1109 01:45:08.835341   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:37: Successful get daemonsets bind {{.metadata.generation}}: 1
(BE1109 01:45:08.933452   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
daemonset.apps/bind image updated
E1109 01:45:09.029530   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:40: Successful get daemonsets bind {{.metadata.generation}}: 2
(BE1109 01:45:09.120608   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
daemonset.apps/bind env updated
apps.sh:42: Successful get daemonsets bind {{.metadata.generation}}: 3
(Bdaemonset.apps/bind resource requirements updated
apps.sh:44: Successful get daemonsets bind {{.metadata.generation}}: 4
(Bdaemonset.apps/bind restarted
apps.sh:48: Successful get daemonsets bind {{.metadata.generation}}: 5
... skipping 4 lines ...
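The generation values checked above (1 through 5) come from successive mutations of the bind DaemonSet followed by a restart. A minimal sketch of that sequence; the container name and env var below are illustrative assumptions, not necessarily the test's exact data:

  kubectl set image daemonset/bind kubernetes-pause=k8s.gcr.io/pause:latest   # container name assumed
  kubectl set env daemonset/bind APP_ENV=test                                 # env var name assumed
  kubectl set resources daemonset/bind --limits=cpu=200m,memory=512Mi
  kubectl rollout restart daemonset/bind
  kubectl get daemonset bind -o go-template='{{.metadata.generation}}'        # each spec change above increments this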

+++ Running case: test-cmd.run_daemonset_history_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_daemonset_history_tests
+++ [1109 01:45:09] Creating namespace namespace-1573263909-20642
namespace/namespace-1573263909-20642 created
E1109 01:45:09.836773   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Context "test" modified.
+++ [1109 01:45:09] Testing kubectl(v1:daemonsets, v1:controllerrevisions)
E1109 01:45:09.934844   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:66: Successful get daemonsets {{range.items}}{{.metadata.name}}:{{end}}: 
(BE1109 01:45:10.030847   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
daemonset.apps/bind created
E1109 01:45:10.121795   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:70: Successful get controllerrevisions {{range.items}}{{.metadata.annotations}}:{{end}}: map[deprecated.daemonset.template.generation:1 kubectl.kubernetes.io/last-applied-configuration:{"apiVersion":"apps/v1","kind":"DaemonSet","metadata":{"annotations":{"kubernetes.io/change-cause":"kubectl apply --filename=hack/testdata/rollingupdate-daemonset.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true"},"labels":{"service":"bind"},"name":"bind","namespace":"namespace-1573263909-20642"},"spec":{"selector":{"matchLabels":{"service":"bind"}},"template":{"metadata":{"labels":{"service":"bind"}},"spec":{"affinity":{"podAntiAffinity":{"requiredDuringSchedulingIgnoredDuringExecution":[{"labelSelector":{"matchExpressions":[{"key":"service","operator":"In","values":["bind"]}]},"namespaces":[],"topologyKey":"kubernetes.io/hostname"}]}},"containers":[{"image":"k8s.gcr.io/pause:2.0","name":"kubernetes-pause"}]}},"updateStrategy":{"rollingUpdate":{"maxUnavailable":"10%"},"type":"RollingUpdate"}}}
 kubernetes.io/change-cause:kubectl apply --filename=hack/testdata/rollingupdate-daemonset.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true]:
(Bdaemonset.apps/bind skipped rollback (current template already matches revision 1)
apps.sh:73: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:2.0:
(Bapps.sh:74: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
(Bdaemonset.apps/bind configured
apps.sh:77: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:latest:
(BE1109 01:45:10.837886   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:78: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
(BE1109 01:45:10.936170   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:79: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 2
(BE1109 01:45:11.032009   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:80: Successful get controllerrevisions {{range.items}}{{.metadata.annotations}}:{{end}}: map[deprecated.daemonset.template.generation:1 kubectl.kubernetes.io/last-applied-configuration:{"apiVersion":"apps/v1","kind":"DaemonSet","metadata":{"annotations":{"kubernetes.io/change-cause":"kubectl apply --filename=hack/testdata/rollingupdate-daemonset.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true"},"labels":{"service":"bind"},"name":"bind","namespace":"namespace-1573263909-20642"},"spec":{"selector":{"matchLabels":{"service":"bind"}},"template":{"metadata":{"labels":{"service":"bind"}},"spec":{"affinity":{"podAntiAffinity":{"requiredDuringSchedulingIgnoredDuringExecution":[{"labelSelector":{"matchExpressions":[{"key":"service","operator":"In","values":["bind"]}]},"namespaces":[],"topologyKey":"kubernetes.io/hostname"}]}},"containers":[{"image":"k8s.gcr.io/pause:2.0","name":"kubernetes-pause"}]}},"updateStrategy":{"rollingUpdate":{"maxUnavailable":"10%"},"type":"RollingUpdate"}}}
 kubernetes.io/change-cause:kubectl apply --filename=hack/testdata/rollingupdate-daemonset.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true]:map[deprecated.daemonset.template.generation:2 kubectl.kubernetes.io/last-applied-configuration:{"apiVersion":"apps/v1","kind":"DaemonSet","metadata":{"annotations":{"kubernetes.io/change-cause":"kubectl apply --filename=hack/testdata/rollingupdate-daemonset-rv2.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true"},"labels":{"service":"bind"},"name":"bind","namespace":"namespace-1573263909-20642"},"spec":{"selector":{"matchLabels":{"service":"bind"}},"template":{"metadata":{"labels":{"service":"bind"}},"spec":{"affinity":{"podAntiAffinity":{"requiredDuringSchedulingIgnoredDuringExecution":[{"labelSelector":{"matchExpressions":[{"key":"service","operator":"In","values":["bind"]}]},"namespaces":[],"topologyKey":"kubernetes.io/hostname"}]}},"containers":[{"image":"k8s.gcr.io/pause:latest","name":"kubernetes-pause"},{"image":"k8s.gcr.io/nginx:test-cmd","name":"app"}]}},"updateStrategy":{"rollingUpdate":{"maxUnavailable":"10%"},"type":"RollingUpdate"}}}
 kubernetes.io/change-cause:kubectl apply --filename=hack/testdata/rollingupdate-daemonset-rv2.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true]:
(BE1109 01:45:11.122891   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
daemonset.apps/bind will roll back to Pod Template:
  Labels:	service=bind
  Containers:
   kubernetes-pause:
    Image:	k8s.gcr.io/pause:2.0
    Port:	<none>
... skipping 6 lines ...
(Bapps.sh:84: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
(Bapps.sh:85: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 2
(Bdaemonset.apps/bind rolled back
apps.sh:88: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:2.0:
(Bapps.sh:89: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
(BSuccessful
message:error: unable to find specified revision 1000000 in history
has:unable to find specified revision
E1109 01:45:11.839308   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:93: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:2.0:
(BE1109 01:45:11.937399   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:94: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
(BE1109 01:45:12.033331   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
daemonset.apps/bind rolled back
E1109 01:45:12.113658   54890 daemon_controller.go:290] namespace-1573263909-20642/bind failed with : error storing status for daemon set &v1.DaemonSet{TypeMeta:v1.TypeMeta{Kind:"", APIVersion:""}, ObjectMeta:v1.ObjectMeta{Name:"bind", GenerateName:"", Namespace:"namespace-1573263909-20642", SelfLink:"/apis/apps/v1/namespaces/namespace-1573263909-20642/daemonsets/bind", UID:"8e2da9d6-ecc0-41b9-a742-c8a30bedd730", ResourceVersion:"1693", Generation:4, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63708860710, loc:(*time.Location)(0x6ba3200)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"service":"bind"}, Annotations:map[string]string{"deprecated.daemonset.template.generation":"4", "kubectl.kubernetes.io/last-applied-configuration":"{\"apiVersion\":\"apps/v1\",\"kind\":\"DaemonSet\",\"metadata\":{\"annotations\":{\"kubernetes.io/change-cause\":\"kubectl apply --filename=hack/testdata/rollingupdate-daemonset-rv2.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true\"},\"labels\":{\"service\":\"bind\"},\"name\":\"bind\",\"namespace\":\"namespace-1573263909-20642\"},\"spec\":{\"selector\":{\"matchLabels\":{\"service\":\"bind\"}},\"template\":{\"metadata\":{\"labels\":{\"service\":\"bind\"}},\"spec\":{\"affinity\":{\"podAntiAffinity\":{\"requiredDuringSchedulingIgnoredDuringExecution\":[{\"labelSelector\":{\"matchExpressions\":[{\"key\":\"service\",\"operator\":\"In\",\"values\":[\"bind\"]}]},\"namespaces\":[],\"topologyKey\":\"kubernetes.io/hostname\"}]}},\"containers\":[{\"image\":\"k8s.gcr.io/pause:latest\",\"name\":\"kubernetes-pause\"},{\"image\":\"k8s.gcr.io/nginx:test-cmd\",\"name\":\"app\"}]}},\"updateStrategy\":{\"rollingUpdate\":{\"maxUnavailable\":\"10%\"},\"type\":\"RollingUpdate\"}}}\n", "kubernetes.io/change-cause":"kubectl apply --filename=hack/testdata/rollingupdate-daemonset-rv2.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true"}, OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v1.DaemonSetSpec{Selector:(*v1.LabelSelector)(0xc000ae90e0), Template:v1.PodTemplateSpec{ObjectMeta:v1.ObjectMeta{Name:"", GenerateName:"", Namespace:"", SelfLink:"", UID:"", ResourceVersion:"", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:0, loc:(*time.Location)(nil)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"service":"bind"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v1.PodSpec{Volumes:[]v1.Volume(nil), InitContainers:[]v1.Container(nil), Containers:[]v1.Container{v1.Container{Name:"kubernetes-pause", Image:"k8s.gcr.io/pause:latest", Command:[]string(nil), Args:[]string(nil), WorkingDir:"", Ports:[]v1.ContainerPort(nil), EnvFrom:[]v1.EnvFromSource(nil), Env:[]v1.EnvVar(nil), Resources:v1.ResourceRequirements{Limits:v1.ResourceList(nil), Requests:v1.ResourceList(nil)}, VolumeMounts:[]v1.VolumeMount(nil), VolumeDevices:[]v1.VolumeDevice(nil), LivenessProbe:(*v1.Probe)(nil), ReadinessProbe:(*v1.Probe)(nil), StartupProbe:(*v1.Probe)(nil), Lifecycle:(*v1.Lifecycle)(nil), TerminationMessagePath:"/dev/termination-log", TerminationMessagePolicy:"File", ImagePullPolicy:"IfNotPresent", SecurityContext:(*v1.SecurityContext)(nil), Stdin:false, StdinOnce:false, TTY:false}, v1.Container{Name:"app", 
Image:"k8s.gcr.io/nginx:test-cmd", Command:[]string(nil), Args:[]string(nil), WorkingDir:"", Ports:[]v1.ContainerPort(nil), EnvFrom:[]v1.EnvFromSource(nil), Env:[]v1.EnvVar(nil), Resources:v1.ResourceRequirements{Limits:v1.ResourceList(nil), Requests:v1.ResourceList(nil)}, VolumeMounts:[]v1.VolumeMount(nil), VolumeDevices:[]v1.VolumeDevice(nil), LivenessProbe:(*v1.Probe)(nil), ReadinessProbe:(*v1.Probe)(nil), StartupProbe:(*v1.Probe)(nil), Lifecycle:(*v1.Lifecycle)(nil), TerminationMessagePath:"/dev/termination-log", TerminationMessagePolicy:"File", ImagePullPolicy:"IfNotPresent", SecurityContext:(*v1.SecurityContext)(nil), Stdin:false, StdinOnce:false, TTY:false}}, EphemeralContainers:[]v1.EphemeralContainer(nil), RestartPolicy:"Always", TerminationGracePeriodSeconds:(*int64)(0xc001db7a68), ActiveDeadlineSeconds:(*int64)(nil), DNSPolicy:"ClusterFirst", NodeSelector:map[string]string(nil), ServiceAccountName:"", DeprecatedServiceAccount:"", AutomountServiceAccountToken:(*bool)(nil), NodeName:"", HostNetwork:false, HostPID:false, HostIPC:false, ShareProcessNamespace:(*bool)(nil), SecurityContext:(*v1.PodSecurityContext)(0xc0017f39e0), ImagePullSecrets:[]v1.LocalObjectReference(nil), Hostname:"", Subdomain:"", Affinity:(*v1.Affinity)(0xc000ae9100), SchedulerName:"default-scheduler", Tolerations:[]v1.Toleration(nil), HostAliases:[]v1.HostAlias(nil), PriorityClassName:"", Priority:(*int32)(nil), DNSConfig:(*v1.PodDNSConfig)(nil), ReadinessGates:[]v1.PodReadinessGate(nil), RuntimeClassName:(*string)(nil), EnableServiceLinks:(*bool)(nil), PreemptionPolicy:(*v1.PreemptionPolicy)(nil), Overhead:v1.ResourceList(nil), TopologySpreadConstraints:[]v1.TopologySpreadConstraint(nil)}}, UpdateStrategy:v1.DaemonSetUpdateStrategy{Type:"RollingUpdate", RollingUpdate:(*v1.RollingUpdateDaemonSet)(0xc0007aa398)}, MinReadySeconds:0, RevisionHistoryLimit:(*int32)(0xc001db7abc)}, Status:v1.DaemonSetStatus{CurrentNumberScheduled:0, NumberMisscheduled:0, DesiredNumberScheduled:0, NumberReady:0, ObservedGeneration:3, UpdatedNumberScheduled:0, NumberAvailable:0, NumberUnavailable:0, CollisionCount:(*int32)(nil), Conditions:[]v1.DaemonSetCondition(nil)}}: Operation cannot be fulfilled on daemonsets.apps "bind": the object has been modified; please apply your changes to the latest version and try again
E1109 01:45:12.123943   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:97: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:latest:
(Bapps.sh:98: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
(Bapps.sh:99: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 2
(Bdaemonset.apps "bind" deleted
+++ exit code: 0
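A sketch of the rollback flow the history test walks through, using the file paths recorded in the change-cause annotations shown above:

  kubectl apply -f hack/testdata/rollingupdate-daemonset.yaml --record=true
  kubectl apply -f hack/testdata/rollingupdate-daemonset-rv2.yaml --record=true
  kubectl rollout history daemonset/bind                      # revisions 1 and 2, with the change causes seen above
  kubectl rollout undo daemonset/bind --to-revision=1         # back to the single pause:2.0 container
  kubectl rollout undo daemonset/bind --to-revision=1000000   # fails: unable to find specified revision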
Recording: run_rc_tests
... skipping 4 lines ...
+++ command: run_rc_tests
+++ [1109 01:45:12] Creating namespace namespace-1573263912-9222
namespace/namespace-1573263912-9222 created
Context "test" modified.
+++ [1109 01:45:12] Testing kubectl(v1:replicationcontrollers)
core.sh:1046: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
(BE1109 01:45:12.840700   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/frontend created
I1109 01:45:12.920624   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1573263912-9222", Name:"frontend", UID:"62532f83-ff7c-4981-8ce6-29cd206f0c93", APIVersion:"v1", ResourceVersion:"1703", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-wqt8m
I1109 01:45:12.923237   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1573263912-9222", Name:"frontend", UID:"62532f83-ff7c-4981-8ce6-29cd206f0c93", APIVersion:"v1", ResourceVersion:"1703", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-qs9sc
I1109 01:45:12.926001   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1573263912-9222", Name:"frontend", UID:"62532f83-ff7c-4981-8ce6-29cd206f0c93", APIVersion:"v1", ResourceVersion:"1703", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-vq7rc
E1109 01:45:12.938250   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller "frontend" deleted
E1109 01:45:13.034597   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1051: Successful get pods -l "name=frontend" {{range.items}}{{.metadata.name}}:{{end}}: 
(BE1109 01:45:13.125057   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1055: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
(Breplicationcontroller/frontend created
I1109 01:45:13.337995   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1573263912-9222", Name:"frontend", UID:"10522ebb-5181-4319-bcff-adbd6f67d44e", APIVersion:"v1", ResourceVersion:"1719", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-tvndv
I1109 01:45:13.340738   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1573263912-9222", Name:"frontend", UID:"10522ebb-5181-4319-bcff-adbd6f67d44e", APIVersion:"v1", ResourceVersion:"1719", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-fr726
I1109 01:45:13.343532   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1573263912-9222", Name:"frontend", UID:"10522ebb-5181-4319-bcff-adbd6f67d44e", APIVersion:"v1", ResourceVersion:"1719", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-qj6tz
core.sh:1059: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: frontend:
... skipping 10 lines ...
Namespace:    namespace-1573263912-9222
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 17 lines ...
Namespace:    namespace-1573263912-9222
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 18 lines ...
Namespace:    namespace-1573263912-9222
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 4 lines ...
      memory:  100Mi
    Environment:
      GET_HOSTS_FROM:  dns
    Mounts:            <none>
  Volumes:             <none>

E1109 01:45:13.841890   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1067: Successful describe
Name:         frontend
Namespace:    namespace-1573263912-9222
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 10 lines ...
  Type    Reason            Age   From                    Message
  ----    ------            ----  ----                    -------
  Normal  SuccessfulCreate  0s    replication-controller  Created pod: frontend-tvndv
  Normal  SuccessfulCreate  0s    replication-controller  Created pod: frontend-fr726
  Normal  SuccessfulCreate  0s    replication-controller  Created pod: frontend-qj6tz
E1109 01:45:13.939276   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
matched Name:
matched Name:
matched Pod Template:
matched Labels:
matched Selector:
matched Replicas:
... skipping 5 lines ...
Namespace:    namespace-1573263912-9222
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 9 lines ...
Events:
  Type    Reason            Age   From                    Message
  ----    ------            ----  ----                    -------
  Normal  SuccessfulCreate  0s    replication-controller  Created pod: frontend-tvndv
  Normal  SuccessfulCreate  0s    replication-controller  Created pod: frontend-fr726
  Normal  SuccessfulCreate  0s    replication-controller  Created pod: frontend-qj6tz
(BE1109 01:45:14.035799   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful describe
Name:         frontend
Namespace:    namespace-1573263912-9222
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 9 lines ...
Events:
  Type    Reason            Age   From                    Message
  ----    ------            ----  ----                    -------
  Normal  SuccessfulCreate  1s    replication-controller  Created pod: frontend-tvndv
  Normal  SuccessfulCreate  1s    replication-controller  Created pod: frontend-fr726
  Normal  SuccessfulCreate  1s    replication-controller  Created pod: frontend-qj6tz
(BE1109 01:45:14.126248   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful describe
Name:         frontend
Namespace:    namespace-1573263912-9222
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 11 lines ...
Namespace:    namespace-1573263912-9222
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 15 lines ...
(Bcore.sh:1079: Successful get rc frontend {{.spec.replicas}}: 3
(Breplicationcontroller/frontend scaled
E1109 01:45:14.439607   54890 replica_set.go:202] ReplicaSet has no controller: &ReplicaSet{ObjectMeta:{frontend  namespace-1573263912-9222 /api/v1/namespaces/namespace-1573263912-9222/replicationcontrollers/frontend 10522ebb-5181-4319-bcff-adbd6f67d44e 1728 2 2019-11-09 01:45:13 +0000 UTC <nil> <nil> map[app:guestbook tier:frontend] map[] [] []  []},Spec:ReplicaSetSpec{Replicas:*2,Selector:&v1.LabelSelector{MatchLabels:map[string]string{app: guestbook,tier: frontend,},MatchExpressions:[]LabelSelectorRequirement{},},Template:{{      0 0001-01-01 00:00:00 +0000 UTC <nil> <nil> map[app:guestbook tier:frontend] map[] [] []  []} {[] [] [{php-redis gcr.io/google_samples/gb-frontend:v4 [] []  [{ 0 80 TCP }] [] [{GET_HOSTS_FROM dns nil}] {map[] map[cpu:{{100 -3} {<nil>} 100m DecimalSI} memory:{{104857600 0} {<nil>} 100Mi BinarySI}]} [] [] nil nil nil nil /dev/termination-log File IfNotPresent nil false false false}] [] Always 0xc0014ebdc8 <nil> ClusterFirst map[]   <nil>  false false false <nil> PodSecurityContext{SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,SupplementalGroups:[],FSGroup:nil,RunAsGroup:nil,Sysctls:[]Sysctl{},WindowsOptions:nil,} []   nil default-scheduler [] []  <nil> nil [] <nil> <nil> <nil> map[] []}},MinReadySeconds:0,},Status:ReplicaSetStatus{Replicas:3,FullyLabeledReplicas:3,ObservedGeneration:1,ReadyReplicas:0,AvailableReplicas:0,Conditions:[]ReplicaSetCondition{},},}
I1109 01:45:14.444720   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1573263912-9222", Name:"frontend", UID:"10522ebb-5181-4319-bcff-adbd6f67d44e", APIVersion:"v1", ResourceVersion:"1728", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: frontend-qj6tz
core.sh:1083: Successful get rc frontend {{.spec.replicas}}: 2
(Bcore.sh:1087: Successful get rc frontend {{.spec.replicas}}: 2
(Berror: Expected replicas to be 3, was 2
core.sh:1091: Successful get rc frontend {{.spec.replicas}}: 2
(BE1109 01:45:14.843129   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1095: Successful get rc frontend {{.spec.replicas}}: 2
(BE1109 01:45:14.940564   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/frontend scaled
I1109 01:45:14.949543   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1573263912-9222", Name:"frontend", UID:"10522ebb-5181-4319-bcff-adbd6f67d44e", APIVersion:"v1", ResourceVersion:"1736", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-hpx42
E1109 01:45:15.036973   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1099: Successful get rc frontend {{.spec.replicas}}: 3
(BE1109 01:45:15.127502   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1103: Successful get rc frontend {{.spec.replicas}}: 3
(BE1109 01:45:15.204610   54890 replica_set.go:202] ReplicaSet has no controller: &ReplicaSet{ObjectMeta:{frontend  namespace-1573263912-9222 /api/v1/namespaces/namespace-1573263912-9222/replicationcontrollers/frontend 10522ebb-5181-4319-bcff-adbd6f67d44e 1741 4 2019-11-09 01:45:13 +0000 UTC <nil> <nil> map[app:guestbook tier:frontend] map[] [] []  []},Spec:ReplicaSetSpec{Replicas:*2,Selector:&v1.LabelSelector{MatchLabels:map[string]string{app: guestbook,tier: frontend,},MatchExpressions:[]LabelSelectorRequirement{},},Template:{{      0 0001-01-01 00:00:00 +0000 UTC <nil> <nil> map[app:guestbook tier:frontend] map[] [] []  []} {[] [] [{php-redis gcr.io/google_samples/gb-frontend:v4 [] []  [{ 0 80 TCP }] [] [{GET_HOSTS_FROM dns nil}] {map[] map[cpu:{{100 -3} {<nil>} 100m DecimalSI} memory:{{104857600 0} {<nil>} 100Mi BinarySI}]} [] [] nil nil nil nil /dev/termination-log File IfNotPresent nil false false false}] [] Always 0xc000aa7c58 <nil> ClusterFirst map[]   <nil>  false false false <nil> PodSecurityContext{SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,SupplementalGroups:[],FSGroup:nil,RunAsGroup:nil,Sysctls:[]Sysctl{},WindowsOptions:nil,} []   nil default-scheduler [] []  <nil> nil [] <nil> <nil> <nil> map[] []}},MinReadySeconds:0,},Status:ReplicaSetStatus{Replicas:3,FullyLabeledReplicas:3,ObservedGeneration:3,ReadyReplicas:0,AvailableReplicas:0,Conditions:[]ReplicaSetCondition{},},}
replicationcontroller/frontend scaled
I1109 01:45:15.215579   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1573263912-9222", Name:"frontend", UID:"10522ebb-5181-4319-bcff-adbd6f67d44e", APIVersion:"v1", ResourceVersion:"1741", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: frontend-hpx42
core.sh:1107: Successful get rc frontend {{.spec.replicas}}: 2
(Breplicationcontroller "frontend" deleted
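The "Expected replicas to be 3, was 2" error above is kubectl scale's precondition check rather than a test failure. Roughly:

  kubectl scale rc frontend --replicas=2                        # scale down; spec.replicas becomes 2
  kubectl scale rc frontend --current-replicas=3 --replicas=3   # rejected while spec.replicas is actually 2
  kubectl scale rc frontend --replicas=3                        # succeeds without the precondition
  kubectl get rc frontend -o go-template='{{.spec.replicas}}'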
... skipping 6 lines ...
I1109 01:45:15.789527   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1573263912-9222", Name:"redis-master", UID:"6f5f613e-d989-492c-bb0d-b3a0a4831c56", APIVersion:"v1", ResourceVersion:"1764", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-master-zn7d6
replicationcontroller/redis-slave scaled
I1109 01:45:15.794146   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1573263912-9222", Name:"redis-master", UID:"6f5f613e-d989-492c-bb0d-b3a0a4831c56", APIVersion:"v1", ResourceVersion:"1764", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-master-pk2jx
I1109 01:45:15.794187   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1573263912-9222", Name:"redis-master", UID:"6f5f613e-d989-492c-bb0d-b3a0a4831c56", APIVersion:"v1", ResourceVersion:"1764", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-master-hk4rm
I1109 01:45:15.795984   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1573263912-9222", Name:"redis-slave", UID:"efa35c58-f82e-4e3f-a87b-f27c60c712f3", APIVersion:"v1", ResourceVersion:"1766", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-slave-bwx98
I1109 01:45:15.803941   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1573263912-9222", Name:"redis-slave", UID:"efa35c58-f82e-4e3f-a87b-f27c60c712f3", APIVersion:"v1", ResourceVersion:"1766", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-slave-j588x
E1109 01:45:15.844382   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1117: Successful get rc redis-master {{.spec.replicas}}: 4
(BE1109 01:45:15.941861   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1118: Successful get rc redis-slave {{.spec.replicas}}: 4
(BE1109 01:45:16.038057   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller "redis-master" deleted
replicationcontroller "redis-slave" deleted
E1109 01:45:16.128573   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-deployment created
I1109 01:45:16.228746   54890 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1573263912-9222", Name:"nginx-deployment", UID:"24568e3e-5e5c-4dd9-8c37-6968dde99d1a", APIVersion:"apps/v1", ResourceVersion:"1798", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-6986c7bc94 to 3
I1109 01:45:16.231630   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263912-9222", Name:"nginx-deployment-6986c7bc94", UID:"2717466a-fd1b-4eb6-8ec5-027a550b14fe", APIVersion:"apps/v1", ResourceVersion:"1799", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-x4tnt
I1109 01:45:16.234108   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263912-9222", Name:"nginx-deployment-6986c7bc94", UID:"2717466a-fd1b-4eb6-8ec5-027a550b14fe", APIVersion:"apps/v1", ResourceVersion:"1799", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-bwmc8
I1109 01:45:16.237565   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263912-9222", Name:"nginx-deployment-6986c7bc94", UID:"2717466a-fd1b-4eb6-8ec5-027a550b14fe", APIVersion:"apps/v1", ResourceVersion:"1799", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-8gdb8
deployment.apps/nginx-deployment scaled
... skipping 4 lines ...
(Bdeployment.apps "nginx-deployment" deleted
Successful
message:service/expose-test-deployment exposed
has:service/expose-test-deployment exposed
service "expose-test-deployment" deleted
Successful
message:error: couldn't retrieve selectors via --selector flag or introspection: invalid deployment: no selectors, therefore cannot be exposed
See 'kubectl expose -h' for help and examples
has:invalid deployment: no selectors
E1109 01:45:16.845693   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-deployment created
I1109 01:45:16.896046   54890 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1573263912-9222", Name:"nginx-deployment", UID:"7f8f54e4-3827-489b-ba40-5a2ca1b16cc8", APIVersion:"apps/v1", ResourceVersion:"1840", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-6986c7bc94 to 3
I1109 01:45:16.900909   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263912-9222", Name:"nginx-deployment-6986c7bc94", UID:"565e492a-47cb-4027-9a72-4d2e824f1f23", APIVersion:"apps/v1", ResourceVersion:"1841", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-zqxm2
I1109 01:45:16.903819   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263912-9222", Name:"nginx-deployment-6986c7bc94", UID:"565e492a-47cb-4027-9a72-4d2e824f1f23", APIVersion:"apps/v1", ResourceVersion:"1841", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-wk4n5
I1109 01:45:16.906596   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263912-9222", Name:"nginx-deployment-6986c7bc94", UID:"565e492a-47cb-4027-9a72-4d2e824f1f23", APIVersion:"apps/v1", ResourceVersion:"1841", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-wbq2n
E1109 01:45:16.942999   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1146: Successful get deployment nginx-deployment {{.spec.replicas}}: 3
(BE1109 01:45:17.039243   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/nginx-deployment exposed
E1109 01:45:17.129767   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1150: Successful get service nginx-deployment {{(index .spec.ports 0).port}}: 80
(Bdeployment.apps "nginx-deployment" deleted
service "nginx-deployment" deleted
replicationcontroller/frontend created
I1109 01:45:17.408879   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1573263912-9222", Name:"frontend", UID:"7e990175-4b45-444e-b965-c5b2fb0a6bd2", APIVersion:"v1", ResourceVersion:"1870", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-fc2zr
I1109 01:45:17.414174   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1573263912-9222", Name:"frontend", UID:"7e990175-4b45-444e-b965-c5b2fb0a6bd2", APIVersion:"v1", ResourceVersion:"1870", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-gfqql
I1109 01:45:17.414682   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1573263912-9222", Name:"frontend", UID:"7e990175-4b45-444e-b965-c5b2fb0a6bd2", APIVersion:"v1", ResourceVersion:"1870", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-kpd4c
core.sh:1157: Successful get rc frontend {{.spec.replicas}}: 3
(Bservice/frontend exposed
core.sh:1161: Successful get service frontend {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: <no value> 80
(Bservice/frontend-2 exposed
E1109 01:45:17.846804   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1165: Successful get service frontend-2 {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: <no value> 443
(BE1109 01:45:17.944127   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
pod/valid-pod created
E1109 01:45:18.040566   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/frontend-3 exposed
E1109 01:45:18.130899   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1170: Successful get service frontend-3 {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: <no value> 444
(Bservice/frontend-4 exposed
core.sh:1174: Successful get service frontend-4 {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: default 80
(Bservice/frontend-5 exposed
core.sh:1178: Successful get service frontend-5 {{(index .spec.ports 0).port}}: 80
(Bpod "valid-pod" deleted
service "frontend" deleted
service "frontend-2" deleted
service "frontend-3" deleted
service "frontend-4" deleted
service "frontend-5" deleted
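The frontend, frontend-2 ... frontend-5 services above come from kubectl expose run against different targets and ports. The flags below are illustrative guesses that match the port values checked above, not the test's verbatim invocations:

  kubectl expose rc frontend --port=80 --name=frontend         # unnamed port, hence "<no value> 80" in the check
  kubectl expose rc frontend --port=443 --name=frontend-2
  kubectl expose pod valid-pod --port=444 --name=frontend-3
  kubectl expose service frontend --port=80 --name=frontend-5  # an existing service can itself be re-exposed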
Successful
message:error: cannot expose a Node
has:cannot expose
E1109 01:45:18.848062   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:The Service "invalid-large-service-name-that-has-more-than-sixty-three-characters" is invalid: metadata.name: Invalid value: "invalid-large-service-name-that-has-more-than-sixty-three-characters": must be no more than 63 characters
has:metadata.name: Invalid value
E1109 01:45:18.945464   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:service/kubernetes-serve-hostname-testing-sixty-three-characters-in-len exposed
has:kubernetes-serve-hostname-testing-sixty-three-characters-in-len exposed
E1109 01:45:19.041743   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service "kubernetes-serve-hostname-testing-sixty-three-characters-in-len" deleted
E1109 01:45:19.132201   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:service/etcd-server exposed
has:etcd-server exposed
core.sh:1208: Successful get service etcd-server {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: port-1 2380
(Bcore.sh:1209: Successful get service etcd-server {{(index .spec.ports 1).name}} {{(index .spec.ports 1).port}}: port-2 2379
(Bservice "etcd-server" deleted
core.sh:1215: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: frontend:
(Breplicationcontroller "frontend" deleted
core.sh:1219: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
(Bcore.sh:1223: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
(BE1109 01:45:19.849297   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:45:19.946728   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/frontend created
I1109 01:45:19.967395   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1573263912-9222", Name:"frontend", UID:"b3c3e7db-a55f-4346-85d2-450b60dc6c87", APIVersion:"v1", ResourceVersion:"1950", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-9976m
I1109 01:45:19.969512   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1573263912-9222", Name:"frontend", UID:"b3c3e7db-a55f-4346-85d2-450b60dc6c87", APIVersion:"v1", ResourceVersion:"1950", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-d5hn5
I1109 01:45:19.973023   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1573263912-9222", Name:"frontend", UID:"b3c3e7db-a55f-4346-85d2-450b60dc6c87", APIVersion:"v1", ResourceVersion:"1950", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-fhcc9
E1109 01:45:20.043062   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/redis-slave created
E1109 01:45:20.133051   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I1109 01:45:20.133841   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1573263912-9222", Name:"redis-slave", UID:"587b9bf1-b69f-4244-a014-eef48d989a81", APIVersion:"v1", ResourceVersion:"1959", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-slave-nvd89
I1109 01:45:20.140838   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1573263912-9222", Name:"redis-slave", UID:"587b9bf1-b69f-4244-a014-eef48d989a81", APIVersion:"v1", ResourceVersion:"1959", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-slave-88x7r
core.sh:1228: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: frontend:redis-slave:
(Bcore.sh:1232: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: frontend:redis-slave:
(Breplicationcontroller "frontend" deleted
replicationcontroller "redis-slave" deleted
core.sh:1236: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
(Bcore.sh:1240: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
(Breplicationcontroller/frontend created
I1109 01:45:20.740845   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1573263912-9222", Name:"frontend", UID:"c5bfb27b-0752-4a51-baae-83f91717aff6", APIVersion:"v1", ResourceVersion:"1981", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-nkg7s
I1109 01:45:20.744089   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1573263912-9222", Name:"frontend", UID:"c5bfb27b-0752-4a51-baae-83f91717aff6", APIVersion:"v1", ResourceVersion:"1981", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-5fqvq
I1109 01:45:20.744523   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1573263912-9222", Name:"frontend", UID:"c5bfb27b-0752-4a51-baae-83f91717aff6", APIVersion:"v1", ResourceVersion:"1981", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-7z5hh
core.sh:1243: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: frontend:
(BE1109 01:45:20.850565   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
horizontalpodautoscaler.autoscaling/frontend autoscaled
E1109 01:45:20.948052   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1246: Successful get hpa frontend {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 1 2 70
(BE1109 01:45:21.044491   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
horizontalpodautoscaler.autoscaling "frontend" deleted
E1109 01:45:21.134244   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
horizontalpodautoscaler.autoscaling/frontend autoscaled
core.sh:1250: Successful get hpa frontend {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 2 3 80
(Bhorizontalpodautoscaler.autoscaling "frontend" deleted
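The two HPA checks above ("1 2 70", then "2 3 80") correspond to kubectl autoscale with and without an explicit --min, and the missing-flag error just below shows that --max is mandatory. A minimal sketch:

  kubectl autoscale rc frontend --max=2 --cpu-percent=70           # --min left to default, hence "1 2 70"
  kubectl autoscale rc frontend --min=2 --max=3 --cpu-percent=80   # hence "2 3 80"
  kubectl get hpa frontend -o go-template='{{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}'
  kubectl autoscale rc frontend --min=2 --cpu-percent=80           # rejected: required flag(s) "max" not set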
Error: required flag(s) "max" not set


Examples:
  # Auto scale a deployment "foo", with the number of pods between 2 and 10, no target CPU utilization specified so a default autoscaling policy will be used:
  kubectl autoscale deployment foo --min=2 --max=10
  
... skipping 54 lines ...
          limits:
            cpu: 300m
          requests:
            cpu: 300m
      terminationGracePeriodSeconds: 0
status: {}
Error from server (NotFound): deployments.apps "nginx-deployment-resources" not found
E1109 01:45:21.851657   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-deployment-resources created
I1109 01:45:21.940872   54890 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1573263912-9222", Name:"nginx-deployment-resources", UID:"a4e2e012-f224-4b17-bf62-f287f8e3459d", APIVersion:"apps/v1", ResourceVersion:"2001", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-resources-67f8cfff5 to 3
I1109 01:45:21.943894   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263912-9222", Name:"nginx-deployment-resources-67f8cfff5", UID:"32114d14-7fbb-4f96-831c-0df92ccbcb47", APIVersion:"apps/v1", ResourceVersion:"2002", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-67f8cfff5-g5bbs
I1109 01:45:21.948008   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263912-9222", Name:"nginx-deployment-resources-67f8cfff5", UID:"32114d14-7fbb-4f96-831c-0df92ccbcb47", APIVersion:"apps/v1", ResourceVersion:"2002", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-67f8cfff5-nzxlg
I1109 01:45:21.948167   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263912-9222", Name:"nginx-deployment-resources-67f8cfff5", UID:"32114d14-7fbb-4f96-831c-0df92ccbcb47", APIVersion:"apps/v1", ResourceVersion:"2002", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-67f8cfff5-qns7b
E1109 01:45:21.949308   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1265: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx-deployment-resources:
(BE1109 01:45:22.045886   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1266: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
(BE1109 01:45:22.135406   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1267: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
(Bdeployment.apps/nginx-deployment-resources resource requirements updated
I1109 01:45:22.300717   54890 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1573263912-9222", Name:"nginx-deployment-resources", UID:"a4e2e012-f224-4b17-bf62-f287f8e3459d", APIVersion:"apps/v1", ResourceVersion:"2015", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-resources-55c547f795 to 1
I1109 01:45:22.303123   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263912-9222", Name:"nginx-deployment-resources-55c547f795", UID:"75a00dcf-38be-4e32-bca3-bfbdcddfe59d", APIVersion:"apps/v1", ResourceVersion:"2016", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-55c547f795-mgsft
core.sh:1270: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).resources.limits.cpu}}:{{end}}: 100m:
(Bcore.sh:1271: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.limits.cpu}}:{{end}}: 100m:
(Berror: unable to find container named redis
deployment.apps/nginx-deployment-resources resource requirements updated
I1109 01:45:22.648169   54890 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1573263912-9222", Name:"nginx-deployment-resources", UID:"a4e2e012-f224-4b17-bf62-f287f8e3459d", APIVersion:"apps/v1", ResourceVersion:"2025", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-resources-67f8cfff5 to 2
I1109 01:45:22.655303   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263912-9222", Name:"nginx-deployment-resources-67f8cfff5", UID:"32114d14-7fbb-4f96-831c-0df92ccbcb47", APIVersion:"apps/v1", ResourceVersion:"2029", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-resources-67f8cfff5-g5bbs
I1109 01:45:22.655815   54890 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1573263912-9222", Name:"nginx-deployment-resources", UID:"a4e2e012-f224-4b17-bf62-f287f8e3459d", APIVersion:"apps/v1", ResourceVersion:"2028", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-resources-6d86564b45 to 1
I1109 01:45:22.657717   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263912-9222", Name:"nginx-deployment-resources-6d86564b45", UID:"10207645-b5ea-44c5-9fd0-4a48541119ea", APIVersion:"apps/v1", ResourceVersion:"2032", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-6d86564b45-c4mrb
core.sh:1276: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).resources.limits.cpu}}:{{end}}: 200m:
(Bcore.sh:1277: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.limits.cpu}}:{{end}}: 100m:
(BE1109 01:45:22.852833   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-deployment-resources resource requirements updated
I1109 01:45:22.915881   54890 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1573263912-9222", Name:"nginx-deployment-resources", UID:"a4e2e012-f224-4b17-bf62-f287f8e3459d", APIVersion:"apps/v1", ResourceVersion:"2048", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-resources-67f8cfff5 to 1
I1109 01:45:22.922120   54890 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1573263912-9222", Name:"nginx-deployment-resources", UID:"a4e2e012-f224-4b17-bf62-f287f8e3459d", APIVersion:"apps/v1", ResourceVersion:"2051", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-resources-6c478d4fdb to 1
I1109 01:45:22.924486   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263912-9222", Name:"nginx-deployment-resources-67f8cfff5", UID:"32114d14-7fbb-4f96-831c-0df92ccbcb47", APIVersion:"apps/v1", ResourceVersion:"2052", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-resources-67f8cfff5-nzxlg
I1109 01:45:22.927243   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263912-9222", Name:"nginx-deployment-resources-6c478d4fdb", UID:"837df410-bf00-4681-be1d-513c27489ac9", APIVersion:"apps/v1", ResourceVersion:"2054", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-6c478d4fdb-b5cnl
E1109 01:45:22.950240   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1280: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).resources.limits.cpu}}:{{end}}: 200m:
E1109 01:45:23.047022   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1281: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.limits.cpu}}:{{end}}: 300m:
E1109 01:45:23.136584   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1282: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.requests.cpu}}:{{end}}: 300m:
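For readers unfamiliar with test-cmd output: each core.sh/apps.sh "Successful get ..." line above is an assertion that renders a resource through a Go template and compares the result with the value printed after the final colon. A roughly equivalent manual check, reusing the template from these lines, would be:
  # prints the CPU limit of the first container of every deployment in the namespace, e.g. "200m:"
  kubectl get deployment -o go-template='{{range.items}}{{(index .spec.template.spec.containers 0).resources.limits.cpu}}:{{end}}'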
apiVersion: apps/v1
kind: Deployment
metadata:
  annotations:
    deployment.kubernetes.io/revision: "4"
... skipping 66 lines ...
    status: "True"
    type: Progressing
  observedGeneration: 4
  replicas: 4
  unavailableReplicas: 4
  updatedReplicas: 1
error: you must specify resources by --filename when --local is set.
Example resource specifications include:
   '-f rsrc.yaml'
   '--filename=rsrc.json'
core.sh:1286: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).resources.limits.cpu}}:{{end}}: 200m:
core.sh:1287: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.limits.cpu}}:{{end}}: 300m:
core.sh:1288: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.requests.cpu}}:{{end}}: 300m:
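The "resource requirements updated" lines and the --local/--filename error above are the kind of output kubectl set resources produces; a minimal sketch of comparable invocations (container names and the file name are illustrative, not taken from the test scripts):
  # change the CPU limit of one container in the deployment
  kubectl set resources deployment nginx-deployment-resources -c=nginx --limits=cpu=200m
  # a container name that does not exist yields "error: unable to find container named ..."
  kubectl set resources deployment nginx-deployment-resources -c=redis --limits=cpu=100m
  # --local only renders the change client-side, so an input file must be supplied
  kubectl set resources -f deployment.yaml --local --limits=cpu=200m -o yaml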
... skipping 4 lines ...

+++ Running case: test-cmd.run_deployment_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_deployment_tests
+++ [1109 01:45:23] Creating namespace namespace-1573263923-9075
namespace/namespace-1573263923-9075 created
E1109 01:45:23.853845   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Context "test" modified.
+++ [1109 01:45:23] Testing deployments
deployment.apps/test-nginx-extensions created
E1109 01:45:23.951648   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I1109 01:45:23.951707   54890 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1573263923-9075", Name:"test-nginx-extensions", UID:"f7ccda0f-f00e-41bd-8b2d-47a72f3530f7", APIVersion:"apps/v1", ResourceVersion:"2084", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set test-nginx-extensions-5559c76db7 to 1
I1109 01:45:23.957268   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263923-9075", Name:"test-nginx-extensions-5559c76db7", UID:"4b2d487c-49ee-44ba-a4ea-b8a066eb5328", APIVersion:"apps/v1", ResourceVersion:"2085", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test-nginx-extensions-5559c76db7-qm2vn
apps.sh:185: Successful get deploy test-nginx-extensions {{(index .spec.template.spec.containers 0).name}}: nginx
E1109 01:45:24.048132   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:10
has not:2
E1109 01:45:24.137624   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:apps/v1
has:apps/v1
deployment.apps "test-nginx-extensions" deleted
deployment.apps/test-nginx-apps created
I1109 01:45:24.386217   54890 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1573263923-9075", Name:"test-nginx-apps", UID:"b5f038eb-a090-441e-8c90-e3056f376a96", APIVersion:"apps/v1", ResourceVersion:"2098", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set test-nginx-apps-79b9bd9585 to 1
... skipping 21 lines ...
                pod-template-hash=79b9bd9585
Annotations:    deployment.kubernetes.io/desired-replicas: 1
                deployment.kubernetes.io/max-replicas: 2
                deployment.kubernetes.io/revision: 1
Controlled By:  Deployment/test-nginx-apps
Replicas:       1 current / 1 desired
Pods Status:    0 Running / 1 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=test-nginx-apps
           pod-template-hash=79b9bd9585
  Containers:
   nginx:
    Image:        k8s.gcr.io/nginx:test-cmd
... skipping 33 lines ...
    Mounts:       <none>
Volumes:          <none>
QoS Class:        BestEffort
Node-Selectors:   <none>
Tolerations:      <none>
Events:           <none>
E1109 01:45:24.855078   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps "test-nginx-apps" deleted
E1109 01:45:24.953008   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:214: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
E1109 01:45:25.049361   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-with-command created
I1109 01:45:25.077394   54890 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1573263923-9075", Name:"nginx-with-command", UID:"20b34e2a-b8d8-405f-9958-e97bcd98b3f1", APIVersion:"apps/v1", ResourceVersion:"2114", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-with-command-757c6f58dd to 1
I1109 01:45:25.080405   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263923-9075", Name:"nginx-with-command-757c6f58dd", UID:"b9fb5d85-cac4-4176-9872-846b38d55a34", APIVersion:"apps/v1", ResourceVersion:"2115", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-with-command-757c6f58dd-s2f5h
E1109 01:45:25.139028   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:218: Successful get deploy nginx-with-command {{(index .spec.template.spec.containers 0).name}}: nginx
(Bdeployment.apps "nginx-with-command" deleted
apps.sh:224: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
deployment.apps/deployment-with-unixuserid created
I1109 01:45:25.517696   54890 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1573263923-9075", Name:"deployment-with-unixuserid", UID:"533be577-ffc9-4fe2-9163-b47565e9d61e", APIVersion:"apps/v1", ResourceVersion:"2128", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set deployment-with-unixuserid-8fcdfc94f to 1
I1109 01:45:25.520015   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263923-9075", Name:"deployment-with-unixuserid-8fcdfc94f", UID:"6f812013-5ffa-400a-87fc-4c44311ab668", APIVersion:"apps/v1", ResourceVersion:"2129", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: deployment-with-unixuserid-8fcdfc94f-l27fl
apps.sh:228: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: deployment-with-unixuserid:
(Bdeployment.apps "deployment-with-unixuserid" deleted
apps.sh:235: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
E1109 01:45:25.856464   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-deployment created
I1109 01:45:25.935925   54890 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1573263923-9075", Name:"nginx-deployment", UID:"2cd7b6ce-30dd-4835-a929-d3601b5b1e16", APIVersion:"apps/v1", ResourceVersion:"2142", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-6986c7bc94 to 3
I1109 01:45:25.939659   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263923-9075", Name:"nginx-deployment-6986c7bc94", UID:"85f23df3-6ead-451d-962c-17b942b24c1c", APIVersion:"apps/v1", ResourceVersion:"2143", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-t9dww
I1109 01:45:25.943353   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263923-9075", Name:"nginx-deployment-6986c7bc94", UID:"85f23df3-6ead-451d-962c-17b942b24c1c", APIVersion:"apps/v1", ResourceVersion:"2143", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-lcd8l
I1109 01:45:25.943863   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263923-9075", Name:"nginx-deployment-6986c7bc94", UID:"85f23df3-6ead-451d-962c-17b942b24c1c", APIVersion:"apps/v1", ResourceVersion:"2143", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-qvbcr
E1109 01:45:25.954698   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:239: Successful get rs {{range.items}}{{.spec.replicas}}{{end}}: 3
E1109 01:45:26.050546   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps "nginx-deployment" deleted
E1109 01:45:26.140155   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:242: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
apps.sh:246: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
apps.sh:247: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
(Bdeployment.apps/nginx-deployment created
I1109 01:45:26.441446   54890 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1573263923-9075", Name:"nginx-deployment", UID:"6cebf2c5-955f-48ec-a042-4571bcbb2b08", APIVersion:"apps/v1", ResourceVersion:"2164", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-7f6fc565b9 to 1
I1109 01:45:26.443812   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263923-9075", Name:"nginx-deployment-7f6fc565b9", UID:"dd5ceb4f-7828-4ade-b8c8-fc0b3793b435", APIVersion:"apps/v1", ResourceVersion:"2165", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-7f6fc565b9-ccs2b
apps.sh:251: Successful get rs {{range.items}}{{.spec.replicas}}{{end}}: 1
(Bdeployment.apps "nginx-deployment" deleted
apps.sh:256: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
(Bapps.sh:257: Successful get rs {{range.items}}{{.spec.replicas}}{{end}}: 1
E1109 01:45:26.857811   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:45:26.955940   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicaset.apps "nginx-deployment-7f6fc565b9" deleted
E1109 01:45:27.051774   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:265: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
E1109 01:45:27.141470   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-deployment created
I1109 01:45:27.247376   54890 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1573263923-9075", Name:"nginx-deployment", UID:"7a2f7dd5-58ff-4451-9639-6c38f0bda5fb", APIVersion:"apps/v1", ResourceVersion:"2184", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-6986c7bc94 to 3
I1109 01:45:27.249839   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263923-9075", Name:"nginx-deployment-6986c7bc94", UID:"7bdbee10-d89e-4d2b-8a06-58237870899d", APIVersion:"apps/v1", ResourceVersion:"2185", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-m2r8j
I1109 01:45:27.252329   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263923-9075", Name:"nginx-deployment-6986c7bc94", UID:"7bdbee10-d89e-4d2b-8a06-58237870899d", APIVersion:"apps/v1", ResourceVersion:"2185", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-f948b
I1109 01:45:27.253784   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263923-9075", Name:"nginx-deployment-6986c7bc94", UID:"7bdbee10-d89e-4d2b-8a06-58237870899d", APIVersion:"apps/v1", ResourceVersion:"2185", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-j6rwb
apps.sh:268: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx-deployment:
horizontalpodautoscaler.autoscaling/nginx-deployment autoscaled
apps.sh:271: Successful get hpa nginx-deployment {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 2 3 80
(Bhorizontalpodautoscaler.autoscaling "nginx-deployment" deleted
deployment.apps "nginx-deployment" deleted
apps.sh:279: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
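The hpa check above (2 3 80) is consistent with autoscaling the deployment with those bounds, roughly:
  kubectl autoscale deployment nginx-deployment --min=2 --max=3 --cpu-percent=80
  kubectl get hpa nginx-deployment -o go-template='{{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}'
  kubectl delete hpa nginx-deployment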
E1109 01:45:27.859112   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:45:27.957108   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx created
I1109 01:45:27.972765   54890 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1573263923-9075", Name:"nginx", UID:"679ea2ae-768f-4213-9415-349d92649260", APIVersion:"apps/v1", ResourceVersion:"2208", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-f87d999f7 to 3
I1109 01:45:27.977210   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263923-9075", Name:"nginx-f87d999f7", UID:"c15b5fdf-d339-4c57-8e52-bf4eb5795679", APIVersion:"apps/v1", ResourceVersion:"2209", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-f87d999f7-b5pph
I1109 01:45:27.980911   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263923-9075", Name:"nginx-f87d999f7", UID:"c15b5fdf-d339-4c57-8e52-bf4eb5795679", APIVersion:"apps/v1", ResourceVersion:"2209", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-f87d999f7-khsp8
I1109 01:45:27.981222   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263923-9075", Name:"nginx-f87d999f7", UID:"c15b5fdf-d339-4c57-8e52-bf4eb5795679", APIVersion:"apps/v1", ResourceVersion:"2209", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-f87d999f7-dwmww
E1109 01:45:28.053120   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:283: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx:
E1109 01:45:28.142744   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:284: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
deployment.apps/nginx skipped rollback (current template already matches revision 1)
apps.sh:287: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
Warning: kubectl apply should be used on resource created by either kubectl create --save-config or kubectl apply
deployment.apps/nginx configured
I1109 01:45:28.532519   54890 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1573263923-9075", Name:"nginx", UID:"679ea2ae-768f-4213-9415-349d92649260", APIVersion:"apps/v1", ResourceVersion:"2222", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-78487f9fd7 to 1
I1109 01:45:28.535118   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263923-9075", Name:"nginx-78487f9fd7", UID:"2d84d6d8-541f-4df1-b92a-b01fe0563a52", APIVersion:"apps/v1", ResourceVersion:"2223", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-78487f9fd7-fpszd
apps.sh:290: Successful get deployment.apps {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
    Image:	k8s.gcr.io/nginx:test-cmd
apps.sh:293: Successful get deployment.apps {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
E1109 01:45:28.860483   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx rolled back
E1109 01:45:28.958224   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:45:29.054744   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:45:29.144067   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:45:29.861829   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:45:29.959534   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:297: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
E1109 01:45:30.056278   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
error: unable to find specified revision 1000000 in history
E1109 01:45:30.145441   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:300: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
deployment.apps/nginx rolled back
E1109 01:45:30.863067   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:45:30.960984   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:45:31.057622   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:45:31.146779   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:304: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
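The rollback sequence above (skipped rollback, rolled back, unable to find revision 1000000) matches kubectl rollout undo behaviour; a hedged sketch of the invocations involved:
  kubectl rollout history deployment nginx                      # list recorded revisions
  kubectl rollout undo deployment nginx --to-revision=1         # no-op if the template already matches that revision
  kubectl rollout undo deployment nginx                         # roll back to the previous revision
  kubectl rollout undo deployment nginx --to-revision=1000000   # fails: unable to find specified revision in history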
deployment.apps/nginx paused
error: you cannot rollback a paused deployment; resume it first with 'kubectl rollout resume deployment/nginx' and try again
error: deployments.apps "nginx" can't restart paused deployment (run rollout resume first)
deployment.apps/nginx resumed
E1109 01:45:31.864332   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx rolled back
E1109 01:45:31.961944   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:45:32.058868   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
    deployment.kubernetes.io/revision-history: 1,3
E1109 01:45:32.148085   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
error: desired revision (3) is different from the running revision (5)
deployment.apps/nginx restarted
I1109 01:45:32.416384   54890 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1573263923-9075", Name:"nginx", UID:"679ea2ae-768f-4213-9415-349d92649260", APIVersion:"apps/v1", ResourceVersion:"2255", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-f87d999f7 to 2
I1109 01:45:32.424024   54890 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1573263923-9075", Name:"nginx", UID:"679ea2ae-768f-4213-9415-349d92649260", APIVersion:"apps/v1", ResourceVersion:"2257", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-6d6dbb8b7d to 1
I1109 01:45:32.425267   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263923-9075", Name:"nginx-f87d999f7", UID:"c15b5fdf-d339-4c57-8e52-bf4eb5795679", APIVersion:"apps/v1", ResourceVersion:"2259", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-f87d999f7-khsp8
I1109 01:45:32.430217   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263923-9075", Name:"nginx-6d6dbb8b7d", UID:"b687320e-9623-41fa-992c-784b0bffcaa7", APIVersion:"apps/v1", ResourceVersion:"2262", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-6d6dbb8b7d-j8g87
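The paused/resumed/restarted lines and the two errors above line up with the rollout pause/resume/restart subcommands, roughly:
  kubectl rollout pause deployment nginx
  kubectl rollout undo deployment nginx      # refused while the deployment is paused
  kubectl rollout restart deployment nginx   # also refused while paused
  kubectl rollout resume deployment nginx
  kubectl rollout restart deployment nginx   # triggers the rolling restart whose scaling events appear above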
E1109 01:45:32.865640   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:45:32.963417   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:45:33.060180   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:45:33.149545   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:apiVersion: apps/v1
kind: ReplicaSet
metadata:
  annotations:
    deployment.kubernetes.io/desired-replicas: "3"
... skipping 54 lines ...
deployment.apps/nginx2 created
I1109 01:45:33.742482   54890 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1573263923-9075", Name:"nginx2", UID:"da8cd2fa-3248-4abf-bea3-0a703721ec72", APIVersion:"apps/v1", ResourceVersion:"2277", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx2-57b7865cd9 to 3
I1109 01:45:33.746172   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263923-9075", Name:"nginx2-57b7865cd9", UID:"42fec14f-8dd1-4d52-900a-5bdac7161c92", APIVersion:"apps/v1", ResourceVersion:"2278", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx2-57b7865cd9-d8ck2
I1109 01:45:33.751728   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263923-9075", Name:"nginx2-57b7865cd9", UID:"42fec14f-8dd1-4d52-900a-5bdac7161c92", APIVersion:"apps/v1", ResourceVersion:"2278", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx2-57b7865cd9-qws9z
I1109 01:45:33.751986   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263923-9075", Name:"nginx2-57b7865cd9", UID:"42fec14f-8dd1-4d52-900a-5bdac7161c92", APIVersion:"apps/v1", ResourceVersion:"2278", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx2-57b7865cd9-rzfln
deployment.apps "nginx2" deleted
E1109 01:45:33.866820   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps "nginx" deleted
E1109 01:45:33.964597   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:334: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
E1109 01:45:34.061368   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:45:34.150844   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-deployment created
I1109 01:45:34.166603   54890 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1573263923-9075", Name:"nginx-deployment", UID:"b54affc2-4bd3-4fdd-a519-51a903779849", APIVersion:"apps/v1", ResourceVersion:"2311", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-598d4d68b4 to 3
I1109 01:45:34.173329   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263923-9075", Name:"nginx-deployment-598d4d68b4", UID:"424377a6-79d5-408f-aab4-0083dd2b7840", APIVersion:"apps/v1", ResourceVersion:"2312", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-598d4d68b4-526h9
I1109 01:45:34.175792   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263923-9075", Name:"nginx-deployment-598d4d68b4", UID:"424377a6-79d5-408f-aab4-0083dd2b7840", APIVersion:"apps/v1", ResourceVersion:"2312", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-598d4d68b4-5flnk
I1109 01:45:34.178470   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263923-9075", Name:"nginx-deployment-598d4d68b4", UID:"424377a6-79d5-408f-aab4-0083dd2b7840", APIVersion:"apps/v1", ResourceVersion:"2312", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-598d4d68b4-llv4g
apps.sh:337: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx-deployment:
apps.sh:338: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
apps.sh:339: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
deployment.apps/nginx-deployment image updated
I1109 01:45:34.526471   54890 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1573263923-9075", Name:"nginx-deployment", UID:"b54affc2-4bd3-4fdd-a519-51a903779849", APIVersion:"apps/v1", ResourceVersion:"2325", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-59df9b5f5b to 1
I1109 01:45:34.531443   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263923-9075", Name:"nginx-deployment-59df9b5f5b", UID:"1102130a-7950-410d-a110-2f751ac46a18", APIVersion:"apps/v1", ResourceVersion:"2326", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-59df9b5f5b-2wb9g
apps.sh:342: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
apps.sh:343: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
error: unable to find container named "redis"
E1109 01:45:34.867842   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-deployment image updated
E1109 01:45:34.965709   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:348: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
apps.sh:349: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
E1109 01:45:35.062675   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-deployment image updated
E1109 01:45:35.151589   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:352: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
apps.sh:353: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
apps.sh:356: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
apps.sh:357: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
deployment.apps/nginx-deployment image updated
I1109 01:45:35.657580   54890 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1573263923-9075", Name:"nginx-deployment", UID:"b54affc2-4bd3-4fdd-a519-51a903779849", APIVersion:"apps/v1", ResourceVersion:"2345", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-598d4d68b4 to 2
I1109 01:45:35.664085   54890 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1573263923-9075", Name:"nginx-deployment", UID:"b54affc2-4bd3-4fdd-a519-51a903779849", APIVersion:"apps/v1", ResourceVersion:"2347", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-7d758dbc54 to 1
I1109 01:45:35.666003   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263923-9075", Name:"nginx-deployment-598d4d68b4", UID:"424377a6-79d5-408f-aab4-0083dd2b7840", APIVersion:"apps/v1", ResourceVersion:"2349", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-598d4d68b4-526h9
I1109 01:45:35.666857   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263923-9075", Name:"nginx-deployment-7d758dbc54", UID:"736bab31-1773-4dde-81d0-a4a9dea2067f", APIVersion:"apps/v1", ResourceVersion:"2352", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-7d758dbc54-t26g2
apps.sh:360: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
apps.sh:361: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
E1109 01:45:35.869048   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I1109 01:45:35.907905   54890 horizontal.go:341] Horizontal Pod Autoscaler frontend has been deleted in namespace-1573263912-9222
E1109 01:45:35.966924   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:364: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
E1109 01:45:36.063884   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:365: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
E1109 01:45:36.152990   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps "nginx-deployment" deleted
apps.sh:371: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
deployment.apps/nginx-deployment created
I1109 01:45:36.436813   54890 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1573263923-9075", Name:"nginx-deployment", UID:"59ffc35b-e4e1-4562-8e2f-b742db5e71a0", APIVersion:"apps/v1", ResourceVersion:"2378", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-598d4d68b4 to 3
I1109 01:45:36.439861   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263923-9075", Name:"nginx-deployment-598d4d68b4", UID:"11f80caa-cf02-4af9-9470-3c70dd6ba421", APIVersion:"apps/v1", ResourceVersion:"2379", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-598d4d68b4-5vd5t
I1109 01:45:36.443330   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263923-9075", Name:"nginx-deployment-598d4d68b4", UID:"11f80caa-cf02-4af9-9470-3c70dd6ba421", APIVersion:"apps/v1", ResourceVersion:"2379", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-598d4d68b4-9mrsp
I1109 01:45:36.443610   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263923-9075", Name:"nginx-deployment-598d4d68b4", UID:"11f80caa-cf02-4af9-9470-3c70dd6ba421", APIVersion:"apps/v1", ResourceVersion:"2379", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-598d4d68b4-ddclt
configmap/test-set-env-config created
secret/test-set-env-secret created
apps.sh:376: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx-deployment:
E1109 01:45:36.870200   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:378: Successful get configmaps/test-set-env-config {{.metadata.name}}: test-set-env-config
E1109 01:45:36.968018   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:379: Successful get secret {{range.items}}{{.metadata.name}}:{{end}}: test-set-env-secret:
E1109 01:45:37.065192   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-deployment env updated
I1109 01:45:37.111737   54890 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1573263923-9075", Name:"nginx-deployment", UID:"59ffc35b-e4e1-4562-8e2f-b742db5e71a0", APIVersion:"apps/v1", ResourceVersion:"2396", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-6b9f7756b4 to 1
I1109 01:45:37.116631   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263923-9075", Name:"nginx-deployment-6b9f7756b4", UID:"10248ce2-ddd1-494d-a6e2-c3f6a5abae58", APIVersion:"apps/v1", ResourceVersion:"2397", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6b9f7756b4-lsmgq
E1109 01:45:37.154308   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:383: Successful get deploy nginx-deployment {{ (index (index .spec.template.spec.containers 0).env 0).name}}: KEY_2
apps.sh:385: Successful get deploy nginx-deployment {{ len (index .spec.template.spec.containers 0).env }}: 1
deployment.apps/nginx-deployment env updated
I1109 01:45:37.393903   54890 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1573263923-9075", Name:"nginx-deployment", UID:"59ffc35b-e4e1-4562-8e2f-b742db5e71a0", APIVersion:"apps/v1", ResourceVersion:"2406", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-598d4d68b4 to 2
I1109 01:45:37.399320   54890 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1573263923-9075", Name:"nginx-deployment", UID:"59ffc35b-e4e1-4562-8e2f-b742db5e71a0", APIVersion:"apps/v1", ResourceVersion:"2408", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-754bf964c8 to 1
I1109 01:45:37.401600   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263923-9075", Name:"nginx-deployment-598d4d68b4", UID:"11f80caa-cf02-4af9-9470-3c70dd6ba421", APIVersion:"apps/v1", ResourceVersion:"2410", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-598d4d68b4-5vd5t
... skipping 8 lines ...
I1109 01:45:37.691656   54890 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1573263923-9075", Name:"nginx-deployment", UID:"59ffc35b-e4e1-4562-8e2f-b742db5e71a0", APIVersion:"apps/v1", ResourceVersion:"2446", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-598d4d68b4 to 0
I1109 01:45:37.697526   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263923-9075", Name:"nginx-deployment-598d4d68b4", UID:"11f80caa-cf02-4af9-9470-3c70dd6ba421", APIVersion:"apps/v1", ResourceVersion:"2450", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-598d4d68b4-9mrsp
I1109 01:45:37.702629   54890 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1573263923-9075", Name:"nginx-deployment", UID:"59ffc35b-e4e1-4562-8e2f-b742db5e71a0", APIVersion:"apps/v1", ResourceVersion:"2448", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-5958f7687 to 1
I1109 01:45:37.708103   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263923-9075", Name:"nginx-deployment-5958f7687", UID:"a891cfcd-dfb7-47d3-8ced-e9f69ead235c", APIVersion:"apps/v1", ResourceVersion:"2455", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-5958f7687-sfm8f
deployment.apps/nginx-deployment env updated
I1109 01:45:37.805008   54890 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1573263923-9075", Name:"nginx-deployment", UID:"59ffc35b-e4e1-4562-8e2f-b742db5e71a0", APIVersion:"apps/v1", ResourceVersion:"2464", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-5958f7687 to 0
E1109 01:45:37.864912   54890 replica_set.go:488] Sync "namespace-1573263923-9075/nginx-deployment-5958f7687" failed with Operation cannot be fulfilled on replicasets.apps "nginx-deployment-5958f7687": the object has been modified; please apply your changes to the latest version and try again
E1109 01:45:37.871088   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-deployment env updated
I1109 01:45:37.912407   54890 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1573263923-9075", Name:"nginx-deployment", UID:"59ffc35b-e4e1-4562-8e2f-b742db5e71a0", APIVersion:"apps/v1", ResourceVersion:"2466", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-98b7fd455 to 1
I1109 01:45:37.968977   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263923-9075", Name:"nginx-deployment-5958f7687", UID:"a891cfcd-dfb7-47d3-8ced-e9f69ead235c", APIVersion:"apps/v1", ResourceVersion:"2467", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-5958f7687-sfm8f
E1109 01:45:37.969597   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-deployment env updated
deployment.apps "nginx-deployment" deleted
E1109 01:45:38.067163   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
configmap "test-set-env-config" deleted
E1109 01:45:38.155291   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:45:38.169572   54890 replica_set.go:488] Sync "namespace-1573263923-9075/nginx-deployment-98b7fd455" failed with Operation cannot be fulfilled on replicasets.apps "nginx-deployment-98b7fd455": StorageError: invalid object, Code: 4, Key: /registry/replicasets/namespace-1573263923-9075/nginx-deployment-98b7fd455, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 988f38fc-ee0f-42be-8399-5985452b77f0, UID in object meta: 
secret "test-set-env-secret" deleted
+++ exit code: 0
Recording: run_rs_tests
Running command: run_rs_tests

+++ Running case: test-cmd.run_rs_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_rs_tests
E1109 01:45:38.314490   54890 replica_set.go:488] Sync "namespace-1573263923-9075/nginx-deployment-5958f7687" failed with replicasets.apps "nginx-deployment-5958f7687" not found
+++ [1109 01:45:38] Creating namespace namespace-1573263938-15170
E1109 01:45:38.364527   54890 replica_set.go:488] Sync "namespace-1573263923-9075/nginx-deployment-868b664cb5" failed with replicasets.apps "nginx-deployment-868b664cb5" not found
namespace/namespace-1573263938-15170 created
Context "test" modified.
+++ [1109 01:45:38] Testing kubectl(v1:replicasets)
apps.sh:511: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
replicaset.apps/frontend created
I1109 01:45:38.703513   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263938-15170", Name:"frontend", UID:"95ad0d59-49c5-4313-a0a3-d79bd268bbbe", APIVersion:"apps/v1", ResourceVersion:"2496", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-svvpq
I1109 01:45:38.705850   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263938-15170", Name:"frontend", UID:"95ad0d59-49c5-4313-a0a3-d79bd268bbbe", APIVersion:"apps/v1", ResourceVersion:"2496", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-ck6jj
I1109 01:45:38.706126   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263938-15170", Name:"frontend", UID:"95ad0d59-49c5-4313-a0a3-d79bd268bbbe", APIVersion:"apps/v1", ResourceVersion:"2496", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-z4krx
+++ [1109 01:45:38] Deleting rs
replicaset.apps "frontend" deleted
E1109 01:45:38.813633   54890 replica_set.go:488] Sync "namespace-1573263938-15170/frontend" failed with replicasets.apps "frontend" not found
E1109 01:45:38.872179   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:517: Successful get pods -l "tier=frontend" {{range.items}}{{.metadata.name}}:{{end}}: 
apps.sh:521: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
E1109 01:45:38.970796   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:45:39.068466   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicaset.apps/frontend-no-cascade created
I1109 01:45:39.111974   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263938-15170", Name:"frontend-no-cascade", UID:"82cc14e6-fd95-451a-92dd-79833ea756bd", APIVersion:"apps/v1", ResourceVersion:"2513", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-no-cascade-qwgnv
I1109 01:45:39.114180   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263938-15170", Name:"frontend-no-cascade", UID:"82cc14e6-fd95-451a-92dd-79833ea756bd", APIVersion:"apps/v1", ResourceVersion:"2513", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-no-cascade-hdhg8
I1109 01:45:39.115621   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263938-15170", Name:"frontend-no-cascade", UID:"82cc14e6-fd95-451a-92dd-79833ea756bd", APIVersion:"apps/v1", ResourceVersion:"2513", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-no-cascade-9gq92
E1109 01:45:39.156683   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:527: Successful get pods -l "tier=frontend" {{range.items}}{{(index .spec.containers 0).name}}:{{end}}: php-redis:php-redis:php-redis:
+++ [1109 01:45:39] Deleting rs
replicaset.apps "frontend-no-cascade" deleted
E1109 01:45:39.364625   54890 replica_set.go:488] Sync "namespace-1573263938-15170/frontend-no-cascade" failed with replicasets.apps "frontend-no-cascade" not found
apps.sh:531: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
apps.sh:533: Successful get pods -l "tier=frontend" {{range.items}}{{(index .spec.containers 0).name}}:{{end}}: php-redis:php-redis:php-redis:
pod "frontend-no-cascade-9gq92" deleted
pod "frontend-no-cascade-hdhg8" deleted
pod "frontend-no-cascade-qwgnv" deleted
apps.sh:536: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
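Deleting frontend-no-cascade while its three php-redis pods keep running (until they are removed one by one above) is the behaviour of a non-cascading delete, for example:
  kubectl delete rs frontend-no-cascade --cascade=false   # removes the ReplicaSet only, orphaning its pods
  kubectl get pods -l tier=frontend                        # the orphaned pods are still listed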
apps.sh:540: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
E1109 01:45:39.873373   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicaset.apps/frontend created
I1109 01:45:39.904279   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263938-15170", Name:"frontend", UID:"450820f9-4cf4-422d-9022-3b30e13597e1", APIVersion:"apps/v1", ResourceVersion:"2534", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-2qhq6
I1109 01:45:39.907795   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263938-15170", Name:"frontend", UID:"450820f9-4cf4-422d-9022-3b30e13597e1", APIVersion:"apps/v1", ResourceVersion:"2534", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-hqfgh
I1109 01:45:39.907873   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263938-15170", Name:"frontend", UID:"450820f9-4cf4-422d-9022-3b30e13597e1", APIVersion:"apps/v1", ResourceVersion:"2534", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-rqc8f
E1109 01:45:39.971758   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:544: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: frontend:
E1109 01:45:40.069483   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
matched Name:
matched Pod Template:
matched Labels:
matched Selector:
matched Replicas:
matched Pods Status:
... skipping 3 lines ...
Namespace:    namespace-1573263938-15170
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 9 lines ...
Events:
  Type    Reason            Age   From                   Message
  ----    ------            ----  ----                   -------
  Normal  SuccessfulCreate  1s    replicaset-controller  Created pod: frontend-2qhq6
  Normal  SuccessfulCreate  1s    replicaset-controller  Created pod: frontend-hqfgh
  Normal  SuccessfulCreate  1s    replicaset-controller  Created pod: frontend-rqc8f
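The "matched ..." lines and the block above are checks against kubectl describe output; the underlying command is presumably along the lines of:
  kubectl describe rs frontend
  kubectl describe rs/frontend   # same resource, alternative name form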
E1109 01:45:40.157850   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:548: Successful describe
Name:         frontend
Namespace:    namespace-1573263938-15170
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 18 lines ...
Namespace:    namespace-1573263938-15170
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 12 lines ...
Namespace:    namespace-1573263938-15170
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 25 lines ...
Namespace:    namespace-1573263938-15170
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 17 lines ...
Namespace:    namespace-1573263938-15170
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 17 lines ...
Namespace:    namespace-1573263938-15170
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 11 lines ...
Namespace:    namespace-1573263938-15170
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 9 lines ...
Events:
  Type    Reason            Age   From                   Message
  ----    ------            ----  ----                   -------
  Normal  SuccessfulCreate  1s    replicaset-controller  Created pod: frontend-2qhq6
  Normal  SuccessfulCreate  1s    replicaset-controller  Created pod: frontend-hqfgh
  Normal  SuccessfulCreate  1s    replicaset-controller  Created pod: frontend-rqc8f
E1109 01:45:40.874638   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
matched Name:
matched Image:
matched Node:
matched Labels:
matched Status:
matched Controlled By
... skipping 80 lines ...
    Mounts:            <none>
Volumes:               <none>
QoS Class:             Burstable
Node-Selectors:        <none>
Tolerations:           <none>
Events:                <none>
E1109 01:45:40.972955   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:566: Successful get rs frontend {{.spec.replicas}}: 3
(BE1109 01:45:41.071490   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicaset.apps/frontend scaled
E1109 01:45:41.138724   54890 replica_set.go:202] ReplicaSet has no controller: &ReplicaSet{ObjectMeta:{frontend  namespace-1573263938-15170 /apis/apps/v1/namespaces/namespace-1573263938-15170/replicasets/frontend 450820f9-4cf4-422d-9022-3b30e13597e1 2546 2 2019-11-09 01:45:39 +0000 UTC <nil> <nil> map[app:guestbook tier:frontend] map[] [] []  []},Spec:ReplicaSetSpec{Replicas:*2,Selector:&v1.LabelSelector{MatchLabels:map[string]string{app: guestbook,tier: frontend,},MatchExpressions:[]LabelSelectorRequirement{},},Template:{{      0 0001-01-01 00:00:00 +0000 UTC <nil> <nil> map[app:guestbook tier:frontend] map[] [] []  []} {[] [] [{php-redis gcr.io/google_samples/gb-frontend:v3 [] []  [{ 0 80 TCP }] [] [{GET_HOSTS_FROM dns nil}] {map[] map[cpu:{{100 -3} {<nil>} 100m DecimalSI} memory:{{104857600 0} {<nil>} 100Mi BinarySI}]} [] [] nil nil nil nil /dev/termination-log File IfNotPresent nil false false false}] [] Always 0xc0022331e8 <nil> ClusterFirst map[]   <nil>  false false false <nil> PodSecurityContext{SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,SupplementalGroups:[],FSGroup:nil,RunAsGroup:nil,Sysctls:[]Sysctl{},WindowsOptions:nil,} []   nil default-scheduler [] []  <nil> nil [] <nil> <nil> <nil> map[] []}},MinReadySeconds:0,},Status:ReplicaSetStatus{Replicas:3,FullyLabeledReplicas:3,ObservedGeneration:1,ReadyReplicas:0,AvailableReplicas:0,Conditions:[]ReplicaSetCondition{},},}
I1109 01:45:41.144688   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263938-15170", Name:"frontend", UID:"450820f9-4cf4-422d-9022-3b30e13597e1", APIVersion:"apps/v1", ResourceVersion:"2546", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: frontend-2qhq6
E1109 01:45:41.158692   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:570: Successful get rs frontend {{.spec.replicas}}: 2
deployment.apps/scale-1 created
I1109 01:45:41.408038   54890 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1573263938-15170", Name:"scale-1", UID:"100abdb1-f309-4912-a80b-1adc15c325f8", APIVersion:"apps/v1", ResourceVersion:"2552", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set scale-1-5c5565bcd9 to 1
I1109 01:45:41.411001   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263938-15170", Name:"scale-1-5c5565bcd9", UID:"e86ccdf1-bc9b-45b1-b241-9c4d9e951019", APIVersion:"apps/v1", ResourceVersion:"2553", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-1-5c5565bcd9-mc6mg
deployment.apps/scale-2 created
I1109 01:45:41.588063   54890 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1573263938-15170", Name:"scale-2", UID:"80223e58-5098-4636-acab-dceea06bd433", APIVersion:"apps/v1", ResourceVersion:"2562", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set scale-2-5c5565bcd9 to 1
I1109 01:45:41.592841   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263938-15170", Name:"scale-2-5c5565bcd9", UID:"a1d43dcd-1446-42ec-8647-c0bc64a6b05d", APIVersion:"apps/v1", ResourceVersion:"2563", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-2-5c5565bcd9-zf894
deployment.apps/scale-3 created
I1109 01:45:41.763044   54890 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1573263938-15170", Name:"scale-3", UID:"dd897871-8bb7-4d3a-ad19-04aeac7d213c", APIVersion:"apps/v1", ResourceVersion:"2572", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set scale-3-5c5565bcd9 to 1
I1109 01:45:41.771147   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263938-15170", Name:"scale-3-5c5565bcd9", UID:"0da7dd10-ae86-4aa5-baad-7669787fbcb5", APIVersion:"apps/v1", ResourceVersion:"2573", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-3-5c5565bcd9-8qjdv
apps.sh:576: Successful get deploy scale-1 {{.spec.replicas}}: 1
(BE1109 01:45:41.875963   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:577: Successful get deploy scale-2 {{.spec.replicas}}: 1
(BE1109 01:45:41.974298   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:578: Successful get deploy scale-3 {{.spec.replicas}}: 1
(BE1109 01:45:42.072808   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/scale-1 scaled
I1109 01:45:42.107317   54890 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1573263938-15170", Name:"scale-1", UID:"100abdb1-f309-4912-a80b-1adc15c325f8", APIVersion:"apps/v1", ResourceVersion:"2582", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set scale-1-5c5565bcd9 to 2
I1109 01:45:42.109897   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263938-15170", Name:"scale-1-5c5565bcd9", UID:"e86ccdf1-bc9b-45b1-b241-9c4d9e951019", APIVersion:"apps/v1", ResourceVersion:"2583", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-1-5c5565bcd9-p9cfw
deployment.apps/scale-2 scaled
I1109 01:45:42.115831   54890 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1573263938-15170", Name:"scale-2", UID:"80223e58-5098-4636-acab-dceea06bd433", APIVersion:"apps/v1", ResourceVersion:"2584", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set scale-2-5c5565bcd9 to 2
I1109 01:45:42.123377   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263938-15170", Name:"scale-2-5c5565bcd9", UID:"a1d43dcd-1446-42ec-8647-c0bc64a6b05d", APIVersion:"apps/v1", ResourceVersion:"2590", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-2-5c5565bcd9-j8rdh
E1109 01:45:42.159995   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:581: Successful get deploy scale-1 {{.spec.replicas}}: 2
apps.sh:582: Successful get deploy scale-2 {{.spec.replicas}}: 2
apps.sh:583: Successful get deploy scale-3 {{.spec.replicas}}: 1
I1109 01:45:42.436829   54890 horizontal.go:341] Horizontal Pod Autoscaler nginx-deployment has been deleted in namespace-1573263923-9075
deployment.apps/scale-1 scaled
I1109 01:45:42.458040   54890 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1573263938-15170", Name:"scale-1", UID:"100abdb1-f309-4912-a80b-1adc15c325f8", APIVersion:"apps/v1", ResourceVersion:"2602", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set scale-1-5c5565bcd9 to 3
... skipping 6 lines ...
I1109 01:45:42.475573   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263938-15170", Name:"scale-3-5c5565bcd9", UID:"0da7dd10-ae86-4aa5-baad-7669787fbcb5", APIVersion:"apps/v1", ResourceVersion:"2614", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-3-5c5565bcd9-d4xgp
I1109 01:45:42.479557   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263938-15170", Name:"scale-3-5c5565bcd9", UID:"0da7dd10-ae86-4aa5-baad-7669787fbcb5", APIVersion:"apps/v1", ResourceVersion:"2614", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-3-5c5565bcd9-lvsb5
apps.sh:586: Successful get deploy scale-1 {{.spec.replicas}}: 3
apps.sh:587: Successful get deploy scale-2 {{.spec.replicas}}: 3
apps.sh:588: Successful get deploy scale-3 {{.spec.replicas}}: 3
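For reference (not part of the captured output): the scale assertions above are the kind of result produced by kubectl scale, which accepts several resources in one invocation; a sketch, since the exact commands are not shown in this excerpt:
  # illustrative only; scales scale-1 and scale-2 to 2, then all three to 3
  kubectl scale deployment scale-1 scale-2 --replicas=2
  kubectl scale deployment scale-1 scale-2 scale-3 --replicas=3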
replicaset.apps "frontend" deleted
E1109 01:45:42.877231   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps "scale-1" deleted
deployment.apps "scale-2" deleted
deployment.apps "scale-3" deleted
E1109 01:45:42.975477   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:45:43.074235   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicaset.apps/frontend created
I1109 01:45:43.082206   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263938-15170", Name:"frontend", UID:"43ce0d02-c29a-4715-a209-6554628436ae", APIVersion:"apps/v1", ResourceVersion:"2665", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-cgplw
I1109 01:45:43.086086   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263938-15170", Name:"frontend", UID:"43ce0d02-c29a-4715-a209-6554628436ae", APIVersion:"apps/v1", ResourceVersion:"2665", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-jd2bh
I1109 01:45:43.087474   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263938-15170", Name:"frontend", UID:"43ce0d02-c29a-4715-a209-6554628436ae", APIVersion:"apps/v1", ResourceVersion:"2665", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-2w4dj
E1109 01:45:43.161188   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:596: Successful get rs frontend {{.spec.replicas}}: 3
service/frontend exposed
apps.sh:600: Successful get service frontend {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: <no value> 80
service/frontend-2 exposed
apps.sh:604: Successful get service frontend-2 {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: default 80
(Bservice "frontend" deleted
service "frontend-2" deleted
apps.sh:610: Successful get rs frontend {{.metadata.generation}}: 1
replicaset.apps/frontend image updated
E1109 01:45:43.878361   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:612: Successful get rs frontend {{.metadata.generation}}: 2
replicaset.apps/frontend env updated
E1109 01:45:43.976657   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:614: Successful get rs frontend {{.metadata.generation}}: 3
(BE1109 01:45:44.075674   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicaset.apps/frontend resource requirements updated
E1109 01:45:44.162457   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:616: Successful get rs frontend {{.metadata.generation}}: 4
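For reference (not part of the captured output): each spec change above bumps .metadata.generation by one; the image, env, and resource updates correspond to kubectl set commands along these lines (the image tag, variable, and resource values here are placeholders, not the test's exact inputs):
  kubectl set image rs/frontend php-redis=gcr.io/google_samples/gb-frontend:v4
  kubectl set env rs/frontend GET_HOSTS_FROM=env
  kubectl set resources rs/frontend --limits=cpu=200m,memory=512Mi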
apps.sh:620: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: frontend:
replicaset.apps "frontend" deleted
apps.sh:624: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
apps.sh:628: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
replicaset.apps/frontend created
I1109 01:45:44.726395   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263938-15170", Name:"frontend", UID:"9ebc6d2d-7be8-4f4d-b617-43c6998674b2", APIVersion:"apps/v1", ResourceVersion:"2703", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-qf2jt
I1109 01:45:44.729622   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263938-15170", Name:"frontend", UID:"9ebc6d2d-7be8-4f4d-b617-43c6998674b2", APIVersion:"apps/v1", ResourceVersion:"2703", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-9c9f8
I1109 01:45:44.731393   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263938-15170", Name:"frontend", UID:"9ebc6d2d-7be8-4f4d-b617-43c6998674b2", APIVersion:"apps/v1", ResourceVersion:"2703", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-fc74t
E1109 01:45:44.879652   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicaset.apps/redis-slave created
I1109 01:45:44.887725   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263938-15170", Name:"redis-slave", UID:"bca7369f-8895-4afc-999a-738bda4c06bc", APIVersion:"apps/v1", ResourceVersion:"2714", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-slave-tbr4h
I1109 01:45:44.891220   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263938-15170", Name:"redis-slave", UID:"bca7369f-8895-4afc-999a-738bda4c06bc", APIVersion:"apps/v1", ResourceVersion:"2714", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-slave-k8hrz
E1109 01:45:44.977870   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:633: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: frontend:redis-slave:
apps.sh:637: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: frontend:redis-slave:
(BE1109 01:45:45.076977   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicaset.apps "frontend" deleted
replicaset.apps "redis-slave" deleted
E1109 01:45:45.163261   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:641: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
apps.sh:646: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
replicaset.apps/frontend created
I1109 01:45:45.496663   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263938-15170", Name:"frontend", UID:"2b0d7030-3669-4baf-863e-75cd85664f9b", APIVersion:"apps/v1", ResourceVersion:"2733", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-k9jds
I1109 01:45:45.500571   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263938-15170", Name:"frontend", UID:"2b0d7030-3669-4baf-863e-75cd85664f9b", APIVersion:"apps/v1", ResourceVersion:"2733", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-wwbd8
I1109 01:45:45.501037   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263938-15170", Name:"frontend", UID:"2b0d7030-3669-4baf-863e-75cd85664f9b", APIVersion:"apps/v1", ResourceVersion:"2733", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-xlsxx
apps.sh:649: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: frontend:
horizontalpodautoscaler.autoscaling/frontend autoscaled
apps.sh:652: Successful get hpa frontend {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 1 2 70
(Bhorizontalpodautoscaler.autoscaling "frontend" deleted
E1109 01:45:45.880910   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
horizontalpodautoscaler.autoscaling/frontend autoscaled
E1109 01:45:45.979072   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:656: Successful get hpa frontend {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 2 3 80
(Bhorizontalpodautoscaler.autoscaling "frontend" deleted
E1109 01:45:46.078073   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Error: required flag(s) "max" not set


Examples:
  # Auto scale a deployment "foo", with the number of pods between 2 and 10, no target CPU utilization specified so a default autoscaling policy will be used:
  kubectl autoscale deployment foo --min=2 --max=10
  
... skipping 18 lines ...

Usage:
  kubectl autoscale (-f FILENAME | TYPE NAME | TYPE/NAME) [--min=MINPODS] --max=MAXPODS [--cpu-percent=CPU] [options]

Use "kubectl options" for a list of global command-line options (applies to all commands).

E1109 01:45:46.164489   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicaset.apps "frontend" deleted
+++ exit code: 0
Recording: run_stateful_set_tests
Running command: run_stateful_set_tests

+++ Running case: test-cmd.run_stateful_set_tests 
... skipping 5 lines ...
+++ [1109 01:45:46] Testing kubectl(v1:statefulsets)
apps.sh:470: Successful get statefulset {{range.items}}{{.metadata.name}}:{{end}}: 
I1109 01:45:46.693022   51457 controller.go:606] quota admission added evaluator for: statefulsets.apps
statefulset.apps/nginx created
apps.sh:476: Successful get statefulset nginx {{.spec.replicas}}: 0
apps.sh:477: Successful get statefulset nginx {{.status.observedGeneration}}: 1
(BE1109 01:45:46.882097   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
statefulset.apps/nginx scaled
I1109 01:45:46.965115   54890 event.go:281] Event(v1.ObjectReference{Kind:"StatefulSet", Namespace:"namespace-1573263946-12549", Name:"nginx", UID:"92aba04e-8c72-4745-9bd8-cc8597c75ed4", APIVersion:"apps/v1", ResourceVersion:"2760", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' create Pod nginx-0 in StatefulSet nginx successful
E1109 01:45:46.980131   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:481: Successful get statefulset nginx {{.spec.replicas}}: 1
(BE1109 01:45:47.079269   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:482: Successful get statefulset nginx {{.status.observedGeneration}}: 2
(BE1109 01:45:47.165850   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
statefulset.apps/nginx restarted
apps.sh:490: Successful get statefulset nginx {{.status.observedGeneration}}: 3
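For reference (not part of the captured output): the observedGeneration moving 1 -> 2 -> 3 above tracks a scale followed by a restart, roughly:
  kubectl scale statefulset nginx --replicas=1
  kubectl rollout restart statefulset/nginx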
(Bstatefulset.apps "nginx" deleted
I1109 01:45:47.471393   54890 stateful_set.go:420] StatefulSet has been deleted namespace-1573263946-12549/nginx
+++ exit code: 0
Recording: run_statefulset_history_tests
... skipping 4 lines ...
+++ command: run_statefulset_history_tests
+++ [1109 01:45:47] Creating namespace namespace-1573263947-32759
namespace/namespace-1573263947-32759 created
Context "test" modified.
+++ [1109 01:45:47] Testing kubectl(v1:statefulsets, v1:controllerrevisions)
apps.sh:418: Successful get statefulset {{range.items}}{{.metadata.name}}:{{end}}: 
(BE1109 01:45:47.883455   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:45:47.981191   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
statefulset.apps/nginx created
E1109 01:45:48.080540   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:422: Successful get controllerrevisions {{range.items}}{{.metadata.annotations}}:{{end}}: map[kubectl.kubernetes.io/last-applied-configuration:{"apiVersion":"apps/v1","kind":"StatefulSet","metadata":{"annotations":{"kubernetes.io/change-cause":"kubectl apply --filename=hack/testdata/rollingupdate-statefulset.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true"},"labels":{"app":"nginx-statefulset"},"name":"nginx","namespace":"namespace-1573263947-32759"},"spec":{"replicas":0,"selector":{"matchLabels":{"app":"nginx-statefulset"}},"serviceName":"nginx","template":{"metadata":{"labels":{"app":"nginx-statefulset"}},"spec":{"containers":[{"command":["sh","-c","while true; do sleep 1; done"],"image":"k8s.gcr.io/nginx-slim:0.7","name":"nginx","ports":[{"containerPort":80,"name":"web"}]}],"terminationGracePeriodSeconds":5}},"updateStrategy":{"type":"RollingUpdate"}}}
 kubernetes.io/change-cause:kubectl apply --filename=hack/testdata/rollingupdate-statefulset.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true]:
(BE1109 01:45:48.167123   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
statefulset.apps/nginx skipped rollback (current template already matches revision 1)
apps.sh:425: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.7:
apps.sh:426: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
statefulset.apps/nginx configured
apps.sh:429: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.8:
apps.sh:430: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/pause:2.0:
apps.sh:431: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 2
E1109 01:45:48.884552   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:432: Successful get controllerrevisions {{range.items}}{{.metadata.annotations}}:{{end}}: map[kubectl.kubernetes.io/last-applied-configuration:{"apiVersion":"apps/v1","kind":"StatefulSet","metadata":{"annotations":{"kubernetes.io/change-cause":"kubectl apply --filename=hack/testdata/rollingupdate-statefulset.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true"},"labels":{"app":"nginx-statefulset"},"name":"nginx","namespace":"namespace-1573263947-32759"},"spec":{"replicas":0,"selector":{"matchLabels":{"app":"nginx-statefulset"}},"serviceName":"nginx","template":{"metadata":{"labels":{"app":"nginx-statefulset"}},"spec":{"containers":[{"command":["sh","-c","while true; do sleep 1; done"],"image":"k8s.gcr.io/nginx-slim:0.7","name":"nginx","ports":[{"containerPort":80,"name":"web"}]}],"terminationGracePeriodSeconds":5}},"updateStrategy":{"type":"RollingUpdate"}}}
 kubernetes.io/change-cause:kubectl apply --filename=hack/testdata/rollingupdate-statefulset.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true]:map[kubectl.kubernetes.io/last-applied-configuration:{"apiVersion":"apps/v1","kind":"StatefulSet","metadata":{"annotations":{"kubernetes.io/change-cause":"kubectl apply --filename=hack/testdata/rollingupdate-statefulset-rv2.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true"},"labels":{"app":"nginx-statefulset"},"name":"nginx","namespace":"namespace-1573263947-32759"},"spec":{"replicas":0,"selector":{"matchLabels":{"app":"nginx-statefulset"}},"serviceName":"nginx","template":{"metadata":{"labels":{"app":"nginx-statefulset"}},"spec":{"containers":[{"command":["sh","-c","while true; do sleep 1; done"],"image":"k8s.gcr.io/nginx-slim:0.8","name":"nginx","ports":[{"containerPort":80,"name":"web"}]},{"image":"k8s.gcr.io/pause:2.0","name":"pause","ports":[{"containerPort":81,"name":"web-2"}]}],"terminationGracePeriodSeconds":5}},"updateStrategy":{"type":"RollingUpdate"}}}
 kubernetes.io/change-cause:kubectl apply --filename=hack/testdata/rollingupdate-statefulset-rv2.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true]:
(BE1109 01:45:48.982235   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
statefulset.apps/nginx will roll back to Pod Template:
  Labels:	app=nginx-statefulset
  Containers:
   nginx:
    Image:	k8s.gcr.io/nginx-slim:0.7
    Port:	80/TCP
... skipping 4 lines ...
      while true; do sleep 1; done
    Environment:	<none>
    Mounts:	<none>
  Volumes:	<none>
 (dry run)
apps.sh:435: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.8:
(BE1109 01:45:49.081746   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:436: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/pause:2.0:
(BE1109 01:45:49.168123   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:437: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 2
statefulset.apps/nginx rolled back
apps.sh:440: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.7:
apps.sh:441: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
Successful
message:error: unable to find specified revision 1000000 in history
has:unable to find specified revision
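For reference (not part of the captured output): the dry-run template, the rollback, and the missing-revision error above correspond to rollout commands of this shape (the revision number is the test's deliberately invalid value):
  kubectl rollout undo statefulset/nginx --dry-run
  kubectl rollout undo statefulset/nginx
  kubectl rollout undo statefulset/nginx --to-revision=1000000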
apps.sh:445: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.7:
apps.sh:446: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
statefulset.apps/nginx rolled back
E1109 01:45:49.886442   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:449: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.8:
(BE1109 01:45:49.983583   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:450: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/pause:2.0:
(BE1109 01:45:50.083120   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:451: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 2
(BE1109 01:45:50.169567   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
statefulset.apps "nginx" deleted
I1109 01:45:50.223869   54890 stateful_set.go:420] StatefulSet has been deleted namespace-1573263947-32759/nginx
+++ exit code: 0
Recording: run_lists_tests
Running command: run_lists_tests

... skipping 14 lines ...
Recording: run_multi_resources_tests
Running command: run_multi_resources_tests

+++ Running case: test-cmd.run_multi_resources_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_multi_resources_tests
E1109 01:45:50.887462   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ [1109 01:45:50] Creating namespace namespace-1573263950-9995
namespace/namespace-1573263950-9995 created
E1109 01:45:50.984686   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Context "test" modified.
+++ [1109 01:45:51] Testing kubectl(v1:multiple resources)
Testing with file hack/testdata/multi-resource-yaml.yaml and replace with file hack/testdata/multi-resource-yaml-modify.yaml
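For reference (not part of the captured output): each "Testing with file X and replace with file Y" case below drives the same create/replace/label/annotate/delete cycle; a sketch, since the exact flags are not shown in this excerpt:
  kubectl create -f hack/testdata/multi-resource-yaml.yaml
  kubectl replace -f hack/testdata/multi-resource-yaml-modify.yaml
  kubectl label -f hack/testdata/multi-resource-yaml-modify.yaml labeled=true
  kubectl annotate -f hack/testdata/multi-resource-yaml-modify.yaml annotated=true
  kubectl delete -f hack/testdata/multi-resource-yaml-modify.yaml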
E1109 01:45:51.084452   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:63: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: 
(BE1109 01:45:51.171027   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:64: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
service/mock created
replicationcontroller/mock created
I1109 01:45:51.406174   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1573263950-9995", Name:"mock", UID:"d1c31d82-e37b-4b14-b5c5-b962e54f7cca", APIVersion:"v1", ResourceVersion:"2825", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-lw84z
generic-resources.sh:72: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: mock:
generic-resources.sh:80: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: mock:
... skipping 19 lines ...
Name:         mock
Namespace:    namespace-1573263950-9995
Selector:     app=mock
Labels:       app=mock
Annotations:  <none>
Replicas:     1 current / 1 desired
Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=mock
  Containers:
   mock-container:
    Image:        k8s.gcr.io/pause:2.0
    Port:         9949/TCP
... skipping 2 lines ...
    Mounts:       <none>
  Volumes:        <none>
Events:
  Type    Reason            Age   From                    Message
  ----    ------            ----  ----                    -------
  Normal  SuccessfulCreate  0s    replication-controller  Created pod: mock-lw84z
E1109 01:45:51.888597   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:45:51.986015   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service "mock" deleted
replicationcontroller "mock" deleted
service/mock replaced
replicationcontroller/mock replaced
I1109 01:45:52.040395   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1573263950-9995", Name:"mock", UID:"3bbfc138-127c-447f-8607-bb513f1a7f6c", APIVersion:"v1", ResourceVersion:"2842", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-jjvn7
E1109 01:45:52.085700   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:96: Successful get services mock {{.metadata.labels.status}}: replaced
(BE1109 01:45:52.172340   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:102: Successful get rc mock {{.metadata.labels.status}}: replaced
service/mock edited
replicationcontroller/mock edited
generic-resources.sh:114: Successful get services mock {{.metadata.labels.status}}: edited
generic-resources.sh:120: Successful get rc mock {{.metadata.labels.status}}: edited
service/mock labeled
replicationcontroller/mock labeled
generic-resources.sh:134: Successful get services mock {{.metadata.labels.labeled}}: true
generic-resources.sh:140: Successful get rc mock {{.metadata.labels.labeled}}: true
(BE1109 01:45:52.889367   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/mock annotated
replicationcontroller/mock annotated
E1109 01:45:52.987219   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:153: Successful get services mock {{.metadata.annotations.annotated}}: true
(BE1109 01:45:53.086711   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:159: Successful get rc mock {{.metadata.annotations.annotated}}: true
(Bservice "mock" deleted
replicationcontroller "mock" deleted
E1109 01:45:53.173270   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Testing with file hack/testdata/multi-resource-list.json and replace with file hack/testdata/multi-resource-list-modify.json
generic-resources.sh:63: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: 
generic-resources.sh:64: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
service/mock created
replicationcontroller/mock created
I1109 01:45:53.506149   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1573263950-9995", Name:"mock", UID:"4e418fd3-ea44-4141-a812-454700453727", APIVersion:"v1", ResourceVersion:"2868", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-qwh4l
generic-resources.sh:72: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: mock:
generic-resources.sh:80: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: mock:
NAME           TYPE        CLUSTER-IP   EXTERNAL-IP   PORT(S)   AGE
service/mock   ClusterIP   10.0.0.254   <none>        99/TCP    0s

NAME                         DESIRED   CURRENT   READY   AGE
replicationcontroller/mock   1         1         0       0s
E1109 01:45:53.890591   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Name:              mock
Namespace:         namespace-1573263950-9995
Labels:            app=mock
Annotations:       <none>
Selector:          app=mock
Type:              ClusterIP
... skipping 8 lines ...
Name:         mock
Namespace:    namespace-1573263950-9995
Selector:     app=mock
Labels:       app=mock
Annotations:  <none>
Replicas:     1 current / 1 desired
Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=mock
  Containers:
   mock-container:
    Image:        k8s.gcr.io/pause:2.0
    Port:         9949/TCP
... skipping 2 lines ...
    Mounts:       <none>
  Volumes:        <none>
Events:
  Type    Reason            Age   From                    Message
  ----    ------            ----  ----                    -------
  Normal  SuccessfulCreate  0s    replication-controller  Created pod: mock-qwh4l
E1109 01:45:53.988464   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:45:54.088069   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service "mock" deleted
replicationcontroller "mock" deleted
service/mock replaced
replicationcontroller/mock replaced
I1109 01:45:54.122993   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1573263950-9995", Name:"mock", UID:"613f3aa6-6f17-4870-b1d4-b9b33944c504", APIVersion:"v1", ResourceVersion:"2884", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-b6cl7
E1109 01:45:54.174488   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:96: Successful get services mock {{.metadata.labels.status}}: replaced
generic-resources.sh:102: Successful get rc mock {{.metadata.labels.status}}: replaced
service/mock edited
replicationcontroller/mock edited
generic-resources.sh:114: Successful get services mock {{.metadata.labels.status}}: edited
generic-resources.sh:120: Successful get rc mock {{.metadata.labels.status}}: edited
service/mock labeled
replicationcontroller/mock labeled
generic-resources.sh:134: Successful get services mock {{.metadata.labels.labeled}}: true
(BE1109 01:45:54.891373   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:140: Successful get rc mock {{.metadata.labels.labeled}}: true
service/mock annotated
E1109 01:45:54.989298   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/mock annotated
generic-resources.sh:153: Successful get services mock {{.metadata.annotations.annotated}}: true
(BE1109 01:45:55.089224   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:159: Successful get rc mock {{.metadata.annotations.annotated}}: true
(BE1109 01:45:55.175724   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service "mock" deleted
replicationcontroller "mock" deleted
Testing with file hack/testdata/multi-resource-json.json and replace with file hack/testdata/multi-resource-json-modify.json
generic-resources.sh:63: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: 
generic-resources.sh:64: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
service/mock created
... skipping 3 lines ...
generic-resources.sh:80: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: mock:
NAME           TYPE        CLUSTER-IP   EXTERNAL-IP   PORT(S)   AGE
service/mock   ClusterIP   10.0.0.3     <none>        99/TCP    0s

NAME                         DESIRED   CURRENT   READY   AGE
replicationcontroller/mock   1         1         0       0s
E1109 01:45:55.892552   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:45:55.990463   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Name:              mock
Namespace:         namespace-1573263950-9995
Labels:            app=mock
Annotations:       <none>
Selector:          app=mock
Type:              ClusterIP
... skipping 8 lines ...
Name:         mock
Namespace:    namespace-1573263950-9995
Selector:     app=mock
Labels:       app=mock
Annotations:  <none>
Replicas:     1 current / 1 desired
Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=mock
  Containers:
   mock-container:
    Image:        k8s.gcr.io/pause:2.0
    Port:         9949/TCP
... skipping 2 lines ...
    Mounts:       <none>
  Volumes:        <none>
Events:
  Type    Reason            Age   From                    Message
  ----    ------            ----  ----                    -------
  Normal  SuccessfulCreate  1s    replication-controller  Created pod: mock-br2bp
E1109 01:45:56.090580   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:45:56.176849   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service "mock" deleted
replicationcontroller "mock" deleted
service/mock replaced
replicationcontroller/mock replaced
I1109 01:45:56.215766   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1573263950-9995", Name:"mock", UID:"d698596e-ce66-46c9-990c-eaa0a23b9764", APIVersion:"v1", ResourceVersion:"2930", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-kxm78
generic-resources.sh:96: Successful get services mock {{.metadata.labels.status}}: replaced
generic-resources.sh:102: Successful get rc mock {{.metadata.labels.status}}: replaced
service/mock edited
replicationcontroller/mock edited
generic-resources.sh:114: Successful get services mock {{.metadata.labels.status}}: edited
generic-resources.sh:120: Successful get rc mock {{.metadata.labels.status}}: edited
service/mock labeled
replicationcontroller/mock labeled
E1109 01:45:56.893812   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:134: Successful get services mock {{.metadata.labels.labeled}}: true
(BE1109 01:45:56.991669   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:140: Successful get rc mock {{.metadata.labels.labeled}}: true
(BE1109 01:45:57.091532   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/mock annotated
replicationcontroller/mock annotated
E1109 01:45:57.178119   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:153: Successful get services mock {{.metadata.annotations.annotated}}: true
generic-resources.sh:159: Successful get rc mock {{.metadata.annotations.annotated}}: true
service "mock" deleted
replicationcontroller "mock" deleted
Testing with file hack/testdata/multi-resource-rclist.json and replace with file hack/testdata/multi-resource-rclist-modify.json
generic-resources.sh:63: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: 
... skipping 3 lines ...
I1109 01:45:57.693576   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1573263950-9995", Name:"mock", UID:"d36e81a3-c64c-441f-86b7-3903c4a263e4", APIVersion:"v1", ResourceVersion:"2953", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-hmvt2
I1109 01:45:57.694499   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1573263950-9995", Name:"mock2", UID:"dc64d34f-c390-4b7f-86a8-9a9929efbaa6", APIVersion:"v1", ResourceVersion:"2954", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock2-88hnl
generic-resources.sh:78: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: mock:mock2:
NAME    DESIRED   CURRENT   READY   AGE
mock    1         1         0       0s
mock2   1         1         0       0s
E1109 01:45:57.895170   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:45:57.992828   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Name:         mock
Namespace:    namespace-1573263950-9995
Selector:     app=mock
Labels:       app=mock
              status=replaced
Annotations:  <none>
Replicas:     1 current / 1 desired
Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=mock
  Containers:
   mock-container:
    Image:        k8s.gcr.io/pause:2.0
    Port:         9949/TCP
... skipping 11 lines ...
Namespace:    namespace-1573263950-9995
Selector:     app=mock2
Labels:       app=mock2
              status=replaced
Annotations:  <none>
Replicas:     1 current / 1 desired
Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=mock2
  Containers:
   mock-container:
    Image:        k8s.gcr.io/pause:2.0
    Port:         9949/TCP
... skipping 2 lines ...
    Mounts:       <none>
  Volumes:        <none>
Events:
  Type    Reason            Age   From                    Message
  ----    ------            ----  ----                    -------
  Normal  SuccessfulCreate  1s    replication-controller  Created pod: mock2-88hnl
E1109 01:45:58.092716   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:45:58.179242   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller "mock" deleted
replicationcontroller "mock2" deleted
replicationcontroller/mock replaced
I1109 01:45:58.210515   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1573263950-9995", Name:"mock", UID:"74019f32-a402-49fd-8428-57f2050a264b", APIVersion:"v1", ResourceVersion:"2969", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-g49bs
replicationcontroller/mock2 replaced
I1109 01:45:58.213395   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1573263950-9995", Name:"mock2", UID:"37f84c9c-7c17-4861-a231-6b8853fc66b6", APIVersion:"v1", ResourceVersion:"2971", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock2-nk6rg
... skipping 2 lines ...
replicationcontroller/mock edited
replicationcontroller/mock2 edited
generic-resources.sh:120: Successful get rc mock {{.metadata.labels.status}}: edited
generic-resources.sh:122: Successful get rc mock2 {{.metadata.labels.status}}: edited
replicationcontroller/mock labeled
replicationcontroller/mock2 labeled
E1109 01:45:58.896356   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:140: Successful get rc mock {{.metadata.labels.labeled}}: true
(BE1109 01:45:58.994086   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:142: Successful get rc mock2 {{.metadata.labels.labeled}}: true
replicationcontroller/mock annotated
replicationcontroller/mock2 annotated
E1109 01:45:59.093717   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:159: Successful get rc mock {{.metadata.annotations.annotated}}: true
(BE1109 01:45:59.180390   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:161: Successful get rc mock2 {{.metadata.annotations.annotated}}: true
(Breplicationcontroller "mock" deleted
replicationcontroller "mock2" deleted
Testing with file hack/testdata/multi-resource-svclist.json and replace with file hack/testdata/multi-resource-svclist-modify.json
generic-resources.sh:63: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: 
generic-resources.sh:64: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
service/mock created
service/mock2 created
generic-resources.sh:70: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: mock:mock2:
NAME    TYPE        CLUSTER-IP   EXTERNAL-IP   PORT(S)   AGE
mock    ClusterIP   10.0.0.198   <none>        99/TCP    0s
mock2   ClusterIP   10.0.0.41    <none>        99/TCP    0s
E1109 01:45:59.897675   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:45:59.995102   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Name:              mock
Namespace:         namespace-1573263950-9995
Labels:            app=mock
Annotations:       <none>
Selector:          app=mock
Type:              ClusterIP
... skipping 14 lines ...
IP:                10.0.0.41
Port:              <unset>  99/TCP
TargetPort:        9949/TCP
Endpoints:         <none>
Session Affinity:  None
Events:            <none>
E1109 01:46:00.096780   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service "mock" deleted
E1109 01:46:00.181530   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service "mock2" deleted
service/mock replaced
service/mock2 replaced
generic-resources.sh:96: Successful get services mock {{.metadata.labels.status}}: replaced
generic-resources.sh:98: Successful get services mock2 {{.metadata.labels.status}}: replaced
service/mock edited
service/mock2 edited
I1109 01:46:00.662126   54890 horizontal.go:341] Horizontal Pod Autoscaler frontend has been deleted in namespace-1573263938-15170
generic-resources.sh:114: Successful get services mock {{.metadata.labels.status}}: edited
generic-resources.sh:116: Successful get services mock2 {{.metadata.labels.status}}: edited
service/mock labeled
service/mock2 labeled
E1109 01:46:00.898902   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:134: Successful get services mock {{.metadata.labels.labeled}}: true
(BE1109 01:46:00.996523   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:136: Successful get services mock2 {{.metadata.labels.labeled}}: true
(BE1109 01:46:01.097964   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/mock annotated
service/mock2 annotated
E1109 01:46:01.183152   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:153: Successful get services mock {{.metadata.annotations.annotated}}: true
generic-resources.sh:155: Successful get services mock2 {{.metadata.annotations.annotated}}: true
service "mock" deleted
service "mock2" deleted
generic-resources.sh:173: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: 
generic-resources.sh:174: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
(BE1109 01:46:01.900544   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/mock created
replicationcontroller/mock created
I1109 01:46:01.907676   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1573263950-9995", Name:"mock", UID:"24f307aa-fcce-4a46-b377-e43e5a0c61ca", APIVersion:"v1", ResourceVersion:"3044", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-hvtc4
E1109 01:46:01.997811   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:180: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: mock:
generic-resources.sh:181: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: mock:
(BE1109 01:46:02.099297   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:46:02.184508   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service "mock" deleted
replicationcontroller "mock" deleted
generic-resources.sh:187: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: 
generic-resources.sh:188: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
+++ exit code: 0
Recording: run_persistent_volumes_tests
... skipping 4 lines ...
+++ command: run_persistent_volumes_tests
+++ [1109 01:46:02] Creating namespace namespace-1573263962-25262
namespace/namespace-1573263962-25262 created
Context "test" modified.
+++ [1109 01:46:02] Testing persistent volumes
storage.sh:30: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: 
(BE1109 01:46:02.901775   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
persistentvolume/pv0001 created
E1109 01:46:02.954005   54890 pv_protection_controller.go:116] PV pv0001 failed with : Operation cannot be fulfilled on persistentvolumes "pv0001": the object has been modified; please apply your changes to the latest version and try again
E1109 01:46:02.999018   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
storage.sh:33: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: pv0001:
(BE1109 01:46:03.100607   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
persistentvolume "pv0001" deleted
E1109 01:46:03.185637   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
persistentvolume/pv0002 created
storage.sh:36: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: pv0002:
persistentvolume "pv0002" deleted
persistentvolume/pv0003 created
E1109 01:46:03.599506   54890 pv_protection_controller.go:116] PV pv0003 failed with : Operation cannot be fulfilled on persistentvolumes "pv0003": the object has been modified; please apply your changes to the latest version and try again
storage.sh:39: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: pv0003:
persistentvolume "pv0003" deleted
storage.sh:42: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: 
E1109 01:46:03.903008   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:46:04.000258   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
persistentvolume/pv0001 created
E1109 01:46:04.102462   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
storage.sh:45: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: pv0001:
E1109 01:46:04.186858   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:warning: deleting cluster-scoped resources, not scoped to the provided namespace
persistentvolume "pv0001" deleted
has:warning: deleting cluster-scoped resources
Successful
message:warning: deleting cluster-scoped resources, not scoped to the provided namespace
... skipping 13 lines ...
+++ [1109 01:46:04] Testing persistent volumes claims
storage.sh:64: Successful get pvc {{range.items}}{{.metadata.name}}:{{end}}: 
persistentvolumeclaim/myclaim-1 created
I1109 01:46:04.768156   54890 event.go:281] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1573263964-31739", Name:"myclaim-1", UID:"74724408-e894-49c9-a130-42ebcf6d5203", APIVersion:"v1", ResourceVersion:"3083", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
I1109 01:46:04.773042   54890 event.go:281] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1573263964-31739", Name:"myclaim-1", UID:"74724408-e894-49c9-a130-42ebcf6d5203", APIVersion:"v1", ResourceVersion:"3085", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
storage.sh:67: Successful get pvc {{range.items}}{{.metadata.name}}:{{end}}: myclaim-1:
E1109 01:46:04.904294   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
persistentvolumeclaim "myclaim-1" deleted
I1109 01:46:04.936146   54890 event.go:281] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1573263964-31739", Name:"myclaim-1", UID:"74724408-e894-49c9-a130-42ebcf6d5203", APIVersion:"v1", ResourceVersion:"3087", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
E1109 01:46:05.001457   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I1109 01:46:05.102058   54890 event.go:281] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1573263964-31739", Name:"myclaim-2", UID:"ad2c4086-8019-4e9c-8932-966b0e4f471c", APIVersion:"v1", ResourceVersion:"3092", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
persistentvolumeclaim/myclaim-2 created
E1109 01:46:05.103734   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I1109 01:46:05.105935   54890 event.go:281] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1573263964-31739", Name:"myclaim-2", UID:"ad2c4086-8019-4e9c-8932-966b0e4f471c", APIVersion:"v1", ResourceVersion:"3094", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
E1109 01:46:05.188031   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
storage.sh:71: Successful get pvc {{range.items}}{{.metadata.name}}:{{end}}: myclaim-2:
persistentvolumeclaim "myclaim-2" deleted
I1109 01:46:05.269623   54890 event.go:281] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1573263964-31739", Name:"myclaim-2", UID:"ad2c4086-8019-4e9c-8932-966b0e4f471c", APIVersion:"v1", ResourceVersion:"3096", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
persistentvolumeclaim/myclaim-3 created
I1109 01:46:05.436699   54890 event.go:281] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1573263964-31739", Name:"myclaim-3", UID:"08ea5c26-7855-4cec-ba4c-b445ac814b76", APIVersion:"v1", ResourceVersion:"3099", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
I1109 01:46:05.441692   54890 event.go:281] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1573263964-31739", Name:"myclaim-3", UID:"08ea5c26-7855-4cec-ba4c-b445ac814b76", APIVersion:"v1", ResourceVersion:"3101", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
... skipping 7 lines ...

+++ Running case: test-cmd.run_storage_class_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_storage_class_tests
+++ [1109 01:46:05] Testing storage class
storage.sh:92: Successful get storageclass {{range.items}}{{.metadata.name}}:{{end}}: 
E1109 01:46:05.905544   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:46:06.002647   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
storageclass.storage.k8s.io/storage-class-name created
E1109 01:46:06.104843   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
storage.sh:108: Successful get storageclass {{range.items}}{{.metadata.name}}:{{end}}: storage-class-name:
E1109 01:46:06.189296   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
storage.sh:109: Successful get sc {{range.items}}{{.metadata.name}}:{{end}}: storage-class-name:
storageclass.storage.k8s.io "storage-class-name" deleted
storage.sh:112: Successful get storageclass {{range.items}}{{.metadata.name}}:{{end}}: 
+++ exit code: 0
Recording: run_nodes_tests
Running command: run_nodes_tests
... skipping 146 lines ...
  Resource           Requests  Limits
  --------           --------  ------
  cpu                0 (0%)    0 (0%)
  memory             0 (0%)    0 (0%)
  ephemeral-storage  0 (0%)    0 (0%)
E1109 01:46:06.906837   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1377: Successful describe
Name:               127.0.0.1
Roles:              <none>
Labels:             <none>
Annotations:        node.alpha.kubernetes.io/ttl: 0
CreationTimestamp:  Sat, 09 Nov 2019 01:41:53 +0000
... skipping 35 lines ...
  --------           --------  ------
  cpu                0 (0%)    0 (0%)
  memory             0 (0%)    0 (0%)
  ephemeral-storage  0 (0%)    0 (0%)
Events:              <none>
E1109 01:46:07.003863   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
matched Name:
matched Labels:
matched CreationTimestamp:
matched Conditions:
matched Addresses:
matched Capacity:
... skipping 41 lines ...
  Resource           Requests  Limits
  --------           --------  ------
  cpu                0 (0%)    0 (0%)
  memory             0 (0%)    0 (0%)
  ephemeral-storage  0 (0%)    0 (0%)
Events:              <none>
E1109 01:46:07.106090   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful describe
Name:               127.0.0.1
Roles:              <none>
Labels:             <none>
Annotations:        node.alpha.kubernetes.io/ttl: 0
CreationTimestamp:  Sat, 09 Nov 2019 01:41:53 +0000
... skipping 34 lines ...
  Resource           Requests  Limits
  --------           --------  ------
  cpu                0 (0%)    0 (0%)
  memory             0 (0%)    0 (0%)
  ephemeral-storage  0 (0%)    0 (0%)
Events:              <none>
E1109 01:46:07.190665   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful describe
Name:               127.0.0.1
Roles:              <none>
Labels:             <none>
Annotations:        node.alpha.kubernetes.io/ttl: 0
CreationTimestamp:  Sat, 09 Nov 2019 01:41:53 +0000
... skipping 85 lines ...
core.sh:1389: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
node/127.0.0.1 patched
core.sh:1392: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: true
node/127.0.0.1 patched
core.sh:1395: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
tokenreview.authentication.k8s.io/<unknown> created
E1109 01:46:07.908084   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
tokenreview.authentication.k8s.io/<unknown> created
+++ exit code: 0
Recording: run_authorization_tests
Running command: run_authorization_tests
E1109 01:46:08.005125   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource

+++ Running case: test-cmd.run_authorization_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_authorization_tests
+++ [1109 01:46:08] Testing authorization
subjectaccessreview.authorization.k8s.io/<unknown> created
E1109 01:46:08.107441   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
subjectaccessreview.authorization.k8s.io/<unknown> created
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed

  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100   826  100   524  100   302   127k  75500 --:--:-- --:--:-- --:--:--  201k
E1109 01:46:08.191752   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ [1109 01:46:08] "authorization.k8s.io/subjectaccessreviews" returns as expected: {
  "kind": "SubjectAccessReview",
  "apiVersion": "authorization.k8s.io/v1beta1",
  "metadata": {
    "creationTimestamp": null
  },
... skipping 52 lines ...
yes
has:the server doesn't have a resource type
Successful
message:yes
has:yes
Successful
message:error: --subresource can not be used with NonResourceURL
has:subresource can not be used with NonResourceURL
Successful
E1109 01:46:08.909227   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:yes
0
has:0
E1109 01:46:09.006310   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:0
has:0
E1109 01:46:09.108844   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:yes
has not:Warning
E1109 01:46:09.193166   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:Warning: the server doesn't have a resource type 'foo'
yes
has:Warning: the server doesn't have a resource type 'foo'
Successful
message:Warning: the server doesn't have a resource type 'foo'
... skipping 23 lines ...
		{Kind:Group APIGroup:rbac.authorization.k8s.io Name:system:masters Namespace:}
role.rbac.authorization.k8s.io/testing-R reconciled
	reconciliation required create
	missing rules added:
		{Verbs:[get list watch] APIGroups:[] Resources:[configmaps] ResourceNames:[] NonResourceURLs:[]}
legacy-script.sh:821: Successful get rolebindings -n some-other-random -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}: testing-RB:
E1109 01:46:09.910510   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
legacy-script.sh:822: Successful get roles -n some-other-random -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}: testing-R:
E1109 01:46:10.007591   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
legacy-script.sh:823: Successful get clusterrolebindings -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}: testing-CRB:
E1109 01:46:10.110015   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
legacy-script.sh:824: Successful get clusterroles -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}: testing-CR:
Successful
message:error: only rbac.authorization.k8s.io/v1 is supported: not *v1beta1.ClusterRole
has:only rbac.authorization.k8s.io/v1 is supported
E1109 01:46:10.194701   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
rolebinding.rbac.authorization.k8s.io "testing-RB" deleted
role.rbac.authorization.k8s.io "testing-R" deleted
warning: deleting cluster-scoped resources, not scoped to the provided namespace
clusterrole.rbac.authorization.k8s.io "testing-CR" deleted
clusterrolebinding.rbac.authorization.k8s.io "testing-CRB" deleted
Recording: run_retrieve_multiple_tests
... skipping 16 lines ...
namespace/namespace-1573263970-25006 created
Context "test" modified.
+++ [1109 01:46:10] Testing resource aliasing
replicationcontroller/cassandra created
I1109 01:46:10.886554   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1573263970-25006", Name:"cassandra", UID:"cb412567-3368-46f4-be74-1d06605c6b3d", APIVersion:"v1", ResourceVersion:"3129", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: cassandra-gwzg5
I1109 01:46:10.890276   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1573263970-25006", Name:"cassandra", UID:"cb412567-3368-46f4-be74-1d06605c6b3d", APIVersion:"v1", ResourceVersion:"3129", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: cassandra-vgs77
E1109 01:46:10.911578   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:46:11.008823   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/cassandra created
E1109 01:46:11.111392   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
discovery.sh:89: Successful get all -l'app=cassandra' {{range.items}}{{range .metadata.labels}}{{.}}:{{end}}{{end}}: cassandra:cassandra:cassandra:cassandra:
E1109 01:46:11.196989   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I1109 01:46:11.285078   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1573263970-25006", Name:"cassandra", UID:"cb412567-3368-46f4-be74-1d06605c6b3d", APIVersion:"v1", ResourceVersion:"3135", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: cassandra-zfj5q
pod "cassandra-gwzg5" deleted
pod "cassandra-vgs77" deleted
I1109 01:46:11.297702   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1573263970-25006", Name:"cassandra", UID:"cb412567-3368-46f4-be74-1d06605c6b3d", APIVersion:"v1", ResourceVersion:"3148", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: cassandra-8f8ls
replicationcontroller "cassandra" deleted
service "cassandra" deleted
... skipping 76 lines ...

FIELD:    message <string>

DESCRIPTION:
     A human readable message indicating details about why the pod is in this
     condition.
E1109 01:46:11.912659   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:46:12.010154   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
KIND:     CronJob
VERSION:  batch/v1beta1

DESCRIPTION:
     CronJob represents the configuration of a single cron job.

... skipping 23 lines ...
     Current status of a cron job. More info:
     https://git.k8s.io/community/contributors/devel/sig-architecture/api-conventions.md#spec-and-status

+++ exit code: 0
Recording: run_swagger_tests
Running command: run_swagger_tests
E1109 01:46:12.112707   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource

+++ Running case: test-cmd.run_swagger_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_swagger_tests
+++ [1109 01:46:12] Testing swagger
E1109 01:46:12.198259   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ exit code: 0
Recording: run_kubectl_sort_by_tests
Running command: run_kubectl_sort_by_tests

+++ Running case: test-cmd.run_kubectl_sort_by_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
... skipping 6 lines ...
pod/valid-pod created
get.sh:268: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
Successful
message:NAME        READY   STATUS    RESTARTS   AGE
valid-pod   0/1     Pending   0          0s
has:valid-pod
E1109 01:46:12.913777   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:I1109 01:46:12.963231   86558 loader.go:375] Config loaded from file:  /tmp/tmp.JlPNjhJbR0/.kube/config
I1109 01:46:12.970544   86558 round_trippers.go:420] GET http://localhost:8080/api/v1/namespaces/namespace-1573263970-25006/pods?includeObject=Object
I1109 01:46:12.970592   86558 round_trippers.go:427] Request Headers:
I1109 01:46:12.970618   86558 round_trippers.go:431]     User-Agent: kubectl/v1.18.0 (linux/amd64) kubernetes/ff78676
I1109 01:46:12.970629   86558 round_trippers.go:431]     Accept: application/json;as=Table;v=v1beta1;g=meta.k8s.io, application/json
... skipping 18 lines ...
I1109 01:46:12.974057   86558 round_trippers.go:452]     Date: Sat, 09 Nov 2019 01:46:12 GMT
I1109 01:46:12.974069   86558 round_trippers.go:452]     Cache-Control: no-cache, private
I1109 01:46:12.974172   86558 request.go:989] Response Body: {"kind":"Table","apiVersion":"meta.k8s.io/v1beta1","metadata":{"selfLink":"/api/v1/namespaces/namespace-1573263970-25006/pods","resourceVersion":"3162"},"columnDefinitions":[{"name":"Name","type":"string","format":"name","description":"Name must be unique within a namespace. Is required when creating resources, although some resources may allow a client to request the generation of an appropriate name automatically. Name is primarily intended for creation idempotence and configuration definition. Cannot be updated. More info: http://kubernetes.io/docs/user-guide/identifiers#names","priority":0},{"name":"Ready","type":"string","format":"","description":"The aggregate readiness state of this pod for accepting traffic.","priority":0},{"name":"Status","type":"string","format":"","description":"The aggregate status of the containers in this pod.","priority":0},{"name":"Restarts","type":"integer","format":"","description":"The number of times the containers in this pod have been restarted.","priority":0},{"name":"A [truncated 2913 chars]
NAME        READY   STATUS    RESTARTS   AGE
valid-pod   0/1     Pending   0          0s
has:includeObject=Object
E1109 01:46:13.011358   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
get.sh:279: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
E1109 01:46:13.011358   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
pod "valid-pod" force deleted
E1109 01:46:13.199560   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
get.sh:283: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
get.sh:288: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
pod/sorted-pod1 created
get.sh:292: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: sorted-pod1:
pod/sorted-pod2 created
get.sh:296: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: sorted-pod1:sorted-pod2:
E1109 01:46:13.915017   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
pod/sorted-pod3 created
E1109 01:46:14.012634   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
get.sh:300: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: sorted-pod1:sorted-pod2:sorted-pod3:
E1109 01:46:14.120484   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:sorted-pod1:sorted-pod2:sorted-pod3:
has:sorted-pod1:sorted-pod2:sorted-pod3:
E1109 01:46:14.200831   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:sorted-pod3:sorted-pod2:sorted-pod1:
has:sorted-pod3:sorted-pod2:sorted-pod1:
Successful
message:sorted-pod2:sorted-pod1:sorted-pod3:
has:sorted-pod2:sorted-pod1:sorted-pod3:
... skipping 31 lines ...
Running command: run_kubectl_all_namespace_tests

+++ Running case: test-cmd.run_kubectl_all_namespace_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_kubectl_all_namespace_tests
+++ [1109 01:46:14] Testing kubectl --all-namespace
E1109 01:46:14.916164   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
get.sh:342: Successful get namespaces {{range.items}}{{if eq .metadata.name \"default\"}}{{.metadata.name}}:{{end}}{{end}}: default:
get.sh:346: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
E1109 01:46:15.013837   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:46:15.121728   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
pod/valid-pod created
E1109 01:46:15.202103   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
get.sh:350: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
NAMESPACE                    NAME        READY   STATUS    RESTARTS   AGE
namespace-1573263970-25006   valid-pod   0/1     Pending   0          0s
namespace/all-ns-test-1 created
serviceaccount/test created
namespace/all-ns-test-2 created
... skipping 120 lines ...
namespace-1573263962-25262   default   0         13s
namespace-1573263964-31739   default   0         11s
namespace-1573263970-25006   default   0         5s
some-other-random            default   0         6s
has:all-ns-test-2
namespace "all-ns-test-1" deleted
E1109 01:46:15.917453   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:46:16.014982   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:46:16.122994   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:46:16.203513   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:46:16.918629   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:46:17.016219   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:46:17.124265   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:46:17.204902   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:46:17.919859   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:46:18.017581   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:46:18.125622   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:46:18.206231   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:46:18.921049   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:46:19.019222   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:46:19.126932   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:46:19.207480   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:46:19.922175   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:46:20.020462   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:46:20.128326   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:46:20.209093   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:46:20.923054   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:46:21.021697   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace "all-ns-test-2" deleted
E1109 01:46:21.129708   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:46:21.210562   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:46:21.924319   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:46:22.022968   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:46:22.130977   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:46:22.211908   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:46:22.925559   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:46:23.024261   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:46:23.132337   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:46:23.213264   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:46:23.926685   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:46:24.025319   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:46:24.133495   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:46:24.214570   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:46:24.928754   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:46:25.026515   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:46:25.134745   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:46:25.215909   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:46:25.930054   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I1109 01:46:25.967532   54890 namespace_controller.go:185] Namespace has been deleted all-ns-test-1
E1109 01:46:26.027895   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:46:26.135857   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:46:26.217153   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
get.sh:376: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
pod "valid-pod" force deleted
get.sh:380: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
get.sh:384: Successful get nodes {{range.items}}{{.metadata.name}}:{{end}}: 127.0.0.1:
Successful
... skipping 9 lines ...
+++ command: run_template_output_tests
+++ [1109 01:46:26] Testing --template support on commands
+++ [1109 01:46:26] Creating namespace namespace-1573263986-4880
namespace/namespace-1573263986-4880 created
Context "test" modified.
template-output.sh:29: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
E1109 01:46:26.931420   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:46:27.029211   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
pod/valid-pod created
{
    "apiVersion": "v1",
    "items": [
        {
            "apiVersion": "v1",
... skipping 46 lines ...
    "kind": "List",
    "metadata": {
        "resourceVersion": "",
        "selfLink": ""
    }
}
E1109 01:46:27.136950   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
template-output.sh:35: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
E1109 01:46:27.218467   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:valid-pod:
has:valid-pod:
Successful
message:valid-pod:
has:valid-pod:
... skipping 9 lines ...
Successful
message:scale-1:
has:scale-1:
Successful
message:redis-slave:
has:redis-slave:
E1109 01:46:27.932544   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
kubectl convert is DEPRECATED and will be removed in a future version.
In order to convert, kubectl apply the object to the cluster, then kubectl get at the desired version.
Successful
message:nginx:
has:nginx:
E1109 01:46:28.030653   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
kubectl run --generator=job/v1 is DEPRECATED and will be removed in a future version. Use kubectl run --generator=run-pod/v1 or kubectl create instead.
Successful
message:pi:
has:pi:
E1109 01:46:28.138152   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:127.0.0.1:
has:127.0.0.1:
E1109 01:46:28.219754   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node/127.0.0.1 untainted
replicationcontroller/cassandra created
I1109 01:46:28.401215   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1573263986-4880", Name:"cassandra", UID:"a41ee27d-a0ad-4d8f-8dfd-e89c75ffbdf2", APIVersion:"v1", ResourceVersion:"3216", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: cassandra-x24jb
I1109 01:46:28.405390   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1573263986-4880", Name:"cassandra", UID:"a41ee27d-a0ad-4d8f-8dfd-e89c75ffbdf2", APIVersion:"v1", ResourceVersion:"3216", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: cassandra-9jsnm
Successful
message:cassandra:
... skipping 24 lines ...
has:cm:
I1109 01:46:28.897352   54890 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1573263986-4880", Name:"deploy", UID:"479ec25b-e79d-4b44-b759-fa66215f7e2e", APIVersion:"apps/v1", ResourceVersion:"3225", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set deploy-74bcc58696 to 1
I1109 01:46:28.900042   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263986-4880", Name:"deploy-74bcc58696", UID:"4c49dfb1-3310-44fc-aec0-37d91ac21869", APIVersion:"apps/v1", ResourceVersion:"3226", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: deploy-74bcc58696-8r6dn
Successful
message:deploy:
has:deploy:
E1109 01:46:28.933639   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:46:29.031865   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
cronjob.batch/pi created
E1109 01:46:29.139157   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:foo:
has:foo:
E1109 01:46:29.221174   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:bar:
has:bar:
Successful
message:foo:
has:foo:
... skipping 18 lines ...
Successful
message:valid-pod:
has:valid-pod:
Successful
message:valid-pod:
has:valid-pod:
E1109 01:46:29.934757   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:kubernetes:
has:kubernetes:
E1109 01:46:30.033241   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:valid-pod:
has:valid-pod:
E1109 01:46:30.140557   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:foo:
has:foo:
E1109 01:46:30.222582   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:foo:
has:foo:
Successful
message:foo:
has:foo:
... skipping 30 lines ...
preferences: {}
users: null
has:kind: Config
Successful
message:deploy:
has:deploy:
E1109 01:46:30.937111   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:deploy:
has:deploy:
E1109 01:46:31.034470   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:deploy:
has:deploy:
E1109 01:46:31.141653   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I1109 01:46:31.147565   54890 namespace_controller.go:185] Namespace has been deleted all-ns-test-2
Successful
message:deploy:
has:deploy:
E1109 01:46:31.223954   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:Config:
has:Config
Successful
message:apiVersion: v1
kind: ConfigMap
... skipping 17 lines ...
Recording: run_certificates_tests
Running command: run_certificates_tests

+++ Running case: test-cmd.run_certificates_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_certificates_tests
E1109 01:46:31.938328   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ [1109 01:46:31] Testing certificates
E1109 01:46:32.035745   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificatesigningrequest.certificates.k8s.io/foo created
E1109 01:46:32.142968   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificate.sh:29: Successful get csr/foo {{range.status.conditions}}{{.type}}{{end}}: 
E1109 01:46:32.225091   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificatesigningrequest.certificates.k8s.io/foo approved
{
    "apiVersion": "v1",
    "items": [
        {
            "apiVersion": "certificates.k8s.io/v1beta1",
... skipping 38 lines ...
certificate.sh:32: Successful get csr/foo {{range.status.conditions}}{{.type}}{{end}}: Approved
certificatesigningrequest.certificates.k8s.io "foo" deleted
certificate.sh:34: Successful get csr {{range.items}}{{.metadata.name}}{{end}}: 
certificatesigningrequest.certificates.k8s.io/foo created
certificate.sh:37: Successful get csr/foo {{range.status.conditions}}{{.type}}{{end}}: 
certificatesigningrequest.certificates.k8s.io/foo approved
E1109 01:46:32.939565   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
{
    "apiVersion": "v1",
    "items": [
        {
            "apiVersion": "certificates.k8s.io/v1beta1",
            "kind": "CertificateSigningRequest",
... skipping 31 lines ...
    "kind": "List",
    "metadata": {
        "resourceVersion": "",
        "selfLink": ""
    }
}
E1109 01:46:33.036929   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificate.sh:40: Successful get csr/foo {{range.status.conditions}}{{.type}}{{end}}: Approved
E1109 01:46:33.144283   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificatesigningrequest.certificates.k8s.io "foo" deleted
E1109 01:46:33.226451   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificate.sh:42: Successful get csr {{range.items}}{{.metadata.name}}{{end}}: 
certificatesigningrequest.certificates.k8s.io/foo created
certificate.sh:46: Successful get csr/foo {{range.status.conditions}}{{.type}}{{end}}: 
certificatesigningrequest.certificates.k8s.io/foo denied
{
    "apiVersion": "v1",
... skipping 38 lines ...
        "selfLink": ""
    }
}
certificate.sh:49: Successful get csr/foo {{range.status.conditions}}{{.type}}{{end}}: Denied
certificatesigningrequest.certificates.k8s.io "foo" deleted
certificate.sh:51: Successful get csr {{range.items}}{{.metadata.name}}{{end}}: 
E1109 01:46:33.940831   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:46:34.038191   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificatesigningrequest.certificates.k8s.io/foo created
E1109 01:46:34.145272   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificate.sh:54: Successful get csr/foo {{range.status.conditions}}{{.type}}{{end}}: 
certificatesigningrequest.certificates.k8s.io/foo denied
E1109 01:46:34.227497   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
{
    "apiVersion": "v1",
    "items": [
        {
            "apiVersion": "certificates.k8s.io/v1beta1",
            "kind": "CertificateSigningRequest",
... skipping 44 lines ...
+++ Running case: test-cmd.run_cluster_management_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_cluster_management_tests
+++ [1109 01:46:34] Testing cluster-management commands
node-management.sh:27: Successful get nodes {{range.items}}{{.metadata.name}}:{{end}}: 127.0.0.1:
pod/test-pod-1 created
E1109 01:46:34.942170   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:46:35.039601   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
pod/test-pod-2 created
E1109 01:46:35.146504   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node-management.sh:76: Successful get nodes 127.0.0.1 {{range .spec.taints}}{{if eq .key \"dedicated\"}}{{.key}}={{.value}}:{{.effect}}{{end}}{{end}}: 
node/127.0.0.1 tainted
E1109 01:46:35.228653   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node-management.sh:79: Successful get nodes 127.0.0.1 {{range .spec.taints}}{{if eq .key \"dedicated\"}}{{.key}}={{.value}}:{{.effect}}{{end}}{{end}}: dedicated=foo:PreferNoSchedule
node/127.0.0.1 untainted
node-management.sh:83: Successful get nodes 127.0.0.1 {{range .spec.taints}}{{if eq .key \"dedicated\"}}{{.key}}={{.value}}:{{.effect}}{{end}}{{end}}: 
node-management.sh:87: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
node/127.0.0.1 cordoned (dry run)
node-management.sh:89: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
node-management.sh:93: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
node/127.0.0.1 cordoned (dry run)
node/127.0.0.1 drained (dry run)
E1109 01:46:35.943288   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node-management.sh:96: Successful get nodes {{range.items}}{{.metadata.name}}:{{end}}: 127.0.0.1:
E1109 01:46:36.040792   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node-management.sh:97: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
node-management.sh:101: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
E1109 01:46:36.147944   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node-management.sh:103: Successful get pods {{range .items}}{{.metadata.name}},{{end}}: test-pod-1,test-pod-2,
E1109 01:46:36.229937   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node/127.0.0.1 cordoned
node/127.0.0.1 drained
node-management.sh:106: Successful get pods/test-pod-2 {{.metadata.name}}: test-pod-2
pod "test-pod-2" deleted
node/127.0.0.1 uncordoned
node-management.sh:111: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
node-management.sh:115: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
Successful
message:node/127.0.0.1 already uncordoned (dry run)
has:already uncordoned
node-management.sh:119: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
node/127.0.0.1 labeled
E1109 01:46:36.944242   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node-management.sh:124: Successful get nodes 127.0.0.1 {{.metadata.labels.test}}: label
E1109 01:46:37.041997   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:error: cannot specify both a node name and a --selector option
See 'kubectl drain -h' for help and examples
has:cannot specify both a node name
E1109 01:46:37.148755   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:error: USAGE: cordon NODE [flags]
See 'kubectl cordon -h' for help and examples
has:error\: USAGE\: cordon NODE
node/127.0.0.1 already uncordoned
E1109 01:46:37.231309   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:error: You must provide one or more resources by argument or filename.
Example resource specifications include:
   '-f rsrc.yaml'
   '--filename=rsrc.json'
   '<resource> <name>'
   '<resource>'
has:must provide one or more resources
... skipping 15 lines ...
Successful
message:The following compatible plugins are available:

test/fixtures/pkg/kubectl/plugins/version/kubectl-version
  - warning: kubectl-version overwrites existing command: "kubectl version"

error: one plugin warning was found
has:kubectl-version overwrites existing command: "kubectl version"
Successful
message:The following compatible plugins are available:

test/fixtures/pkg/kubectl/plugins/kubectl-foo
test/fixtures/pkg/kubectl/plugins/foo/kubectl-foo
  - warning: test/fixtures/pkg/kubectl/plugins/foo/kubectl-foo is overshadowed by a similarly named plugin: test/fixtures/pkg/kubectl/plugins/kubectl-foo

error: one plugin warning was found
has:test/fixtures/pkg/kubectl/plugins/foo/kubectl-foo is overshadowed by a similarly named plugin
Successful
message:The following compatible plugins are available:

test/fixtures/pkg/kubectl/plugins/kubectl-foo
has:plugins are available
Successful
message:Unable read directory "test/fixtures/pkg/kubectl/plugins/empty" from your PATH: open test/fixtures/pkg/kubectl/plugins/empty: no such file or directory. Skipping...
error: unable to find any kubectl plugins in your PATH
has:unable to find any kubectl plugins in your PATH
E1109 01:46:37.945424   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:I am plugin foo
has:plugin foo
Successful
message:I am plugin bar called with args test/fixtures/pkg/kubectl/plugins/bar/kubectl-bar arg1
has:test/fixtures/pkg/kubectl/plugins/bar/kubectl-bar arg1
E1109 01:46:38.043363   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:Client Version: version.Info{Major:"1", Minor:"18+", GitVersion:"v1.18.0-alpha.0.557+ff7867612869b2", GitCommit:"ff7867612869b221a22d73979ae02d2fb11c1e46", GitTreeState:"clean", BuildDate:"2019-11-08T21:49:13Z", GoVersion:"go1.13.4", Compiler:"gc", Platform:"linux/amd64"}
Server Version: version.Info{Major:"1", Minor:"18+", GitVersion:"v1.18.0-alpha.0.557+ff7867612869b2", GitCommit:"ff7867612869b221a22d73979ae02d2fb11c1e46", GitTreeState:"clean", BuildDate:"2019-11-08T21:49:13Z", GoVersion:"go1.13.4", Compiler:"gc", Platform:"linux/amd64"}
has:Client Version
Successful
message:Client Version: version.Info{Major:"1", Minor:"18+", GitVersion:"v1.18.0-alpha.0.557+ff7867612869b2", GitCommit:"ff7867612869b221a22d73979ae02d2fb11c1e46", GitTreeState:"clean", BuildDate:"2019-11-08T21:49:13Z", GoVersion:"go1.13.4", Compiler:"gc", Platform:"linux/amd64"}
Server Version: version.Info{Major:"1", Minor:"18+", GitVersion:"v1.18.0-alpha.0.557+ff7867612869b2", GitCommit:"ff7867612869b221a22d73979ae02d2fb11c1e46", GitTreeState:"clean", BuildDate:"2019-11-08T21:49:13Z", GoVersion:"go1.13.4", Compiler:"gc", Platform:"linux/amd64"}
has not:overshadows an existing plugin
+++ exit code: 0
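The plugin checks above exercise kubectl's PATH-based plugin mechanism: any executable named kubectl-<name> found on PATH is surfaced as the subcommand `kubectl <name>`, with warnings when a plugin would overwrite a built-in command (kubectl-version) or is overshadowed by a same-named plugin earlier on PATH. A minimal sketch of creating and invoking such a plugin, assuming a writable directory that is already on PATH:

# A kubectl plugin is just an executable whose name starts with "kubectl-".
cat > /usr/local/bin/kubectl-foo <<'EOF'
#!/usr/bin/env bash
echo "I am plugin foo"
EOF
chmod +x /usr/local/bin/kubectl-foo

kubectl plugin list    # lists discovered plugins and prints overwrite/overshadow warnings
kubectl foo            # dispatches to kubectl-foo and prints "I am plugin foo"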
Recording: run_impersonation_tests
Running command: run_impersonation_tests
E1109 01:46:38.150017   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource

+++ Running case: test-cmd.run_impersonation_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_impersonation_tests
+++ [1109 01:46:38] Testing impersonation
E1109 01:46:38.232734   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:error: requesting groups or user-extra for  without impersonating a user
has:without impersonating a user
certificatesigningrequest.certificates.k8s.io/foo created
authorization.sh:68: Successful get csr/foo {{.spec.username}}: user1
authorization.sh:69: Successful get csr/foo {{range .spec.groups}}{{.}}{{end}}: system:authenticated
certificatesigningrequest.certificates.k8s.io "foo" deleted
certificatesigningrequest.certificates.k8s.io/foo created
E1109 01:46:38.946628   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
authorization.sh:74: Successful get csr/foo {{len .spec.groups}}: 3
E1109 01:46:39.044962   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
authorization.sh:75: Successful get csr/foo {{range .spec.groups}}{{.}} {{end}}: group2 group1 ,,,chameleon 
certificatesigningrequest.certificates.k8s.io "foo" deleted
E1109 01:46:39.151421   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ exit code: 0
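The impersonation checks above verify that a CertificateSigningRequest created with impersonation headers records the impersonated identity: spec.username becomes user1, and spec.groups reflects the impersonated identity (system:authenticated in the user-only case). The earlier error also shows that group impersonation is rejected unless a user is impersonated as well. A minimal sketch of the same flow, with csr.yaml standing in for the test's CSR manifest:

# Group impersonation without a user fails, matching the error checked above.
kubectl create -f csr.yaml --as-group=group1

# Create the CSR while impersonating user1, then read back the recorded identity.
kubectl create -f csr.yaml --as=user1
kubectl get csr foo -o jsonpath='{.spec.username}'   # user1
kubectl get csr foo -o jsonpath='{.spec.groups}'     # ["system:authenticated"]
kubectl delete csr foo

# Impersonate extra groups as well; they are recorded in .spec.groups on the stored CSR.
kubectl create -f csr.yaml --as=user1 --as-group=group2 --as-group=group1
kubectl get csr foo -o jsonpath='{.spec.groups}'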
Recording: run_wait_tests
Running command: run_wait_tests

+++ Running case: test-cmd.run_wait_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_wait_tests
+++ [1109 01:46:39] Testing kubectl wait
+++ [1109 01:46:39] Creating namespace namespace-1573263999-4992
E1109 01:46:39.234099   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/namespace-1573263999-4992 created
Context "test" modified.
deployment.apps/test-1 created
I1109 01:46:39.435553   54890 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1573263999-4992", Name:"test-1", UID:"aec1631e-080a-4559-b62b-34ca9f55920c", APIVersion:"apps/v1", ResourceVersion:"3318", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set test-1-6d98955cc9 to 1
I1109 01:46:39.443147   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263999-4992", Name:"test-1-6d98955cc9", UID:"b38823e9-56bb-41b7-8467-d44314527b24", APIVersion:"apps/v1", ResourceVersion:"3319", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test-1-6d98955cc9-ssvl6
deployment.apps/test-2 created
I1109 01:46:39.509707   54890 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1573263999-4992", Name:"test-2", UID:"eafefdfd-bfc5-476e-b89f-fcbeaf898d74", APIVersion:"apps/v1", ResourceVersion:"3328", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set test-2-65897ff84d to 1
I1109 01:46:39.520589   54890 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1573263999-4992", Name:"test-2-65897ff84d", UID:"f8d98cb9-43a6-4826-b12f-15396f7c411e", APIVersion:"apps/v1", ResourceVersion:"3329", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test-2-65897ff84d-c5hlq
wait.sh:36: Successful get deployments {{range .items}}{{.metadata.name}},{{end}}: test-1,test-2,
E1109 01:46:39.947924   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:46:40.046301   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:46:40.152723   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:46:40.235557   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:46:40.951172   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:46:41.047556   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:46:41.153993   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1109 01:46:41.236864   54890 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps "test-1" deleted
deployment.apps "test-2" deleted
Successful
message:deployment.apps/test-1 condition met
deployment.apps/test-2 condition met
has:test-1 condition met
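The wait checks above create two deployments and then block on kubectl wait until both report a condition, producing the "condition met" lines in the message. A minimal sketch of the corresponding commands, assuming a placeholder image and timeout (Deployments expose the Available condition):

kubectl create deployment test-1 --image=registry.k8s.io/pause:3.9
kubectl create deployment test-2 --image=registry.k8s.io/pause:3.9

# Blocks until each named deployment reports condition Available=True,
# then prints "deployment.apps/<name> condition met" for each.
kubectl wait deployment test-1 test-2 --for=condition=Available --timeout=120s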
... skipping 34 lines ...
I1109 01:46:41.930890   51457 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
I1109 01:46:41.930928   51457 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
I1109 01:46:41.931013   51457 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
I1109 01:46:41.931021   51457 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
I1109 01:46:41.931124   51457 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
I1109 01:46:41.931178   51457 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
W1109 01:46:41.931266   51457 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
... skipping 114 lines ...
junit report dir: /logs/artifacts
+++ [1109 01:46:41] Clean up complete
+ make test-integration
W1109 01:46:42.931778   51457 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
... skipping 112 lines ...
W1109 01:46:44.730969   51457 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1109 01:46:44.761455   51457 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1109 01:46:44.784717   51457 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1109 01:46:44.789016   51457 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1109 01:46:44.789304   51457 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1109 01:46:44.809240   51457 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1109 01:46:44.811785   51457 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1109 01:46:44.819504   51457 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1109 01:46:44.821384   51457 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1109 01:46:44.826552   51457 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1109 01:46:44.826811   51457 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1109 01:46:44.834514   51457 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1109 01:46:44.836630   51457 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1109 01:46:44.840906   51457 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1109 01:46:44.841321   51457 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1109 01:46:44.843382   51457 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1109 01:46:44.846722   51457 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1109 01:46:44.847031   51457 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1109 01:46:44.851989   51457 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
+++ [1109 01:46:45] Checking etcd is on PATH
/home/prow/go/src/k8s.io/kubernetes/third_party/etcd/etcd
+++ [1109 01:46:45] Starting etcd instance
etcd --advertise-client-urls http://127.0.0.1:2379 --data-dir /tmp/tmp.q5i0zDZIR9 --listen-client-urls http://127.0.0.1:2379 --debug > "/logs/artifacts/etcd.a9486746-0290-11ea-8249-e2ed2883c286.root.log.DEBUG.20191109-014645.90850" 2>/dev/null
Waiting for etcd to come up.
+++ [1109 01:46:46] On try 2, etcd: : {"health":"true"}
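The two lines above show the harness's readiness loop: it launches etcd on http://127.0.0.1:2379 and then polls etcd's /health endpoint until it returns {"health":"true"}, which also explains the earlier run of "connection refused" reconnect warnings emitted while etcd was not yet listening. The Go sketch below is a hypothetical, simplified stand-in for that poll, not part of the Kubernetes scripts; the function name waitForEtcd and the attempt count are illustrative assumptions, and only the etcd address and /health endpoint come from the log itself.

// Hypothetical sketch of the etcd readiness poll performed by the startup
// script above; not the actual harness code.
package main

import (
    "fmt"
    "io"
    "net/http"
    "time"
)

// waitForEtcd polls the etcd /health endpoint once per second until it
// responds with HTTP 200 or the attempt budget is exhausted.
func waitForEtcd(baseURL string, attempts int) error {
    for try := 1; try <= attempts; try++ {
        resp, err := http.Get(baseURL + "/health")
        if err == nil {
            body, _ := io.ReadAll(resp.Body)
            resp.Body.Close()
            fmt.Printf("On try %d, etcd: %s\n", try, body)
            if resp.StatusCode == http.StatusOK {
                return nil
            }
        }
        time.Sleep(time.Second)
    }
    return fmt.Errorf("etcd did not become healthy after %d tries", attempts)
}

func main() {
    if err := waitForEtcd("http://127.0.0.1:2379", 30); err != nil {
        panic(err)
    }
}

In this run the check succeeded on the second attempt, which is why the log prints "On try 2" before the tests proceed.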
... skipping 10756 lines ...
    synthetic_master_test.go:735: UPDATE_NODE_APISERVER is not set

=== SKIP: test/integration/scheduler_perf TestSchedule100Node3KPods (0.00s)
    scheduler_test.go:73: Skipping because we want to run short tests
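For context, the SKIP above comes from Go's standard short-test guard: when the integration suite runs with -short, testing.Short() returns true and the test calls t.Skip with the message shown in the log. The snippet below is an illustrative sketch of that pattern; the package name and test body are assumptions, not the real scheduler_perf source.

// Illustrative sketch of the short-test guard behind the SKIP line above;
// place in a *_test.go file and run with: go test -short
package schedulerperf_test

import "testing"

func TestSchedule100Node3KPods(t *testing.T) {
    if testing.Short() {
        t.Skip("Skipping because we want to run short tests")
    }
    // The full 100-node / 3000-pod scheduling workload would run here.
}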


=== Failed
=== FAIL: test/integration/etcd TestEtcdStoragePath (12.54s)
E1109 01:50:32.625962  107409 controller.go:183] Get https://127.0.0.1:35291/api/v1/namespaces/default/endpoints/kubernetes: dial tcp 127.0.0.1:35291: connect: connection refused
I1109 01:50:33.144486  107409 serving.go:306] Generated self-signed cert (/tmp/TestEtcdStoragePath937205931/apiserver.crt, /tmp/TestEtcdStoragePath937205931/apiserver.key)
I1109 01:50:33.144531  107409 server.go:622] external host was not specified, using 10.61.24.125
I1109 01:50:33.144932  107409 client.go:361] parsed scheme: "endpoint"
I1109 01:50:33.144980  107409 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
W1109 01:50:33.740660  107409 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
... skipping 242 lines ...
    server.go:155: waiting for server to be healthy
    server.go:155: waiting for server to be healthy


DONE 2776 tests, 4 skipped, 1 failure in 5.454s
+++ [1109 01:58:13] Saved JUnit XML test report to /logs/artifacts/junit_304dbea7698c16157bb4586f231ea1f94495b046_20191109-014650.xml
make[1]: *** [Makefile:185: test] Error 1
!!! [1109 01:58:13] Call tree:
!!! [1109 01:58:13]  1: hack/make-rules/test-integration.sh:89 runTests(...)
+++ [1109 01:58:13] Cleaning up etcd
+++ [1109 01:58:13] Integration test cleanup complete
make: *** [Makefile:204: test-integration] Error 1
+ EXIT_VALUE=2
+ set +o xtrace
Cleaning up after docker in docker.
================================================================================
[Barnacle] 2019/11/09 01:58:13 Cleaning up Docker data root...
[Barnacle] 2019/11/09 01:58:13 Removing all containers.
... skipping 12 lines ...