Result: FAILURE
Tests: 1 failed / 2863 succeeded
Started: 2019-09-11 06:36
Elapsed: 26m51s
Revision:
Builder: gke-prow-ssd-pool-1a225945-z2ft
resultstore: https://source.cloud.google.com/results/invocations/f4a1fc06-400a-48fb-a766-12c481257878/targets/test
pod: 5bd14cc1-d45e-11e9-9d26-329cee23a2e0
infra-commit: 0708557a1
repo: k8s.io/kubernetes
repo-commit: 349143ec35a0999e0e851aef65d48457252d8b92
repos: {'k8s.io/kubernetes': 'master'}

Test Failures


k8s.io/kubernetes/test/integration/examples TestAggregatedAPIServer 11s

go test -v k8s.io/kubernetes/test/integration/examples -run TestAggregatedAPIServer$
=== RUN   TestAggregatedAPIServer
I0911 06:55:01.883936  107510 serving.go:312] Generated self-signed cert (/tmp/test-integration-apiserver464142136/apiserver.crt, /tmp/test-integration-apiserver464142136/apiserver.key)
I0911 06:55:01.883979  107510 server.go:623] external host was not specified, using 172.17.0.2
W0911 06:55:02.360080  107510 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0911 06:55:02.360133  107510 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0911 06:55:02.360146  107510 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0911 06:55:02.360655  107510 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0911 06:55:02.360701  107510 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0911 06:55:02.360714  107510 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0911 06:55:02.360725  107510 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0911 06:55:02.360741  107510 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0911 06:55:02.362166  107510 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0911 06:55:02.362228  107510 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0911 06:55:02.362289  107510 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0911 06:55:02.362344  107510 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0911 06:55:02.362602  107510 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0911 06:55:02.362784  107510 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0911 06:55:02.362835  107510 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0911 06:55:02.362945  107510 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
I0911 06:55:02.362996  107510 plugins.go:158] Loaded 10 mutating admission controller(s) successfully in the following order: NamespaceLifecycle,LimitRanger,ServiceAccount,TaintNodesByCondition,Priority,DefaultTolerationSeconds,DefaultStorageClass,StorageObjectInUseProtection,MutatingAdmissionWebhook,RuntimeClass.
I0911 06:55:02.363007  107510 plugins.go:161] Loaded 7 validating admission controller(s) successfully in the following order: LimitRanger,ServiceAccount,Priority,PersistentVolumeClaimResize,ValidatingAdmissionWebhook,RuntimeClass,ResourceQuota.
I0911 06:55:02.363192  107510 master.go:259] Using reconciler: lease
I0911 06:55:02.363544  107510 client.go:361] parsed scheme: "endpoint"
I0911 06:55:02.363593  107510 endpoint.go:66] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0911 06:55:02.365178  107510 client.go:361] parsed scheme: "endpoint"
I0911 06:55:02.365215  107510 endpoint.go:66] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0911 06:55:02.368361  107510 client.go:361] parsed scheme: "endpoint"
I0911 06:55:02.368400  107510 endpoint.go:66] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0911 06:55:02.369704  107510 client.go:361] parsed scheme: "endpoint"
I0911 06:55:02.369745  107510 endpoint.go:66] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0911 06:55:02.376154  107510 client.go:361] parsed scheme: "endpoint"
I0911 06:55:02.377274  107510 endpoint.go:66] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0911 06:55:02.380237  107510 client.go:361] parsed scheme: "endpoint"
I0911 06:55:02.380640  107510 endpoint.go:66] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0911 06:55:02.383905  107510 client.go:361] parsed scheme: "endpoint"
I0911 06:55:02.384202  107510 endpoint.go:66] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0911 06:55:02.388839  107510 client.go:361] parsed scheme: "endpoint"
I0911 06:55:02.388893  107510 endpoint.go:66] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0911 06:55:02.390777  107510 client.go:361] parsed scheme: "endpoint"
I0911 06:55:02.390811  107510 endpoint.go:66] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0911 06:55:02.392657  107510 client.go:361] parsed scheme: "endpoint"
I0911 06:55:02.392684  107510 endpoint.go:66] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0911 06:55:02.394740  107510 client.go:361] parsed scheme: "endpoint"
I0911 06:55:02.394931  107510 endpoint.go:66] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0911 06:55:02.396532  107510 client.go:361] parsed scheme: "endpoint"
I0911 06:55:02.396698  107510 endpoint.go:66] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0911 06:55:02.398097  107510 client.go:361] parsed scheme: "endpoint"
I0911 06:55:02.398238  107510 endpoint.go:66] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0911 06:55:02.401588  107510 client.go:361] parsed scheme: "endpoint"
I0911 06:55:02.401792  107510 endpoint.go:66] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0911 06:55:02.405041  107510 client.go:361] parsed scheme: "endpoint"
I0911 06:55:02.405088  107510 endpoint.go:66] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0911 06:55:02.407286  107510 client.go:361] parsed scheme: "endpoint"
I0911 06:55:02.407444  107510 endpoint.go:66] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0911 06:55:02.409022  107510 client.go:361] parsed scheme: "endpoint"
I0911 06:55:02.409157  107510 endpoint.go:66] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0911 06:55:02.411017  107510 client.go:361] parsed scheme: "endpoint"
I0911 06:55:02.411064  107510 endpoint.go:66] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0911 06:55:02.412913  107510 client.go:361] parsed scheme: "endpoint"
I0911 06:55:02.412967  107510 endpoint.go:66] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0911 06:55:02.414336  107510 rest.go:115] the default service ipfamily for this cluster is: IPv4
I0911 06:55:02.602694  107510 client.go:361] parsed scheme: "endpoint"
I0911 06:55:02.602740  107510 endpoint.go:66] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0911 06:55:02.604123  107510 client.go:361] parsed scheme: "endpoint"
I0911 06:55:02.604165  107510 endpoint.go:66] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0911 06:55:02.605995  107510 client.go:361] parsed scheme: "endpoint"
I0911 06:55:02.606026  107510 endpoint.go:66] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0911 06:55:02.607457  107510 client.go:361] parsed scheme: "endpoint"
I0911 06:55:02.607489  107510 endpoint.go:66] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0911 06:55:02.608940  107510 client.go:361] parsed scheme: "endpoint"
I0911 06:55:02.608967  107510 endpoint.go:66] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0911 06:55:02.610577  107510 client.go:361] parsed scheme: "endpoint"
I0911 06:55:02.610610  107510 endpoint.go:66] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0911 06:55:02.612318  107510 client.go:361] parsed scheme: "endpoint"
I0911 06:55:02.612350  107510 endpoint.go:66] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0911 06:55:02.613764  107510 client.go:361] parsed scheme: "endpoint"
I0911 06:55:02.613795  107510 endpoint.go:66] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0911 06:55:02.615032  107510 client.go:361] parsed scheme: "endpoint"
I0911 06:55:02.615063  107510 endpoint.go:66] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0911 06:55:02.616820  107510 client.go:361] parsed scheme: "endpoint"
I0911 06:55:02.616854  107510 endpoint.go:66] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0911 06:55:02.618863  107510 client.go:361] parsed scheme: "endpoint"
I0911 06:55:02.618893  107510 endpoint.go:66] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0911 06:55:02.620716  107510 client.go:361] parsed scheme: "endpoint"
I0911 06:55:02.620745  107510 endpoint.go:66] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0911 06:55:02.622896  107510 client.go:361] parsed scheme: "endpoint"
I0911 06:55:02.623037  107510 endpoint.go:66] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0911 06:55:02.624628  107510 client.go:361] parsed scheme: "endpoint"
I0911 06:55:02.624888  107510 endpoint.go:66] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0911 06:55:02.626607  107510 client.go:361] parsed scheme: "endpoint"
I0911 06:55:02.626635  107510 endpoint.go:66] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0911 06:55:02.631222  107510 client.go:361] parsed scheme: "endpoint"
I0911 06:55:02.631248  107510 endpoint.go:66] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0911 06:55:02.642257  107510 client.go:361] parsed scheme: "endpoint"
I0911 06:55:02.642314  107510 endpoint.go:66] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0911 06:55:02.646450  107510 client.go:361] parsed scheme: "endpoint"
I0911 06:55:02.646486  107510 endpoint.go:66] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0911 06:55:02.647863  107510 client.go:361] parsed scheme: "endpoint"
I0911 06:55:02.647896  107510 endpoint.go:66] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0911 06:55:02.649660  107510 client.go:361] parsed scheme: "endpoint"
I0911 06:55:02.649689  107510 endpoint.go:66] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0911 06:55:02.651177  107510 client.go:361] parsed scheme: "endpoint"
I0911 06:55:02.651210  107510 endpoint.go:66] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0911 06:55:02.652698  107510 client.go:361] parsed scheme: "endpoint"
I0911 06:55:02.652732  107510 endpoint.go:66] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0911 06:55:02.656398  107510 client.go:361] parsed scheme: "endpoint"
I0911 06:55:02.656430  107510 endpoint.go:66] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0911 06:55:02.658848  107510 client.go:361] parsed scheme: "endpoint"
I0911 06:55:02.658888  107510 endpoint.go:66] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0911 06:55:02.661285  107510 client.go:361] parsed scheme: "endpoint"
I0911 06:55:02.661439  107510 endpoint.go:66] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0911 06:55:02.675362  107510 client.go:361] parsed scheme: "endpoint"
I0911 06:55:02.675422  107510 endpoint.go:66] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0911 06:55:02.678756  107510 client.go:361] parsed scheme: "endpoint"
I0911 06:55:02.678802  107510 endpoint.go:66] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0911 06:55:02.682021  107510 client.go:361] parsed scheme: "endpoint"
I0911 06:55:02.682281  107510 endpoint.go:66] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0911 06:55:02.684233  107510 client.go:361] parsed scheme: "endpoint"
I0911 06:55:02.684363  107510 endpoint.go:66] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0911 06:55:02.686408  107510 client.go:361] parsed scheme: "endpoint"
I0911 06:55:02.686444  107510 endpoint.go:66] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0911 06:55:02.688822  107510 client.go:361] parsed scheme: "endpoint"
I0911 06:55:02.689040  107510 endpoint.go:66] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0911 06:55:02.690975  107510 client.go:361] parsed scheme: "endpoint"
I0911 06:55:02.691272  107510 endpoint.go:66] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0911 06:55:02.693117  107510 client.go:361] parsed scheme: "endpoint"
I0911 06:55:02.693350  107510 endpoint.go:66] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0911 06:55:02.696555  107510 client.go:361] parsed scheme: "endpoint"
I0911 06:55:02.696757  107510 endpoint.go:66] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0911 06:55:02.698281  107510 client.go:361] parsed scheme: "endpoint"
I0911 06:55:02.698342  107510 endpoint.go:66] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0911 06:55:02.701645  107510 client.go:361] parsed scheme: "endpoint"
I0911 06:55:02.701690  107510 endpoint.go:66] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0911 06:55:02.707170  107510 client.go:361] parsed scheme: "endpoint"
I0911 06:55:02.707241  107510 endpoint.go:66] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0911 06:55:02.709803  107510 client.go:361] parsed scheme: "endpoint"
I0911 06:55:02.709852  107510 endpoint.go:66] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0911 06:55:02.712573  107510 client.go:361] parsed scheme: "endpoint"
I0911 06:55:02.713401  107510 endpoint.go:66] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0911 06:55:02.726078  107510 client.go:361] parsed scheme: "endpoint"
I0911 06:55:02.726956  107510 endpoint.go:66] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
W0911 06:55:03.014932  107510 genericapiserver.go:404] Skipping API batch/v2alpha1 because it has no resources.
W0911 06:55:03.033926  107510 genericapiserver.go:404] Skipping API node.k8s.io/v1alpha1 because it has no resources.
W0911 06:55:03.054454  107510 genericapiserver.go:404] Skipping API rbac.authorization.k8s.io/v1alpha1 because it has no resources.
W0911 06:55:03.059266  107510 genericapiserver.go:404] Skipping API scheduling.k8s.io/v1alpha1 because it has no resources.
W0911 06:55:03.075466  107510 genericapiserver.go:404] Skipping API storage.k8s.io/v1alpha1 because it has no resources.
W0911 06:55:03.102265  107510 genericapiserver.go:404] Skipping API apps/v1beta2 because it has no resources.
W0911 06:55:03.102324  107510 genericapiserver.go:404] Skipping API apps/v1beta1 because it has no resources.
I0911 06:55:03.359536  107510 client.go:361] parsed scheme: "endpoint"
I0911 06:55:03.359626  107510 endpoint.go:66] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0911 06:55:04.063251  107510 secure_serving.go:123] Serving securely on 127.0.0.1:42765
E0911 06:55:04.086601  107510 controller.go:154] Unable to remove old endpoints from kubernetes service: StorageError: key not found, Code: 1, Key: /registry/masterleases/172.17.0.2, ResourceVersion: 0, AdditionalErrorMsg: 
I0911 06:55:05.089743  107510 storage_scheduling.go:139] created PriorityClass system-node-critical with value 2000001000
I0911 06:55:05.094575  107510 storage_scheduling.go:139] created PriorityClass system-cluster-critical with value 2000000000
I0911 06:55:05.094603  107510 storage_scheduling.go:148] all system priority classes are created successfully or already exist.
I0911 06:55:06.863202  107510 controller.go:606] quota admission added evaluator for: roles.rbac.authorization.k8s.io
I0911 06:55:07.143345  107510 controller.go:606] quota admission added evaluator for: rolebindings.rbac.authorization.k8s.io
W0911 06:55:07.449832  107510 lease.go:222] Resetting endpoints for master service "kubernetes" to [172.17.0.2]
I0911 06:55:07.450870  107510 controller.go:606] quota admission added evaluator for: endpoints
I0911 06:55:08.312243  107510 serving.go:312] Generated self-signed cert (/tmp/test-integration-wardle-server084719084/apiserver.crt, /tmp/test-integration-wardle-server084719084/apiserver.key)
I0911 06:55:09.313013  107510 client.go:361] parsed scheme: "endpoint"
I0911 06:55:09.313166  107510 endpoint.go:66] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
W0911 06:55:09.411414  107510 authentication.go:281] Cluster doesn't provide client-ca-file in configmap/extension-apiserver-authentication in kube-system, so client certificate authentication won't work.
W0911 06:55:09.418277  107510 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0911 06:55:09.419028  107510 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0911 06:55:09.419816  107510 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0911 06:55:09.419867  107510 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
I0911 06:55:09.419888  107510 plugins.go:158] Loaded 3 mutating admission controller(s) successfully in the following order: NamespaceLifecycle,MutatingAdmissionWebhook,BanFlunder.
I0911 06:55:09.419896  107510 plugins.go:161] Loaded 1 validating admission controller(s) successfully in the following order: ValidatingAdmissionWebhook.
I0911 06:55:09.422436  107510 client.go:361] parsed scheme: "endpoint"
I0911 06:55:09.422625  107510 endpoint.go:66] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0911 06:55:09.425864  107510 client.go:361] parsed scheme: "endpoint"
I0911 06:55:09.426031  107510 endpoint.go:66] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0911 06:55:09.427260  107510 client.go:361] parsed scheme: "endpoint"
I0911 06:55:09.427318  107510 endpoint.go:66] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0911 06:55:09.493249  107510 secure_serving.go:123] Serving securely on 127.0.0.1:35253
I0911 06:55:10.523102  107510 serving.go:312] Generated self-signed cert (/tmp/test-integration-aggregator594504283/apiserver.crt, /tmp/test-integration-aggregator594504283/apiserver.key)
I0911 06:55:11.523733  107510 client.go:361] parsed scheme: "endpoint"
I0911 06:55:11.523853  107510 endpoint.go:66] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
W0911 06:55:11.738889  107510 authentication.go:281] Cluster doesn't provide client-ca-file in configmap/extension-apiserver-authentication in kube-system, so client certificate authentication won't work.
W0911 06:55:11.738917  107510 authentication.go:296] Cluster doesn't provide requestheader-client-ca-file in configmap/extension-apiserver-authentication in kube-system, so request-header client certificate authentication won't work.
W0911 06:55:11.742483  107510 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0911 06:55:11.742713  107510 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0911 06:55:11.742894  107510 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
I0911 06:55:11.742933  107510 plugins.go:158] Loaded 2 mutating admission controller(s) successfully in the following order: NamespaceLifecycle,MutatingAdmissionWebhook.
I0911 06:55:11.742942  107510 plugins.go:161] Loaded 1 validating admission controller(s) successfully in the following order: ValidatingAdmissionWebhook.
W0911 06:55:11.742967  107510 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0911 06:55:11.744611  107510 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
I0911 06:55:11.744880  107510 client.go:361] parsed scheme: "endpoint"
I0911 06:55:11.744911  107510 endpoint.go:66] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0911 06:55:11.745944  107510 client.go:361] parsed scheme: "endpoint"
I0911 06:55:11.745972  107510 endpoint.go:66] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
W0911 06:55:11.749020  107510 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
I0911 06:55:11.751352  107510 secure_serving.go:123] Serving securely on 127.0.0.1:43417
I0911 06:55:11.751391  107510 available_controller.go:383] Starting AvailableConditionController
I0911 06:55:11.751411  107510 cache.go:32] Waiting for caches to sync for AvailableConditionController controller
I0911 06:55:11.751477  107510 apiservice_controller.go:94] Starting APIServiceRegistrationController
I0911 06:55:11.751506  107510 cache.go:32] Waiting for caches to sync for APIServiceRegistrationController controller
I0911 06:55:11.851599  107510 cache.go:39] Caches are synced for AvailableConditionController controller
I0911 06:55:11.852228  107510 cache.go:39] Caches are synced for APIServiceRegistrationController controller
--- FAIL: TestAggregatedAPIServer (11.86s)
    apiserver_test.go:222: open /tmp/test-integration-wardle-server084719084/apiserver.crt: no such file or directory
    apiserver_test.go:222: open /tmp/test-integration-wardle-server084719084/apiserver.crt: no such file or directory
    apiserver_test.go:222: open /tmp/test-integration-wardle-server084719084/apiserver.crt: no such file or directory
    apiserver_test.go:222: open /tmp/test-integration-wardle-server084719084/apiserver.crt: no such file or directory
    apiserver_test.go:222: open /tmp/test-integration-wardle-server084719084/apiserver.crt: no such file or directory
    apiserver_test.go:222: open /tmp/test-integration-wardle-server084719084/apiserver.crt: no such file or directory
    apiserver_test.go:222: open /tmp/test-integration-wardle-server084719084/apiserver.crt: no such file or directory
    apiserver_test.go:453: {"kind":"APIGroupList","groups":[{"name":"wardle.k8s.io","versions":[{"groupVersion":"wardle.k8s.io/v1beta1","version":"v1beta1"},{"groupVersion":"wardle.k8s.io/v1alpha1","version":"v1alpha1"}],"preferredVersion":{"groupVersion":"wardle.k8s.io/v1beta1","version":"v1beta1"},"serverAddressByClientCIDRs":[{"clientCIDR":"0.0.0.0/0","serverAddress":":35253"}]}]}
        
    apiserver_test.go:482: {"kind":"APIGroup","apiVersion":"v1","name":"wardle.k8s.io","versions":[{"groupVersion":"wardle.k8s.io/v1beta1","version":"v1beta1"},{"groupVersion":"wardle.k8s.io/v1alpha1","version":"v1alpha1"}],"preferredVersion":{"groupVersion":"wardle.k8s.io/v1beta1","version":"v1beta1"}}
        
    apiserver_test.go:500: {"kind":"APIResourceList","apiVersion":"v1","groupVersion":"wardle.k8s.io/v1alpha1","resources":[{"name":"fischers","singularName":"","namespaced":false,"kind":"Fischer","verbs":["create","delete","deletecollection","get","list","patch","update","watch"],"storageVersionHash":"u0hTAhBTXHw="},{"name":"flunders","singularName":"","namespaced":true,"kind":"Flunder","verbs":["create","delete","deletecollection","get","list","patch","update","watch"],"storageVersionHash":"k36Bkt6yJrQ="}]}
        
    apiserver_test.go:382: Discovery call expected to return failed unavailable service
    apiserver_test.go:374: Discovery call didn't return expected error: <nil>

				from junit_d965d8661547eb73cabe6d94d5550ec333e4c0fa_20190911-065055.xml



Show 2863 Passed Tests

Show 4 Skipped Tests

Error lines from build-log.txt

... skipping 821 lines ...
W0911 06:46:22.074] I0911 06:46:22.068483   53012 resource_quota_monitor.go:228] QuotaMonitor created object count evaluator for leases.coordination.k8s.io
W0911 06:46:22.074] I0911 06:46:22.068511   53012 controllermanager.go:534] Started "resourcequota"
W0911 06:46:22.074] I0911 06:46:22.068799   53012 resource_quota_controller.go:271] Starting resource quota controller
W0911 06:46:22.075] I0911 06:46:22.068832   53012 shared_informer.go:197] Waiting for caches to sync for resource quota
W0911 06:46:22.075] I0911 06:46:22.068880   53012 resource_quota_monitor.go:303] QuotaMonitor running
W0911 06:46:22.075] I0911 06:46:22.069045   53012 controllermanager.go:534] Started "job"
W0911 06:46:22.075] E0911 06:46:22.069373   53012 core.go:78] Failed to start service controller: WARNING: no cloud provider provided, services of type LoadBalancer will fail
W0911 06:46:22.075] W0911 06:46:22.069388   53012 controllermanager.go:526] Skipping "service"
W0911 06:46:22.076] I0911 06:46:22.069652   53012 controllermanager.go:534] Started "serviceaccount"
W0911 06:46:22.076] I0911 06:46:22.069780   53012 serviceaccounts_controller.go:116] Starting service account controller
W0911 06:46:22.076] W0911 06:46:22.069795   53012 controllermanager.go:513] "bootstrapsigner" is disabled
W0911 06:46:22.076] W0911 06:46:22.069865   53012 controllermanager.go:526] Skipping "nodeipam"
W0911 06:46:22.076] W0911 06:46:22.069913   53012 controllermanager.go:526] Skipping "ttl-after-finished"
... skipping 28 lines ...
W0911 06:46:22.084] I0911 06:46:22.074764   53012 cronjob_controller.go:96] Starting CronJob Manager
W0911 06:46:22.084] I0911 06:46:22.075402   53012 controllermanager.go:534] Started "csrapproving"
W0911 06:46:22.084] I0911 06:46:22.075537   53012 certificate_controller.go:113] Starting certificate controller
W0911 06:46:22.085] I0911 06:46:22.075730   53012 shared_informer.go:197] Waiting for caches to sync for certificate
W0911 06:46:22.085] I0911 06:46:22.080435   53012 controllermanager.go:534] Started "ttl"
W0911 06:46:22.085] I0911 06:46:22.082088   53012 node_lifecycle_controller.go:77] Sending events to api server
W0911 06:46:22.085] E0911 06:46:22.082133   53012 core.go:201] failed to start cloud node lifecycle controller: no cloud provider provided
W0911 06:46:22.086] W0911 06:46:22.082151   53012 controllermanager.go:526] Skipping "cloud-node-lifecycle"
W0911 06:46:22.087] I0911 06:46:22.086313   53012 ttl_controller.go:116] Starting TTL controller
W0911 06:46:22.087] I0911 06:46:22.086351   53012 shared_informer.go:197] Waiting for caches to sync for TTL
W0911 06:46:22.087] I0911 06:46:22.087470   53012 controllermanager.go:534] Started "replicationcontroller"
W0911 06:46:22.088] I0911 06:46:22.088241   53012 controllermanager.go:534] Started "pvc-protection"
W0911 06:46:22.089] W0911 06:46:22.088995   53012 probe.go:268] Flexvolume plugin directory at /usr/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
... skipping 57 lines ...
W0911 06:46:22.720] I0911 06:46:22.510475   53012 controllermanager.go:534] Started "garbagecollector"
W0911 06:46:22.720] I0911 06:46:22.510504   53012 shared_informer.go:197] Waiting for caches to sync for garbage collector
W0911 06:46:22.720] I0911 06:46:22.510538   53012 graph_builder.go:282] GraphBuilder running
W0911 06:46:22.720] I0911 06:46:22.511003   53012 controllermanager.go:534] Started "persistentvolume-binder"
W0911 06:46:22.720] I0911 06:46:22.511047   53012 pv_controller_base.go:282] Starting persistent volume controller
W0911 06:46:22.721] I0911 06:46:22.511074   53012 shared_informer.go:197] Waiting for caches to sync for persistent volume
W0911 06:46:22.721] W0911 06:46:22.537423   53012 actual_state_of_world.go:506] Failed to update statusUpdateNeeded field in actual state of world: Failed to set statusUpdateNeeded to needed true, because nodeName="127.0.0.1" does not exist
W0911 06:46:22.721] I0911 06:46:22.573209   53012 shared_informer.go:204] Caches are synced for ClusterRoleAggregator 
W0911 06:46:22.721] I0911 06:46:22.575920   53012 shared_informer.go:204] Caches are synced for certificate 
W0911 06:46:22.721] E0911 06:46:22.580368   53012 clusterroleaggregation_controller.go:180] edit failed with : Operation cannot be fulfilled on clusterroles.rbac.authorization.k8s.io "edit": the object has been modified; please apply your changes to the latest version and try again
W0911 06:46:22.722] I0911 06:46:22.586541   53012 shared_informer.go:204] Caches are synced for TTL 
W0911 06:46:22.722] I0911 06:46:22.596360   53012 shared_informer.go:204] Caches are synced for PV protection 
W0911 06:46:22.722] I0911 06:46:22.604376   53012 shared_informer.go:204] Caches are synced for namespace 
W0911 06:46:22.722] I0911 06:46:22.673065   53012 shared_informer.go:204] Caches are synced for expand 
I0911 06:46:22.822] Successful: the flag '--client' shows correct client info
I0911 06:46:22.823] Successful: the flag '--client' correctly has no server version info
... skipping 81 lines ...
I0911 06:46:25.808] +++ working dir: /go/src/k8s.io/kubernetes
I0911 06:46:25.811] +++ command: run_RESTMapper_evaluation_tests
I0911 06:46:25.820] +++ [0911 06:46:25] Creating namespace namespace-1568184385-14364
I0911 06:46:25.900] namespace/namespace-1568184385-14364 created
I0911 06:46:25.965] Context "test" modified.
I0911 06:46:25.970] +++ [0911 06:46:25] Testing RESTMapper
I0911 06:46:26.059] +++ [0911 06:46:26] "kubectl get unknownresourcetype" returns error as expected: error: the server doesn't have a resource type "unknownresourcetype"
I0911 06:46:26.072] +++ exit code: 0
I0911 06:46:26.189] NAME                              SHORTNAMES   APIGROUP                       NAMESPACED   KIND
I0911 06:46:26.190] bindings                                                                      true         Binding
I0911 06:46:26.190] componentstatuses                 cs                                          false        ComponentStatus
I0911 06:46:26.190] configmaps                        cm                                          true         ConfigMap
I0911 06:46:26.190] endpoints                         ep                                          true         Endpoints
... skipping 616 lines ...
I0911 06:46:44.037] (Bpoddisruptionbudget.policy/test-pdb-3 created
I0911 06:46:44.120] core.sh:251: Successful get pdb/test-pdb-3 --namespace=test-kubectl-describe-pod {{.spec.maxUnavailable}}: 2
I0911 06:46:44.185] (Bpoddisruptionbudget.policy/test-pdb-4 created
I0911 06:46:44.266] core.sh:255: Successful get pdb/test-pdb-4 --namespace=test-kubectl-describe-pod {{.spec.maxUnavailable}}: 50%
I0911 06:46:44.405] core.sh:261: Successful get pods --namespace=test-kubectl-describe-pod {{range.items}}{{.metadata.name}}:{{end}}: 
I0911 06:46:44.583] pod/env-test-pod created
W0911 06:46:44.684] error: resource(s) were provided, but no name, label selector, or --all flag specified
W0911 06:46:44.684] error: setting 'all' parameter but found a non empty selector. 
W0911 06:46:44.684] warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
W0911 06:46:44.685] I0911 06:46:43.728708   49461 controller.go:606] quota admission added evaluator for: poddisruptionbudgets.policy
W0911 06:46:44.685] error: min-available and max-unavailable cannot be both specified
I0911 06:46:44.785] core.sh:264: Successful describe pods --namespace=test-kubectl-describe-pod env-test-pod:
I0911 06:46:44.785] Name:         env-test-pod
I0911 06:46:44.786] Namespace:    test-kubectl-describe-pod
I0911 06:46:44.786] Priority:     0
I0911 06:46:44.786] Node:         <none>
I0911 06:46:44.786] Labels:       <none>
... skipping 174 lines ...
I0911 06:46:57.697] pod/valid-pod patched
I0911 06:46:57.819] core.sh:470: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: changed-with-yaml:
I0911 06:46:57.897] pod/valid-pod patched
I0911 06:46:57.989] core.sh:475: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:3.1:
I0911 06:46:58.149] pod/valid-pod patched
I0911 06:46:58.253] core.sh:491: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: nginx:
I0911 06:46:58.435] +++ [0911 06:46:58] "kubectl patch with resourceVersion 497" returns error as expected: Error from server (Conflict): Operation cannot be fulfilled on pods "valid-pod": the object has been modified; please apply your changes to the latest version and try again
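The 409 Conflict above is Kubernetes' optimistic concurrency in action: every object carries a resourceVersion, and a write that names a stale version is rejected so clients must re-fetch and retry. A minimal sketch of that read-modify-retry loop, with a plain in-memory store standing in for the API server (all names here are illustrative, not client-go API):

```python
class Conflict(Exception):
    """Stands in for HTTP 409 from the API server."""


class Store:
    """Toy versioned object store; real clusters keep this in etcd."""

    def __init__(self):
        self.obj, self.version = {"image": "nginx"}, 1

    def get(self):
        return dict(self.obj), self.version

    def update(self, obj, seen_version):
        if seen_version != self.version:  # stale resourceVersion -> reject
            raise Conflict("the object has been modified; please apply "
                           "your changes to the latest version and try again")
        self.obj, self.version = obj, self.version + 1


def patch_with_retry(store, mutate, attempts=3):
    """Read-modify-write loop: on conflict, re-fetch the latest object
    and reapply the mutation, as kubectl/controllers do."""
    for _ in range(attempts):
        obj, version = store.get()
        mutate(obj)
        try:
            store.update(obj, version)
            return obj
        except Conflict:
            continue
    raise Conflict("out of retries")
```

The test's deliberately stale `resourceVersion 497` is exactly the case the `seen_version != self.version` check rejects.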
I0911 06:46:58.701] pod "valid-pod" deleted
I0911 06:46:58.708] pod/valid-pod replaced
I0911 06:46:58.797] core.sh:515: Successful get pod valid-pod {{(index .spec.containers 0).name}}: replaced-k8s-serve-hostname
I0911 06:46:58.955] Successful
I0911 06:46:58.955] message:error: --grace-period must have --force specified
I0911 06:46:58.955] has:\-\-grace-period must have \-\-force specified
I0911 06:46:59.106] Successful
I0911 06:46:59.106] message:error: --timeout must have --force specified
I0911 06:46:59.106] has:\-\-timeout must have \-\-force specified
I0911 06:46:59.241] node/node-v1-test created
W0911 06:46:59.342] W0911 06:46:59.240995   53012 actual_state_of_world.go:506] Failed to update statusUpdateNeeded field in actual state of world: Failed to set statusUpdateNeeded to needed true, because nodeName="node-v1-test" does not exist
I0911 06:46:59.443] node/node-v1-test replaced
I0911 06:46:59.492] core.sh:552: Successful get node node-v1-test {{.metadata.annotations.a}}: b
I0911 06:46:59.569] node "node-v1-test" deleted
I0911 06:46:59.662] core.sh:559: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: nginx:
I0911 06:46:59.958] core.sh:562: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: k8s.gcr.io/serve_hostname:
I0911 06:47:00.927] core.sh:575: Successful get pod valid-pod {{.metadata.labels.name}}: valid-pod
... skipping 39 lines ...
I0911 06:47:02.399] pod/redis-master created
I0911 06:47:02.407] pod/valid-pod created
I0911 06:47:02.506] core.sh:614: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: redis-master:valid-pod:
I0911 06:47:02.586] core.sh:618: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: redis-master:valid-pod:
I0911 06:47:02.660] (Bpod "redis-master" deleted
I0911 06:47:02.665] pod "valid-pod" deleted
W0911 06:47:02.767] error: 'name' already has a value (valid-pod), and --overwrite is false
W0911 06:47:02.769] warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
I0911 06:47:02.871] core.sh:622: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0911 06:47:02.871] +++ [0911 06:47:02] Creating namespace namespace-1568184422-24601
I0911 06:47:02.894] namespace/namespace-1568184422-24601 created
I0911 06:47:02.959] Context "test" modified.
I0911 06:47:03.054] core.sh:628: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
... skipping 68 lines ...
I0911 06:47:08.424] +++ Running case: test-cmd.run_kubectl_create_error_tests 
I0911 06:47:08.427] +++ working dir: /go/src/k8s.io/kubernetes
I0911 06:47:08.432] +++ command: run_kubectl_create_error_tests
I0911 06:47:08.440] +++ [0911 06:47:08] Creating namespace namespace-1568184428-17646
I0911 06:47:08.509] namespace/namespace-1568184428-17646 created
I0911 06:47:08.578] Context "test" modified.
I0911 06:47:08.584] +++ [0911 06:47:08] Testing kubectl create with error
W0911 06:47:08.684] Error: must specify one of -f and -k
W0911 06:47:08.685] 
W0911 06:47:08.685] Create a resource from a file or from stdin.
W0911 06:47:08.685] 
W0911 06:47:08.685]  JSON and YAML formats are accepted.
W0911 06:47:08.685] 
W0911 06:47:08.685] Examples:
... skipping 41 lines ...
W0911 06:47:08.691] 
W0911 06:47:08.691] Usage:
W0911 06:47:08.691]   kubectl create -f FILENAME [options]
W0911 06:47:08.691] 
W0911 06:47:08.691] Use "kubectl <command> --help" for more information about a given command.
W0911 06:47:08.691] Use "kubectl options" for a list of global command-line options (applies to all commands).
I0911 06:47:08.792] +++ [0911 06:47:08] "kubectl create with empty string list returns error as expected: error: error validating "hack/testdata/invalid-rc-with-empty-args.yaml": error validating data: ValidationError(ReplicationController.spec.template.spec.containers[0].args): unknown object type "nil" in ReplicationController.spec.template.spec.containers[0].args[0]; if you choose to ignore these errors, turn validation off with --validate=false
W0911 06:47:08.892] kubectl convert is DEPRECATED and will be removed in a future version.
W0911 06:47:08.893] In order to convert, kubectl apply the object to the cluster, then kubectl get at the desired version.
I0911 06:47:08.993] +++ exit code: 0
I0911 06:47:09.001] Recording: run_kubectl_apply_tests
I0911 06:47:09.001] Running command: run_kubectl_apply_tests
I0911 06:47:09.027] 
... skipping 16 lines ...
I0911 06:47:10.349] apply.sh:276: Successful get pods test-pod {{.metadata.labels.name}}: test-pod-label
I0911 06:47:10.421] (Bpod "test-pod" deleted
I0911 06:47:10.617] customresourcedefinition.apiextensions.k8s.io/resources.mygroup.example.com created
W0911 06:47:10.878] I0911 06:47:10.877456   49461 client.go:361] parsed scheme: "endpoint"
W0911 06:47:10.879] I0911 06:47:10.877502   49461 endpoint.go:66] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
W0911 06:47:10.882] I0911 06:47:10.882078   49461 controller.go:606] quota admission added evaluator for: resources.mygroup.example.com
W0911 06:47:10.965] Error from server (NotFound): resources.mygroup.example.com "myobj" not found
I0911 06:47:11.066] kind.mygroup.example.com/myobj serverside-applied (server dry run)
I0911 06:47:11.066] customresourcedefinition.apiextensions.k8s.io "resources.mygroup.example.com" deleted
I0911 06:47:11.077] +++ exit code: 0
I0911 06:47:11.104] Recording: run_kubectl_run_tests
I0911 06:47:11.104] Running command: run_kubectl_run_tests
I0911 06:47:11.124] 
... skipping 97 lines ...
I0911 06:47:13.378] Context "test" modified.
I0911 06:47:13.383] +++ [0911 06:47:13] Testing kubectl create filter
I0911 06:47:13.457] create.sh:30: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0911 06:47:13.600] pod/selector-test-pod created
I0911 06:47:13.683] create.sh:34: Successful get pods selector-test-pod {{.metadata.labels.name}}: selector-test-pod
I0911 06:47:13.761] Successful
I0911 06:47:13.762] message:Error from server (NotFound): pods "selector-test-pod-dont-apply" not found
I0911 06:47:13.762] has:pods "selector-test-pod-dont-apply" not found
I0911 06:47:13.849] pod "selector-test-pod" deleted
I0911 06:47:13.869] +++ exit code: 0
I0911 06:47:13.898] Recording: run_kubectl_apply_deployments_tests
I0911 06:47:13.899] Running command: run_kubectl_apply_deployments_tests
I0911 06:47:13.919] 
... skipping 25 lines ...
I0911 06:47:15.552] apps.sh:139: Successful get replicasets {{range.items}}{{.metadata.name}}:{{end}}: 
I0911 06:47:15.619] apps.sh:140: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0911 06:47:15.698] apps.sh:144: Successful get deployments {{range.items}}{{.metadata.name}}:{{end}}: 
I0911 06:47:15.860] deployment.apps/nginx created
I0911 06:47:15.949] apps.sh:148: Successful get deployment nginx {{.metadata.name}}: nginx
I0911 06:47:20.127] Successful
I0911 06:47:20.128] message:Error from server (Conflict): error when applying patch:
I0911 06:47:20.128] {"metadata":{"annotations":{"kubectl.kubernetes.io/last-applied-configuration":"{\"apiVersion\":\"apps/v1\",\"kind\":\"Deployment\",\"metadata\":{\"annotations\":{},\"labels\":{\"name\":\"nginx\"},\"name\":\"nginx\",\"namespace\":\"namespace-1568184433-4659\",\"resourceVersion\":\"99\"},\"spec\":{\"replicas\":3,\"selector\":{\"matchLabels\":{\"name\":\"nginx2\"}},\"template\":{\"metadata\":{\"labels\":{\"name\":\"nginx2\"}},\"spec\":{\"containers\":[{\"image\":\"k8s.gcr.io/nginx:test-cmd\",\"name\":\"nginx\",\"ports\":[{\"containerPort\":80}]}]}}}}\n"},"resourceVersion":"99"},"spec":{"selector":{"matchLabels":{"name":"nginx2"}},"template":{"metadata":{"labels":{"name":"nginx2"}}}}}
I0911 06:47:20.129] to:
I0911 06:47:20.130] Resource: "apps/v1, Resource=deployments", GroupVersionKind: "apps/v1, Kind=Deployment"
I0911 06:47:20.132] Name: "nginx", Namespace: "namespace-1568184433-4659"
I0911 06:47:20.134] Object: &{map["apiVersion":"apps/v1" "kind":"Deployment" "metadata":map["annotations":map["deployment.kubernetes.io/revision":"1" "kubectl.kubernetes.io/last-applied-configuration":"{\"apiVersion\":\"apps/v1\",\"kind\":\"Deployment\",\"metadata\":{\"annotations\":{},\"labels\":{\"name\":\"nginx\"},\"name\":\"nginx\",\"namespace\":\"namespace-1568184433-4659\"},\"spec\":{\"replicas\":3,\"selector\":{\"matchLabels\":{\"name\":\"nginx1\"}},\"template\":{\"metadata\":{\"labels\":{\"name\":\"nginx1\"}},\"spec\":{\"containers\":[{\"image\":\"k8s.gcr.io/nginx:test-cmd\",\"name\":\"nginx\",\"ports\":[{\"containerPort\":80}]}]}}}}\n"] "creationTimestamp":"2019-09-11T06:47:15Z" "generation":'\x01' "labels":map["name":"nginx"] "name":"nginx" "namespace":"namespace-1568184433-4659" "resourceVersion":"589" "selfLink":"/apis/apps/v1/namespaces/namespace-1568184433-4659/deployments/nginx" "uid":"70a94698-f75d-43a0-aab2-52a8e65acf80"] "spec":map["progressDeadlineSeconds":'\u0258' "replicas":'\x03' "revisionHistoryLimit":'\n' "selector":map["matchLabels":map["name":"nginx1"]] "strategy":map["rollingUpdate":map["maxSurge":"25%" "maxUnavailable":"25%"] "type":"RollingUpdate"] "template":map["metadata":map["creationTimestamp":<nil> "labels":map["name":"nginx1"]] "spec":map["containers":[map["image":"k8s.gcr.io/nginx:test-cmd" "imagePullPolicy":"IfNotPresent" "name":"nginx" "ports":[map["containerPort":'P' "protocol":"TCP"]] "resources":map[] "terminationMessagePath":"/dev/termination-log" "terminationMessagePolicy":"File"]] "dnsPolicy":"ClusterFirst" "restartPolicy":"Always" "schedulerName":"default-scheduler" "securityContext":map[] "terminationGracePeriodSeconds":'\x1e']]] "status":map["conditions":[map["lastTransitionTime":"2019-09-11T06:47:15Z" "lastUpdateTime":"2019-09-11T06:47:15Z" "message":"Deployment does not have minimum availability." 
"reason":"MinimumReplicasUnavailable" "status":"False" "type":"Available"] map["lastTransitionTime":"2019-09-11T06:47:15Z" "lastUpdateTime":"2019-09-11T06:47:15Z" "message":"ReplicaSet \"nginx-8484dd655\" is progressing." "reason":"ReplicaSetUpdated" "status":"True" "type":"Progressing"]] "observedGeneration":'\x01' "replicas":'\x03' "unavailableReplicas":'\x03' "updatedReplicas":'\x03']]}
I0911 06:47:20.135] for: "hack/testdata/deployment-label-change2.yaml": Operation cannot be fulfilled on deployments.apps "nginx": the object has been modified; please apply your changes to the latest version and try again
I0911 06:47:20.135] has:Error from server (Conflict)
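The apply conflict above comes from kubectl's three-way patch: it diffs the kubectl.kubernetes.io/last-applied-configuration annotation, the new manifest, and the live object, and the server rejects the result because the live object moved on (resourceVersion "99" vs "589"). A deliberately simplified sketch of the three-way logic for top-level fields only; the real strategic merge patch handles nested objects, lists, and merge keys far more carefully:

```python
def three_way_patch(last_applied: dict, desired: dict, live: dict) -> dict:
    """Compute a patch kubectl-apply style (top-level fields only):
    - fields in the manifest that differ from the live object are set;
    - fields the user previously applied but dropped from the manifest
      are deleted (None marker, as in JSON merge patch);
    - fields the user never managed (e.g. server-set uid) are untouched.
    This is an illustrative simplification, not kubectl's implementation."""
    patch = {}
    for key, value in desired.items():
        if live.get(key) != value:
            patch[key] = value
    for key in last_applied:
        if key not in desired and key in live:
            patch[key] = None  # user stopped managing this field
    return patch
```

In the log, the patch changes the selector from nginx1 to nginx2, and the Conflict is the server's optimistic-concurrency check rejecting it against a newer live object.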
W0911 06:47:20.236] I0911 06:47:15.863189   53012 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1568184433-4659", Name:"nginx", UID:"70a94698-f75d-43a0-aab2-52a8e65acf80", APIVersion:"apps/v1", ResourceVersion:"576", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-8484dd655 to 3
W0911 06:47:20.236] I0911 06:47:15.865520   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184433-4659", Name:"nginx-8484dd655", UID:"180b226f-a8eb-4f93-aad8-3010e4e67aec", APIVersion:"apps/v1", ResourceVersion:"577", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-8484dd655-5psgq
W0911 06:47:20.236] I0911 06:47:15.870134   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184433-4659", Name:"nginx-8484dd655", UID:"180b226f-a8eb-4f93-aad8-3010e4e67aec", APIVersion:"apps/v1", ResourceVersion:"577", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-8484dd655-n62xb
W0911 06:47:20.237] I0911 06:47:15.871328   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184433-4659", Name:"nginx-8484dd655", UID:"180b226f-a8eb-4f93-aad8-3010e4e67aec", APIVersion:"apps/v1", ResourceVersion:"577", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-8484dd655-mj94j
W0911 06:47:22.854] I0911 06:47:22.853649   53012 horizontal.go:341] Horizontal Pod Autoscaler frontend has been deleted in namespace-1568184426-8467
W0911 06:47:24.534] E0911 06:47:24.533821   53012 replica_set.go:450] Sync "namespace-1568184433-4659/nginx-8484dd655" failed with Operation cannot be fulfilled on replicasets.apps "nginx-8484dd655": StorageError: invalid object, Code: 4, Key: /registry/replicasets/namespace-1568184433-4659/nginx-8484dd655, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 180b226f-a8eb-4f93-aad8-3010e4e67aec, UID in object meta: 
I0911 06:47:25.313] deployment.apps/nginx configured
I0911 06:47:25.400] Successful
I0911 06:47:25.400] message:        "name": "nginx2"
I0911 06:47:25.401]           "name": "nginx2"
I0911 06:47:25.401] has:"name": "nginx2"
W0911 06:47:25.501] I0911 06:47:25.315772   53012 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1568184433-4659", Name:"nginx", UID:"69b650c7-d6bd-4bfb-b257-af117be7b57d", APIVersion:"apps/v1", ResourceVersion:"612", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-668b6c7744 to 3
... skipping 141 lines ...
I0911 06:47:32.656] +++ [0911 06:47:32] Creating namespace namespace-1568184452-25099
I0911 06:47:32.731] namespace/namespace-1568184452-25099 created
I0911 06:47:32.798] Context "test" modified.
I0911 06:47:32.804] +++ [0911 06:47:32] Testing kubectl get
I0911 06:47:32.893] get.sh:29: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0911 06:47:32.997] Successful
I0911 06:47:32.998] message:Error from server (NotFound): pods "abc" not found
I0911 06:47:32.998] has:pods "abc" not found
I0911 06:47:33.088] get.sh:37: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0911 06:47:33.182] Successful
I0911 06:47:33.182] message:Error from server (NotFound): pods "abc" not found
I0911 06:47:33.182] has:pods "abc" not found
I0911 06:47:33.292] get.sh:45: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0911 06:47:33.376] Successful
I0911 06:47:33.376] message:{
I0911 06:47:33.377]     "apiVersion": "v1",
I0911 06:47:33.377]     "items": [],
... skipping 23 lines ...
I0911 06:47:33.752] has not:No resources found
I0911 06:47:33.841] Successful
I0911 06:47:33.841] message:NAME
I0911 06:47:33.841] has not:No resources found
I0911 06:47:33.942] get.sh:73: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0911 06:47:34.036] Successful
I0911 06:47:34.036] message:error: the server doesn't have a resource type "foobar"
I0911 06:47:34.036] has not:No resources found
I0911 06:47:34.112] Successful
I0911 06:47:34.112] message:No resources found in namespace-1568184452-25099 namespace.
I0911 06:47:34.112] has:No resources found
I0911 06:47:34.190] Successful
I0911 06:47:34.190] message:
I0911 06:47:34.190] has not:No resources found
I0911 06:47:34.266] Successful
I0911 06:47:34.267] message:No resources found in namespace-1568184452-25099 namespace.
I0911 06:47:34.267] has:No resources found
I0911 06:47:34.341] get.sh:93: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0911 06:47:34.425] Successful
I0911 06:47:34.425] message:Error from server (NotFound): pods "abc" not found
I0911 06:47:34.425] has:pods "abc" not found
I0911 06:47:34.427] FAIL!
I0911 06:47:34.427] message:Error from server (NotFound): pods "abc" not found
I0911 06:47:34.427] has not:List
I0911 06:47:34.428] 99 /go/src/k8s.io/kubernetes/test/cmd/../../test/cmd/get.sh
I0911 06:47:34.548] Successful
I0911 06:47:34.548] message:I0911 06:47:34.504629   62984 loader.go:375] Config loaded from file:  /tmp/tmp.m9fj6vVh1G/.kube/config
I0911 06:47:34.548] I0911 06:47:34.506013   62984 round_trippers.go:443] GET http://127.0.0.1:8080/version?timeout=32s 200 OK in 0 milliseconds
I0911 06:47:34.548] I0911 06:47:34.526723   62984 round_trippers.go:443] GET http://127.0.0.1:8080/api/v1/namespaces/default/pods 200 OK in 2 milliseconds
... skipping 660 lines ...
I0911 06:47:40.123] Successful
I0911 06:47:40.123] message:NAME    DATA   AGE
I0911 06:47:40.123] one     0      1s
I0911 06:47:40.123] three   0      0s
I0911 06:47:40.123] two     0      1s
I0911 06:47:40.124] STATUS    REASON          MESSAGE
I0911 06:47:40.124] Failure   InternalError   an error on the server ("unable to decode an event from the watch stream: net/http: request canceled (Client.Timeout exceeded while reading body)") has prevented the request from succeeding
I0911 06:47:40.124] has not:watch is only supported on individual resources
I0911 06:47:41.203] Successful
I0911 06:47:41.203] message:STATUS    REASON          MESSAGE
I0911 06:47:41.204] Failure   InternalError   an error on the server ("unable to decode an event from the watch stream: net/http: request canceled (Client.Timeout exceeded while reading body)") has prevented the request from succeeding
I0911 06:47:41.204] has not:watch is only supported on individual resources
I0911 06:47:41.208] +++ [0911 06:47:41] Creating namespace namespace-1568184461-26388
I0911 06:47:41.296] namespace/namespace-1568184461-26388 created
I0911 06:47:41.379] Context "test" modified.
I0911 06:47:41.465] get.sh:157: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0911 06:47:41.629] pod/valid-pod created
... skipping 56 lines ...
I0911 06:47:41.737] }
I0911 06:47:41.813] get.sh:162: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
I0911 06:47:42.046] <no value>Successful
I0911 06:47:42.046] message:valid-pod:
I0911 06:47:42.046] has:valid-pod:
I0911 06:47:42.125] Successful
I0911 06:47:42.125] message:error: error executing jsonpath "{.missing}": Error executing template: missing is not found. Printing more information for debugging the template:
I0911 06:47:42.126] 	template was:
I0911 06:47:42.126] 		{.missing}
I0911 06:47:42.127] 	object given to jsonpath engine was:
I0911 06:47:42.128] 		map[string]interface {}{"apiVersion":"v1", "kind":"Pod", "metadata":map[string]interface {}{"creationTimestamp":"2019-09-11T06:47:41Z", "labels":map[string]interface {}{"name":"valid-pod"}, "name":"valid-pod", "namespace":"namespace-1568184461-26388", "resourceVersion":"688", "selfLink":"/api/v1/namespaces/namespace-1568184461-26388/pods/valid-pod", "uid":"c06f54aa-0d9a-4b9e-981d-1dd528a64b5c"}, "spec":map[string]interface {}{"containers":[]interface {}{map[string]interface {}{"image":"k8s.gcr.io/serve_hostname", "imagePullPolicy":"Always", "name":"kubernetes-serve-hostname", "resources":map[string]interface {}{"limits":map[string]interface {}{"cpu":"1", "memory":"512Mi"}, "requests":map[string]interface {}{"cpu":"1", "memory":"512Mi"}}, "terminationMessagePath":"/dev/termination-log", "terminationMessagePolicy":"File"}}, "dnsPolicy":"ClusterFirst", "enableServiceLinks":true, "priority":0, "restartPolicy":"Always", "schedulerName":"default-scheduler", "securityContext":map[string]interface {}{}, "terminationGracePeriodSeconds":30}, "status":map[string]interface {}{"phase":"Pending", "qosClass":"Guaranteed"}}
I0911 06:47:42.129] has:missing is not found
I0911 06:47:42.204] Successful
I0911 06:47:42.205] message:Error executing template: template: output:1:2: executing "output" at <.missing>: map has no entry for key "missing". Printing more information for debugging the template:
I0911 06:47:42.205] 	template was:
I0911 06:47:42.205] 		{{.missing}}
I0911 06:47:42.206] 	raw data was:
I0911 06:47:42.206] 		{"apiVersion":"v1","kind":"Pod","metadata":{"creationTimestamp":"2019-09-11T06:47:41Z","labels":{"name":"valid-pod"},"name":"valid-pod","namespace":"namespace-1568184461-26388","resourceVersion":"688","selfLink":"/api/v1/namespaces/namespace-1568184461-26388/pods/valid-pod","uid":"c06f54aa-0d9a-4b9e-981d-1dd528a64b5c"},"spec":{"containers":[{"image":"k8s.gcr.io/serve_hostname","imagePullPolicy":"Always","name":"kubernetes-serve-hostname","resources":{"limits":{"cpu":"1","memory":"512Mi"},"requests":{"cpu":"1","memory":"512Mi"}},"terminationMessagePath":"/dev/termination-log","terminationMessagePolicy":"File"}],"dnsPolicy":"ClusterFirst","enableServiceLinks":true,"priority":0,"restartPolicy":"Always","schedulerName":"default-scheduler","securityContext":{},"terminationGracePeriodSeconds":30},"status":{"phase":"Pending","qosClass":"Guaranteed"}}
I0911 06:47:42.207] 	object given to template engine was:
I0911 06:47:42.207] 		map[apiVersion:v1 kind:Pod metadata:map[creationTimestamp:2019-09-11T06:47:41Z labels:map[name:valid-pod] name:valid-pod namespace:namespace-1568184461-26388 resourceVersion:688 selfLink:/api/v1/namespaces/namespace-1568184461-26388/pods/valid-pod uid:c06f54aa-0d9a-4b9e-981d-1dd528a64b5c] spec:map[containers:[map[image:k8s.gcr.io/serve_hostname imagePullPolicy:Always name:kubernetes-serve-hostname resources:map[limits:map[cpu:1 memory:512Mi] requests:map[cpu:1 memory:512Mi]] terminationMessagePath:/dev/termination-log terminationMessagePolicy:File]] dnsPolicy:ClusterFirst enableServiceLinks:true priority:0 restartPolicy:Always schedulerName:default-scheduler securityContext:map[] terminationGracePeriodSeconds:30] status:map[phase:Pending qosClass:Guaranteed]]
I0911 06:47:42.207] has:map has no entry for key "missing"
W0911 06:47:42.308] error: error executing template "{{.missing}}": template: output:1:2: executing "output" at <.missing>: map has no entry for key "missing"
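Both failures above are the same root cause seen through two printers: jsonpath's `{.missing}` and the Go template `{{.missing}}` each look up a key that is absent from the pod object. The traversal boils down to a dotted-path walk over nested maps; a small sketch (the error wording is modeled on, not copied from, the jsonpath output above):

```python
def jsonpath_get(obj, path: str):
    """Walk a dotted path like '.metadata.name' through nested dicts,
    failing with a descriptive error when any segment is absent --
    mirroring why '{.missing}' errors instead of printing empty."""
    current = obj
    for part in path.strip(".").split("."):
        if not isinstance(current, dict) or part not in current:
            raise KeyError(f"{part} is not found")
        current = current[part]
    return current
```

With the pod object from the log, `.metadata.name` resolves to valid-pod while `.missing` fails at the first segment, just as both template engines report.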
I0911 06:47:43.290] Successful
I0911 06:47:43.290] message:NAME        READY   STATUS    RESTARTS   AGE
I0911 06:47:43.290] valid-pod   0/1     Pending   0          1s
I0911 06:47:43.290] STATUS      REASON          MESSAGE
I0911 06:47:43.291] Failure     InternalError   an error on the server ("unable to decode an event from the watch stream: net/http: request canceled (Client.Timeout exceeded while reading body)") has prevented the request from succeeding
I0911 06:47:43.291] has:STATUS
I0911 06:47:43.291] Successful
I0911 06:47:43.292] message:NAME        READY   STATUS    RESTARTS   AGE
I0911 06:47:43.292] valid-pod   0/1     Pending   0          1s
I0911 06:47:43.292] STATUS      REASON          MESSAGE
I0911 06:47:43.292] Failure     InternalError   an error on the server ("unable to decode an event from the watch stream: net/http: request canceled (Client.Timeout exceeded while reading body)") has prevented the request from succeeding
I0911 06:47:43.292] has:valid-pod
I0911 06:47:44.363] Successful
I0911 06:47:44.364] message:pod/valid-pod
I0911 06:47:44.364] has not:STATUS
I0911 06:47:44.365] Successful
I0911 06:47:44.365] message:pod/valid-pod
... skipping 72 lines ...
I0911 06:47:45.460] status:
I0911 06:47:45.461]   phase: Pending
I0911 06:47:45.461]   qosClass: Guaranteed
I0911 06:47:45.461] ---
I0911 06:47:45.461] has:name: valid-pod
I0911 06:47:45.524] Successful
I0911 06:47:45.524] message:Error from server (NotFound): pods "invalid-pod" not found
I0911 06:47:45.524] has:"invalid-pod" not found
I0911 06:47:45.611] pod "valid-pod" deleted
I0911 06:47:45.704] get.sh:200: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0911 06:47:45.844] pod/redis-master created
I0911 06:47:45.852] pod/valid-pod created
I0911 06:47:45.941] Successful
... skipping 35 lines ...
I0911 06:47:47.047] +++ command: run_kubectl_exec_pod_tests
I0911 06:47:47.056] +++ [0911 06:47:47] Creating namespace namespace-1568184467-4615
I0911 06:47:47.127] namespace/namespace-1568184467-4615 created
I0911 06:47:47.192] Context "test" modified.
I0911 06:47:47.198] +++ [0911 06:47:47] Testing kubectl exec POD COMMAND
I0911 06:47:47.273] Successful
I0911 06:47:47.274] message:Error from server (NotFound): pods "abc" not found
I0911 06:47:47.274] has:pods "abc" not found
I0911 06:47:47.428] pod/test-pod created
I0911 06:47:47.522] Successful
I0911 06:47:47.523] message:Error from server (BadRequest): pod test-pod does not have a host assigned
I0911 06:47:47.523] has not:pods "test-pod" not found
I0911 06:47:47.524] Successful
I0911 06:47:47.524] message:Error from server (BadRequest): pod test-pod does not have a host assigned
I0911 06:47:47.524] has not:pod or type/name must be specified
I0911 06:47:47.598] pod "test-pod" deleted
I0911 06:47:47.616] +++ exit code: 0
I0911 06:47:47.645] Recording: run_kubectl_exec_resource_name_tests
I0911 06:47:47.646] Running command: run_kubectl_exec_resource_name_tests
I0911 06:47:47.664] 
... skipping 2 lines ...
I0911 06:47:47.671] +++ command: run_kubectl_exec_resource_name_tests
I0911 06:47:47.680] +++ [0911 06:47:47] Creating namespace namespace-1568184467-26500
I0911 06:47:47.744] namespace/namespace-1568184467-26500 created
I0911 06:47:47.845] Context "test" modified.
I0911 06:47:47.852] +++ [0911 06:47:47] Testing kubectl exec TYPE/NAME COMMAND
I0911 06:47:47.958] Successful
I0911 06:47:47.958] message:error: the server doesn't have a resource type "foo"
I0911 06:47:47.958] has:error:
I0911 06:47:48.065] Successful
I0911 06:47:48.066] message:Error from server (NotFound): deployments.apps "bar" not found
I0911 06:47:48.066] has:"bar" not found
I0911 06:47:48.257] pod/test-pod created
I0911 06:47:48.428] replicaset.apps/frontend created
W0911 06:47:48.529] I0911 06:47:48.433937   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184467-26500", Name:"frontend", UID:"94792389-b924-42d4-9141-ed45dca32bf0", APIVersion:"apps/v1", ResourceVersion:"742", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-z6sxk
W0911 06:47:48.529] I0911 06:47:48.437786   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184467-26500", Name:"frontend", UID:"94792389-b924-42d4-9141-ed45dca32bf0", APIVersion:"apps/v1", ResourceVersion:"742", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-rktvk
W0911 06:47:48.530] I0911 06:47:48.438258   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184467-26500", Name:"frontend", UID:"94792389-b924-42d4-9141-ed45dca32bf0", APIVersion:"apps/v1", ResourceVersion:"742", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-zvx48
I0911 06:47:48.630] configmap/test-set-env-config created
I0911 06:47:48.683] Successful
I0911 06:47:48.683] message:error: cannot attach to *v1.ConfigMap: selector for *v1.ConfigMap not implemented
I0911 06:47:48.684] has:not implemented
I0911 06:47:48.768] Successful
I0911 06:47:48.769] message:Error from server (BadRequest): pod test-pod does not have a host assigned
I0911 06:47:48.769] has not:not found
I0911 06:47:48.770] Successful
I0911 06:47:48.770] message:Error from server (BadRequest): pod test-pod does not have a host assigned
I0911 06:47:48.771] has not:pod or type/name must be specified
I0911 06:47:48.860] Successful
I0911 06:47:48.860] message:Error from server (BadRequest): pod frontend-rktvk does not have a host assigned
I0911 06:47:48.860] has not:not found
I0911 06:47:48.863] Successful
I0911 06:47:48.863] message:Error from server (BadRequest): pod frontend-rktvk does not have a host assigned
I0911 06:47:48.864] has not:pod or type/name must be specified
I0911 06:47:48.932] pod "test-pod" deleted
I0911 06:47:49.002] replicaset.apps "frontend" deleted
I0911 06:47:49.075] configmap "test-set-env-config" deleted
I0911 06:47:49.091] +++ exit code: 0
I0911 06:47:49.119] Recording: run_create_secret_tests
I0911 06:47:49.119] Running command: run_create_secret_tests
I0911 06:47:49.138] 
I0911 06:47:49.140] +++ Running case: test-cmd.run_create_secret_tests 
I0911 06:47:49.142] +++ working dir: /go/src/k8s.io/kubernetes
I0911 06:47:49.144] +++ command: run_create_secret_tests
I0911 06:47:49.233] Successful
I0911 06:47:49.234] message:Error from server (NotFound): secrets "mysecret" not found
I0911 06:47:49.234] has:secrets "mysecret" not found
I0911 06:47:49.373] Successful
I0911 06:47:49.373] message:Error from server (NotFound): secrets "mysecret" not found
I0911 06:47:49.374] has:secrets "mysecret" not found
I0911 06:47:49.374] Successful
I0911 06:47:49.375] message:user-specified
I0911 06:47:49.375] has:user-specified
I0911 06:47:49.439] Successful
I0911 06:47:49.511] {"kind":"ConfigMap","apiVersion":"v1","metadata":{"name":"tester-update-cm","namespace":"default","selfLink":"/api/v1/namespaces/default/configmaps/tester-update-cm","uid":"fd89e2b2-a0c1-4526-ade7-da3437ca5628","resourceVersion":"762","creationTimestamp":"2019-09-11T06:47:49Z"}}
... skipping 2 lines ...
I0911 06:47:49.709] has:uid
I0911 06:47:49.789] Successful
I0911 06:47:49.790] message:{"kind":"ConfigMap","apiVersion":"v1","metadata":{"name":"tester-update-cm","namespace":"default","selfLink":"/api/v1/namespaces/default/configmaps/tester-update-cm","uid":"fd89e2b2-a0c1-4526-ade7-da3437ca5628","resourceVersion":"763","creationTimestamp":"2019-09-11T06:47:49Z"},"data":{"key1":"config1"}}
I0911 06:47:49.790] has:config1
I0911 06:47:49.855] {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Success","details":{"name":"tester-update-cm","kind":"configmaps","uid":"fd89e2b2-a0c1-4526-ade7-da3437ca5628"}}
I0911 06:47:49.941] Successful
I0911 06:47:49.942] message:Error from server (NotFound): configmaps "tester-update-cm" not found
I0911 06:47:49.942] has:configmaps "tester-update-cm" not found
I0911 06:47:49.954] +++ exit code: 0
I0911 06:47:49.981] Recording: run_kubectl_create_kustomization_directory_tests
I0911 06:47:49.981] Running command: run_kubectl_create_kustomization_directory_tests
I0911 06:47:50.000] 
I0911 06:47:50.002] +++ Running case: test-cmd.run_kubectl_create_kustomization_directory_tests 
... skipping 110 lines ...
W0911 06:47:52.530] I0911 06:47:50.464624   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184467-26500", Name:"test-the-deployment-69fdbb5f7d", UID:"3fc78b7e-0725-4587-a65b-bc1783559e85", APIVersion:"apps/v1", ResourceVersion:"772", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test-the-deployment-69fdbb5f7d-hlfqn
W0911 06:47:52.530] I0911 06:47:50.465186   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184467-26500", Name:"test-the-deployment-69fdbb5f7d", UID:"3fc78b7e-0725-4587-a65b-bc1783559e85", APIVersion:"apps/v1", ResourceVersion:"772", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test-the-deployment-69fdbb5f7d-7q8qj
I0911 06:47:53.511] Successful
I0911 06:47:53.512] message:NAME        READY   STATUS    RESTARTS   AGE
I0911 06:47:53.512] valid-pod   0/1     Pending   0          0s
I0911 06:47:53.512] STATUS      REASON          MESSAGE
I0911 06:47:53.512] Failure     InternalError   an error on the server ("unable to decode an event from the watch stream: net/http: request canceled (Client.Timeout exceeded while reading body)") has prevented the request from succeeding
I0911 06:47:53.512] has:Timeout exceeded while reading body
I0911 06:47:53.588] Successful
I0911 06:47:53.588] message:NAME        READY   STATUS    RESTARTS   AGE
I0911 06:47:53.588] valid-pod   0/1     Pending   0          1s
I0911 06:47:53.588] has:valid-pod
I0911 06:47:53.654] Successful
I0911 06:47:53.655] message:error: Invalid timeout value. Timeout must be a single integer in seconds, or an integer followed by a corresponding time unit (e.g. 1s | 2m | 3h)
I0911 06:47:53.655] has:Invalid timeout value
I0911 06:47:53.733] pod "valid-pod" deleted
I0911 06:47:53.751] +++ exit code: 0
I0911 06:47:53.780] Recording: run_crd_tests
I0911 06:47:53.780] Running command: run_crd_tests
I0911 06:47:53.803] 
... skipping 163 lines ...
W0911 06:47:58.391] I0911 06:47:56.559523   49461 controller.go:606] quota admission added evaluator for: foos.company.com
I0911 06:47:58.492] crd.sh:236: Successful get foos/test {{.patched}}: value1
I0911 06:47:58.493] foo.company.com/test patched
I0911 06:47:58.622] crd.sh:238: Successful get foos/test {{.patched}}: value2
I0911 06:47:58.716] foo.company.com/test patched
I0911 06:47:58.806] crd.sh:240: Successful get foos/test {{.patched}}: <no value>
I0911 06:47:58.970] +++ [0911 06:47:58] "kubectl patch --local" returns error as expected for CustomResource: error: cannot apply strategic merge patch for company.com/v1, Kind=Foo locally, try --type merge
I0911 06:47:59.040] {
I0911 06:47:59.040]     "apiVersion": "company.com/v1",
I0911 06:47:59.040]     "kind": "Foo",
I0911 06:47:59.041]     "metadata": {
I0911 06:47:59.041]         "annotations": {
I0911 06:47:59.041]             "kubernetes.io/change-cause": "kubectl patch foos/test --server=http://127.0.0.1:8080 --match-server-version=true --patch={\"patched\":null} --type=merge --record=true"
... skipping 184 lines ...
I0911 06:48:07.415] namespace "non-native-resources" deleted
W0911 06:48:12.473] I0911 06:48:12.472491   49461 client.go:361] parsed scheme: "endpoint"
W0911 06:48:12.473] I0911 06:48:12.472559   49461 endpoint.go:66] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0911 06:48:12.672] crd.sh:458: Successful get bars {{len .items}}: 0
I0911 06:48:12.842] customresourcedefinition.apiextensions.k8s.io "foos.company.com" deleted
I0911 06:48:12.942] customresourcedefinition.apiextensions.k8s.io "bars.company.com" deleted
W0911 06:48:13.043] Error from server (NotFound): namespaces "non-native-resources" not found
I0911 06:48:13.143] customresourcedefinition.apiextensions.k8s.io "resources.mygroup.example.com" deleted
I0911 06:48:13.147] customresourcedefinition.apiextensions.k8s.io "validfoos.company.com" deleted
I0911 06:48:13.177] +++ exit code: 0
I0911 06:48:13.207] Recording: run_cmd_with_img_tests
I0911 06:48:13.208] Running command: run_cmd_with_img_tests
I0911 06:48:13.228] 
... skipping 6 lines ...
I0911 06:48:13.378] +++ [0911 06:48:13] Testing cmd with image
I0911 06:48:13.460] Successful
I0911 06:48:13.461] message:deployment.apps/test1 created
I0911 06:48:13.461] has:deployment.apps/test1 created
I0911 06:48:13.530] deployment.apps "test1" deleted
I0911 06:48:13.607] Successful
I0911 06:48:13.607] message:error: Invalid image name "InvalidImageName": invalid reference format
I0911 06:48:13.608] has:error: Invalid image name "InvalidImageName": invalid reference format
I0911 06:48:13.620] +++ exit code: 0
I0911 06:48:13.652] +++ [0911 06:48:13] Testing recursive resources
I0911 06:48:13.657] +++ [0911 06:48:13] Creating namespace namespace-1568184493-10784
I0911 06:48:13.732] namespace/namespace-1568184493-10784 created
I0911 06:48:13.806] Context "test" modified.
I0911 06:48:13.895] generic-resources.sh:202: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0911 06:48:14.170] generic-resources.sh:206: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0911 06:48:14.172] Successful
I0911 06:48:14.172] message:pod/busybox0 created
I0911 06:48:14.172] pod/busybox1 created
I0911 06:48:14.173] error: error validating "hack/testdata/recursive/pod/pod/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
I0911 06:48:14.173] has:error validating data: kind not set
I0911 06:48:14.254] generic-resources.sh:211: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0911 06:48:14.465] generic-resources.sh:220: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: busybox:busybox:
I0911 06:48:14.468] Successful
I0911 06:48:14.468] message:error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I0911 06:48:14.468] has:Object 'Kind' is missing
I0911 06:48:14.567] generic-resources.sh:227: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0911 06:48:14.832] generic-resources.sh:231: Successful get pods {{range.items}}{{.metadata.labels.status}}:{{end}}: replaced:replaced:
I0911 06:48:14.834] Successful
I0911 06:48:14.835] message:pod/busybox0 replaced
I0911 06:48:14.835] pod/busybox1 replaced
I0911 06:48:14.835] error: error validating "hack/testdata/recursive/pod-modify/pod/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
I0911 06:48:14.835] has:error validating data: kind not set
I0911 06:48:14.916] generic-resources.sh:236: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0911 06:48:14.999] Successful
I0911 06:48:14.999] message:Name:         busybox0
I0911 06:48:14.999] Namespace:    namespace-1568184493-10784
I0911 06:48:15.000] Priority:     0
I0911 06:48:15.000] Node:         <none>
... skipping 159 lines ...
I0911 06:48:15.015] has:Object 'Kind' is missing
I0911 06:48:15.080] generic-resources.sh:246: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0911 06:48:15.235] generic-resources.sh:250: Successful get pods {{range.items}}{{.metadata.annotations.annotatekey}}:{{end}}: annotatevalue:annotatevalue:
I0911 06:48:15.238] Successful
I0911 06:48:15.239] message:pod/busybox0 annotated
I0911 06:48:15.239] pod/busybox1 annotated
I0911 06:48:15.239] error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I0911 06:48:15.239] has:Object 'Kind' is missing
I0911 06:48:15.316] generic-resources.sh:255: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0911 06:48:15.544] generic-resources.sh:259: Successful get pods {{range.items}}{{.metadata.labels.status}}:{{end}}: replaced:replaced:
I0911 06:48:15.546] Successful
I0911 06:48:15.547] message:Warning: kubectl apply should be used on resource created by either kubectl create --save-config or kubectl apply
I0911 06:48:15.547] pod/busybox0 configured
I0911 06:48:15.547] Warning: kubectl apply should be used on resource created by either kubectl create --save-config or kubectl apply
I0911 06:48:15.548] pod/busybox1 configured
I0911 06:48:15.548] error: error validating "hack/testdata/recursive/pod-modify/pod/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
I0911 06:48:15.548] has:error validating data: kind not set
I0911 06:48:15.629] generic-resources.sh:265: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
I0911 06:48:15.787] deployment.apps/nginx created
I0911 06:48:15.875] generic-resources.sh:269: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx:
I0911 06:48:15.957] (Bgeneric-resources.sh:270: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
I0911 06:48:16.129] generic-resources.sh:274: Successful get deployment nginx {{ .apiVersion }}: apps/v1
I0911 06:48:16.132] Successful
... skipping 41 lines ...
I0911 06:48:16.139] has:extensions/v1beta1
I0911 06:48:16.207] deployment.apps "nginx" deleted
W0911 06:48:16.307] kubectl run --generator=deployment/apps.v1 is DEPRECATED and will be removed in a future version. Use kubectl run --generator=run-pod/v1 or kubectl create instead.
W0911 06:48:16.308] I0911 06:48:13.450451   53012 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1568184493-1000", Name:"test1", UID:"0700f123-60a7-420f-ab81-aae0217e1bde", APIVersion:"apps/v1", ResourceVersion:"909", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set test1-6cdffdb5b8 to 1
W0911 06:48:16.309] I0911 06:48:13.454340   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184493-1000", Name:"test1-6cdffdb5b8", UID:"31f71f7f-ed98-4175-ba4c-f7bc91bccfe5", APIVersion:"apps/v1", ResourceVersion:"910", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test1-6cdffdb5b8-vmx7r
W0911 06:48:16.309] W0911 06:48:13.850288   49461 cacher.go:162] Terminating all watchers from cacher *unstructured.Unstructured
W0911 06:48:16.310] E0911 06:48:13.851907   53012 reflector.go:280] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to watch *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:48:16.310] W0911 06:48:13.960159   49461 cacher.go:162] Terminating all watchers from cacher *unstructured.Unstructured
W0911 06:48:16.310] E0911 06:48:13.961524   53012 reflector.go:280] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to watch *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:48:16.311] E0911 06:48:14.853238   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:48:16.312] E0911 06:48:14.962896   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:48:16.312] I0911 06:48:15.791382   53012 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1568184493-10784", Name:"nginx", UID:"efbac411-6b53-4356-bc9e-3ec74eb6cb22", APIVersion:"apps/v1", ResourceVersion:"935", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-f87d999f7 to 3
W0911 06:48:16.313] I0911 06:48:15.793902   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184493-10784", Name:"nginx-f87d999f7", UID:"4b61f09a-b098-481e-8bb9-0ad58fd77dcb", APIVersion:"apps/v1", ResourceVersion:"936", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-f87d999f7-hv55t
W0911 06:48:16.314] I0911 06:48:15.796242   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184493-10784", Name:"nginx-f87d999f7", UID:"4b61f09a-b098-481e-8bb9-0ad58fd77dcb", APIVersion:"apps/v1", ResourceVersion:"936", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-f87d999f7-szqtp
W0911 06:48:16.314] I0911 06:48:15.798020   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184493-10784", Name:"nginx-f87d999f7", UID:"4b61f09a-b098-481e-8bb9-0ad58fd77dcb", APIVersion:"apps/v1", ResourceVersion:"936", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-f87d999f7-fm47d
W0911 06:48:16.314] E0911 06:48:15.854397   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:48:16.315] E0911 06:48:15.963975   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:48:16.315] kubectl convert is DEPRECATED and will be removed in a future version.
W0911 06:48:16.315] In order to convert, kubectl apply the object to the cluster, then kubectl get at the desired version.
I0911 06:48:16.416] generic-resources.sh:281: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0911 06:48:16.482] generic-resources.sh:285: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0911 06:48:16.485] Successful
I0911 06:48:16.485] message:kubectl convert is DEPRECATED and will be removed in a future version.
I0911 06:48:16.485] In order to convert, kubectl apply the object to the cluster, then kubectl get at the desired version.
I0911 06:48:16.486] error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I0911 06:48:16.486] has:Object 'Kind' is missing
I0911 06:48:16.567] generic-resources.sh:290: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0911 06:48:16.650] Successful
I0911 06:48:16.651] message:busybox0:busybox1:error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I0911 06:48:16.651] has:busybox0:busybox1:
I0911 06:48:16.652] Successful
I0911 06:48:16.653] message:busybox0:busybox1:error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I0911 06:48:16.653] has:Object 'Kind' is missing
I0911 06:48:16.731] generic-resources.sh:299: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0911 06:48:16.806] pod/busybox0 labeled
I0911 06:48:16.806] pod/busybox1 labeled
I0911 06:48:16.807] error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I0911 06:48:16.897] generic-resources.sh:304: Successful get pods {{range.items}}{{.metadata.labels.mylabel}}:{{end}}: myvalue:myvalue:
I0911 06:48:16.899] Successful
I0911 06:48:16.899] message:pod/busybox0 labeled
I0911 06:48:16.899] pod/busybox1 labeled
I0911 06:48:16.900] error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I0911 06:48:16.900] has:Object 'Kind' is missing
I0911 06:48:16.984] generic-resources.sh:309: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0911 06:48:17.072] pod/busybox0 patched
I0911 06:48:17.073] pod/busybox1 patched
I0911 06:48:17.073] error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I0911 06:48:17.169] generic-resources.sh:314: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: prom/busybox:prom/busybox:
I0911 06:48:17.172] Successful
I0911 06:48:17.172] message:pod/busybox0 patched
I0911 06:48:17.172] pod/busybox1 patched
I0911 06:48:17.173] error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I0911 06:48:17.173] has:Object 'Kind' is missing
I0911 06:48:17.266] generic-resources.sh:319: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0911 06:48:17.440] generic-resources.sh:323: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0911 06:48:17.442] Successful
I0911 06:48:17.442] message:warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
I0911 06:48:17.442] pod "busybox0" force deleted
I0911 06:48:17.442] pod "busybox1" force deleted
I0911 06:48:17.443] error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I0911 06:48:17.443] has:Object 'Kind' is missing
I0911 06:48:17.525] generic-resources.sh:328: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
I0911 06:48:17.670] replicationcontroller/busybox0 created
I0911 06:48:17.676] replicationcontroller/busybox1 created
I0911 06:48:17.772] generic-resources.sh:332: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0911 06:48:17.859] generic-resources.sh:337: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0911 06:48:17.940] generic-resources.sh:338: Successful get rc busybox0 {{.spec.replicas}}: 1
I0911 06:48:18.044] generic-resources.sh:339: Successful get rc busybox1 {{.spec.replicas}}: 1
I0911 06:48:18.210] generic-resources.sh:344: Successful get hpa busybox0 {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 1 2 80
I0911 06:48:18.299] generic-resources.sh:345: Successful get hpa busybox1 {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 1 2 80
I0911 06:48:18.301] Successful
I0911 06:48:18.301] message:horizontalpodautoscaler.autoscaling/busybox0 autoscaled
I0911 06:48:18.301] horizontalpodautoscaler.autoscaling/busybox1 autoscaled
I0911 06:48:18.302] error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I0911 06:48:18.302] has:Object 'Kind' is missing
I0911 06:48:18.379] horizontalpodautoscaler.autoscaling "busybox0" deleted
I0911 06:48:18.468] horizontalpodautoscaler.autoscaling "busybox1" deleted
W0911 06:48:18.569] E0911 06:48:16.855756   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:48:18.570] E0911 06:48:16.965350   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:48:18.570] I0911 06:48:17.558386   53012 namespace_controller.go:171] Namespace has been deleted non-native-resources
W0911 06:48:18.570] I0911 06:48:17.673436   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1568184493-10784", Name:"busybox0", UID:"6b0bba91-ff5a-41a8-8569-d63152a01635", APIVersion:"v1", ResourceVersion:"966", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox0-b8xj9
W0911 06:48:18.571] error: error validating "hack/testdata/recursive/rc/rc/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
W0911 06:48:18.571] I0911 06:48:17.680438   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1568184493-10784", Name:"busybox1", UID:"cd698fba-b3b5-4e61-8c0b-0182cfc725cf", APIVersion:"v1", ResourceVersion:"968", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox1-hkcj7
W0911 06:48:18.572] E0911 06:48:17.857057   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:48:18.572] E0911 06:48:17.967243   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0911 06:48:18.673] generic-resources.sh:353: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0911 06:48:18.680] generic-resources.sh:354: Successful get rc busybox0 {{.spec.replicas}}: 1
I0911 06:48:18.766] generic-resources.sh:355: Successful get rc busybox1 {{.spec.replicas}}: 1
I0911 06:48:18.934] generic-resources.sh:359: Successful get service busybox0 {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: <no value> 80
I0911 06:48:19.010] generic-resources.sh:360: Successful get service busybox1 {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: <no value> 80
I0911 06:48:19.011] Successful
I0911 06:48:19.012] message:service/busybox0 exposed
I0911 06:48:19.012] service/busybox1 exposed
I0911 06:48:19.012] error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I0911 06:48:19.012] has:Object 'Kind' is missing
I0911 06:48:19.094] generic-resources.sh:366: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0911 06:48:19.178] generic-resources.sh:367: Successful get rc busybox0 {{.spec.replicas}}: 1
I0911 06:48:19.260] generic-resources.sh:368: Successful get rc busybox1 {{.spec.replicas}}: 1
I0911 06:48:19.470] generic-resources.sh:372: Successful get rc busybox0 {{.spec.replicas}}: 2
I0911 06:48:19.554] generic-resources.sh:373: Successful get rc busybox1 {{.spec.replicas}}: 2
I0911 06:48:19.557] Successful
I0911 06:48:19.557] message:replicationcontroller/busybox0 scaled
I0911 06:48:19.557] replicationcontroller/busybox1 scaled
I0911 06:48:19.558] error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I0911 06:48:19.558] has:Object 'Kind' is missing
I0911 06:48:19.639] generic-resources.sh:378: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0911 06:48:19.797] generic-resources.sh:382: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
I0911 06:48:19.799] Successful
I0911 06:48:19.800] message:warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
I0911 06:48:19.800] replicationcontroller "busybox0" force deleted
I0911 06:48:19.800] replicationcontroller "busybox1" force deleted
I0911 06:48:19.800] error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I0911 06:48:19.800] has:Object 'Kind' is missing
I0911 06:48:19.873] generic-resources.sh:387: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
I0911 06:48:20.000] deployment.apps/nginx1-deployment created
I0911 06:48:20.005] deployment.apps/nginx0-deployment created
I0911 06:48:20.098] generic-resources.sh:391: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx0-deployment:nginx1-deployment:
I0911 06:48:20.182] generic-resources.sh:392: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:k8s.gcr.io/nginx:1.7.9:
I0911 06:48:20.396] generic-resources.sh:396: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:k8s.gcr.io/nginx:1.7.9:
I0911 06:48:20.398] Successful
I0911 06:48:20.399] message:deployment.apps/nginx1-deployment skipped rollback (current template already matches revision 1)
I0911 06:48:20.399] deployment.apps/nginx0-deployment skipped rollback (current template already matches revision 1)
I0911 06:48:20.400] error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
I0911 06:48:20.400] has:Object 'Kind' is missing
I0911 06:48:20.500] deployment.apps/nginx1-deployment paused
I0911 06:48:20.502] deployment.apps/nginx0-deployment paused
I0911 06:48:20.596] generic-resources.sh:404: Successful get deployment {{range.items}}{{.spec.paused}}:{{end}}: true:true:
I0911 06:48:20.598] (BSuccessful
I0911 06:48:20.599] message:unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
... skipping 10 lines ...
I0911 06:48:20.872] 1         <none>
I0911 06:48:20.872] 
I0911 06:48:20.873] deployment.apps/nginx0-deployment 
I0911 06:48:20.873] REVISION  CHANGE-CAUSE
I0911 06:48:20.873] 1         <none>
I0911 06:48:20.873] 
I0911 06:48:20.873] error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
I0911 06:48:20.873] has:nginx0-deployment
I0911 06:48:20.874] Successful
I0911 06:48:20.874] message:deployment.apps/nginx1-deployment 
I0911 06:48:20.874] REVISION  CHANGE-CAUSE
I0911 06:48:20.874] 1         <none>
I0911 06:48:20.874] 
I0911 06:48:20.874] deployment.apps/nginx0-deployment 
I0911 06:48:20.874] REVISION  CHANGE-CAUSE
I0911 06:48:20.875] 1         <none>
I0911 06:48:20.875] 
I0911 06:48:20.875] error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
I0911 06:48:20.875] has:nginx1-deployment
I0911 06:48:20.876] Successful
I0911 06:48:20.876] message:deployment.apps/nginx1-deployment 
I0911 06:48:20.876] REVISION  CHANGE-CAUSE
I0911 06:48:20.876] 1         <none>
I0911 06:48:20.876] 
I0911 06:48:20.877] deployment.apps/nginx0-deployment 
I0911 06:48:20.877] REVISION  CHANGE-CAUSE
I0911 06:48:20.877] 1         <none>
I0911 06:48:20.877] 
I0911 06:48:20.877] error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
I0911 06:48:20.878] has:Object 'Kind' is missing
I0911 06:48:20.943] deployment.apps "nginx1-deployment" force deleted
I0911 06:48:20.948] deployment.apps "nginx0-deployment" force deleted
W0911 06:48:21.048] E0911 06:48:18.858026   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:48:21.049] E0911 06:48:18.968269   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:48:21.049] I0911 06:48:19.339449   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1568184493-10784", Name:"busybox0", UID:"6b0bba91-ff5a-41a8-8569-d63152a01635", APIVersion:"v1", ResourceVersion:"987", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox0-8w78z
W0911 06:48:21.050] I0911 06:48:19.347600   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1568184493-10784", Name:"busybox1", UID:"cd698fba-b3b5-4e61-8c0b-0182cfc725cf", APIVersion:"v1", ResourceVersion:"991", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox1-mldrh
W0911 06:48:21.050] E0911 06:48:19.859168   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:48:21.050] E0911 06:48:19.969498   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:48:21.051] I0911 06:48:20.003536   53012 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1568184493-10784", Name:"nginx1-deployment", UID:"3c2d24ac-4e5d-47b4-ab09-2e83a41dca50", APIVersion:"apps/v1", ResourceVersion:"1007", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx1-deployment-7bdbbfb5cf to 2
W0911 06:48:21.051] error: error validating "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
W0911 06:48:21.051] I0911 06:48:20.006357   53012 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1568184493-10784", Name:"nginx0-deployment", UID:"c5086c9d-ad0a-49cb-b067-1cb226730c8c", APIVersion:"apps/v1", ResourceVersion:"1009", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx0-deployment-57c6bff7f6 to 2
W0911 06:48:21.052] I0911 06:48:20.006518   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184493-10784", Name:"nginx1-deployment-7bdbbfb5cf", UID:"2f38d75e-1f31-4c18-8f28-320879f75cef", APIVersion:"apps/v1", ResourceVersion:"1008", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx1-deployment-7bdbbfb5cf-2qmhg
W0911 06:48:21.052] I0911 06:48:20.009687   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184493-10784", Name:"nginx1-deployment-7bdbbfb5cf", UID:"2f38d75e-1f31-4c18-8f28-320879f75cef", APIVersion:"apps/v1", ResourceVersion:"1008", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx1-deployment-7bdbbfb5cf-6h76l
W0911 06:48:21.052] I0911 06:48:20.010115   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184493-10784", Name:"nginx0-deployment-57c6bff7f6", UID:"ab46ecaa-5a13-4046-838b-4866412fe33b", APIVersion:"apps/v1", ResourceVersion:"1012", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx0-deployment-57c6bff7f6-hqvzf
W0911 06:48:21.053] I0911 06:48:20.012690   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184493-10784", Name:"nginx0-deployment-57c6bff7f6", UID:"ab46ecaa-5a13-4046-838b-4866412fe33b", APIVersion:"apps/v1", ResourceVersion:"1012", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx0-deployment-57c6bff7f6-wzfts
W0911 06:48:21.053] E0911 06:48:20.860115   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:48:21.053] warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
W0911 06:48:21.054] error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
W0911 06:48:21.054] E0911 06:48:20.970436   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:48:21.861] E0911 06:48:21.861312   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:48:21.972] E0911 06:48:21.971648   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0911 06:48:22.072] generic-resources.sh:426: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
I0911 06:48:22.174] replicationcontroller/busybox0 created
I0911 06:48:22.177] replicationcontroller/busybox1 created
I0911 06:48:22.273] generic-resources.sh:430: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0911 06:48:22.356] Successful
I0911 06:48:22.357] message:no rollbacker has been implemented for "ReplicationController"
... skipping 2 lines ...
I0911 06:48:22.358] has:no rollbacker has been implemented for "ReplicationController"
I0911 06:48:22.359] Successful
I0911 06:48:22.360] message:no rollbacker has been implemented for "ReplicationController"
I0911 06:48:22.360] no rollbacker has been implemented for "ReplicationController"
I0911 06:48:22.361] unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I0911 06:48:22.361] has:Object 'Kind' is missing
W0911 06:48:22.462] error: error validating "hack/testdata/recursive/rc/rc/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
W0911 06:48:22.463] I0911 06:48:22.177427   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1568184493-10784", Name:"busybox0", UID:"47677d2f-1f5c-4662-b714-d5b8abae950e", APIVersion:"v1", ResourceVersion:"1057", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox0-qrwfl
W0911 06:48:22.463] I0911 06:48:22.181710   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1568184493-10784", Name:"busybox1", UID:"af7744fa-815f-4a65-95dd-8cc8aa317ce1", APIVersion:"v1", ResourceVersion:"1059", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox1-bqzp5
I0911 06:48:22.563] Successful
I0911 06:48:22.564] message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I0911 06:48:22.565] error: replicationcontrollers "busybox0" pausing is not supported
I0911 06:48:22.565] error: replicationcontrollers "busybox1" pausing is not supported
I0911 06:48:22.565] has:Object 'Kind' is missing
I0911 06:48:22.565] Successful
I0911 06:48:22.566] message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I0911 06:48:22.566] error: replicationcontrollers "busybox0" pausing is not supported
I0911 06:48:22.566] error: replicationcontrollers "busybox1" pausing is not supported
I0911 06:48:22.567] has:replicationcontrollers "busybox0" pausing is not supported
I0911 06:48:22.567] Successful
I0911 06:48:22.567] message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I0911 06:48:22.568] error: replicationcontrollers "busybox0" pausing is not supported
I0911 06:48:22.568] error: replicationcontrollers "busybox1" pausing is not supported
I0911 06:48:22.568] has:replicationcontrollers "busybox1" pausing is not supported
I0911 06:48:22.587] Successful
I0911 06:48:22.587] message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I0911 06:48:22.588] error: replicationcontrollers "busybox0" resuming is not supported
I0911 06:48:22.588] error: replicationcontrollers "busybox1" resuming is not supported
I0911 06:48:22.588] has:Object 'Kind' is missing
I0911 06:48:22.588] Successful
I0911 06:48:22.589] message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I0911 06:48:22.589] error: replicationcontrollers "busybox0" resuming is not supported
I0911 06:48:22.589] error: replicationcontrollers "busybox1" resuming is not supported
I0911 06:48:22.590] has:replicationcontrollers "busybox0" resuming is not supported
I0911 06:48:22.590] Successful
I0911 06:48:22.591] message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I0911 06:48:22.591] error: replicationcontrollers "busybox0" resuming is not supported
I0911 06:48:22.591] error: replicationcontrollers "busybox1" resuming is not supported
I0911 06:48:22.591] has:replicationcontrollers "busybox0" resuming is not supported
I0911 06:48:22.678] replicationcontroller "busybox0" force deleted
I0911 06:48:22.693] replicationcontroller "busybox1" force deleted
W0911 06:48:22.794] warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
W0911 06:48:22.795] error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
W0911 06:48:22.863] E0911 06:48:22.862669   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:48:22.973] E0911 06:48:22.972936   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0911 06:48:23.700] Recording: run_namespace_tests
I0911 06:48:23.701] Running command: run_namespace_tests
I0911 06:48:23.721] 
I0911 06:48:23.723] +++ Running case: test-cmd.run_namespace_tests 
I0911 06:48:23.725] +++ working dir: /go/src/k8s.io/kubernetes
I0911 06:48:23.727] +++ command: run_namespace_tests
I0911 06:48:23.735] +++ [0911 06:48:23] Testing kubectl(v1:namespaces)
I0911 06:48:23.798] namespace/my-namespace created
I0911 06:48:23.897] core.sh:1308: Successful get namespaces/my-namespace {{.metadata.name}}: my-namespace
I0911 06:48:23.986] namespace "my-namespace" deleted
W0911 06:48:24.087] E0911 06:48:23.863977   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:48:24.087] E0911 06:48:23.974486   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:48:24.828] I0911 06:48:24.828116   53012 shared_informer.go:197] Waiting for caches to sync for resource quota
W0911 06:48:24.865] E0911 06:48:24.865371   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:48:24.929] I0911 06:48:24.928542   53012 shared_informer.go:204] Caches are synced for resource quota 
W0911 06:48:24.976] E0911 06:48:24.975644   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:48:25.370] I0911 06:48:25.369743   53012 shared_informer.go:197] Waiting for caches to sync for garbage collector
W0911 06:48:25.471] I0911 06:48:25.470129   53012 shared_informer.go:204] Caches are synced for garbage collector 
W0911 06:48:25.867] E0911 06:48:25.866627   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:48:25.978] E0911 06:48:25.977414   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:48:26.870] E0911 06:48:26.869468   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:48:26.979] E0911 06:48:26.978729   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:48:27.873] E0911 06:48:27.872071   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:48:27.981] E0911 06:48:27.980569   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:48:28.877] E0911 06:48:28.875008   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:48:28.983] E0911 06:48:28.982663   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0911 06:48:29.106] namespace/my-namespace condition met
I0911 06:48:29.187] Successful
I0911 06:48:29.188] message:Error from server (NotFound): namespaces "my-namespace" not found
I0911 06:48:29.188] has: not found
I0911 06:48:29.256] namespace/my-namespace created
I0911 06:48:29.362] core.sh:1317: Successful get namespaces/my-namespace {{.metadata.name}}: my-namespace
I0911 06:48:29.585] Successful
I0911 06:48:29.586] message:warning: deleting cluster-scoped resources, not scoped to the provided namespace
I0911 06:48:29.587] namespace "kube-node-lease" deleted
... skipping 29 lines ...
I0911 06:48:29.599] namespace "namespace-1568184470-24813" deleted
I0911 06:48:29.599] namespace "namespace-1568184471-3815" deleted
I0911 06:48:29.599] namespace "namespace-1568184473-2771" deleted
I0911 06:48:29.600] namespace "namespace-1568184475-23703" deleted
I0911 06:48:29.600] namespace "namespace-1568184493-1000" deleted
I0911 06:48:29.600] namespace "namespace-1568184493-10784" deleted
I0911 06:48:29.601] Error from server (Forbidden): namespaces "default" is forbidden: this namespace may not be deleted
I0911 06:48:29.602] Error from server (Forbidden): namespaces "kube-public" is forbidden: this namespace may not be deleted
I0911 06:48:29.602] Error from server (Forbidden): namespaces "kube-system" is forbidden: this namespace may not be deleted
I0911 06:48:29.603] has:warning: deleting cluster-scoped resources
I0911 06:48:29.603] Successful
I0911 06:48:29.603] message:warning: deleting cluster-scoped resources, not scoped to the provided namespace
I0911 06:48:29.604] namespace "kube-node-lease" deleted
I0911 06:48:29.604] namespace "my-namespace" deleted
I0911 06:48:29.605] namespace "namespace-1568184383-14929" deleted
... skipping 27 lines ...
I0911 06:48:29.610] namespace "namespace-1568184470-24813" deleted
I0911 06:48:29.610] namespace "namespace-1568184471-3815" deleted
I0911 06:48:29.610] namespace "namespace-1568184473-2771" deleted
I0911 06:48:29.610] namespace "namespace-1568184475-23703" deleted
I0911 06:48:29.611] namespace "namespace-1568184493-1000" deleted
I0911 06:48:29.611] namespace "namespace-1568184493-10784" deleted
I0911 06:48:29.611] Error from server (Forbidden): namespaces "default" is forbidden: this namespace may not be deleted
I0911 06:48:29.611] Error from server (Forbidden): namespaces "kube-public" is forbidden: this namespace may not be deleted
I0911 06:48:29.611] Error from server (Forbidden): namespaces "kube-system" is forbidden: this namespace may not be deleted
I0911 06:48:29.612] has:namespace "my-namespace" deleted
I0911 06:48:29.694] core.sh:1329: Successful get namespaces {{range.items}}{{ if eq .metadata.name \"other\" }}found{{end}}{{end}}:: :
I0911 06:48:29.771] namespace/other created
I0911 06:48:29.865] core.sh:1333: Successful get namespaces/other {{.metadata.name}}: other
I0911 06:48:29.943] core.sh:1337: Successful get pods --namespace=other {{range.items}}{{.metadata.name}}:{{end}}: 
I0911 06:48:30.093] pod/valid-pod created
I0911 06:48:30.181] core.sh:1341: Successful get pods --namespace=other {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
I0911 06:48:30.277] (Bcore.sh:1343: Successful get pods -n other {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
I0911 06:48:30.353] Successful
I0911 06:48:30.353] message:error: a resource cannot be retrieved by name across all namespaces
I0911 06:48:30.354] has:a resource cannot be retrieved by name across all namespaces
I0911 06:48:30.442] core.sh:1350: Successful get pods --namespace=other {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
I0911 06:48:30.519] pod "valid-pod" force deleted
I0911 06:48:30.617] core.sh:1354: Successful get pods --namespace=other {{range.items}}{{.metadata.name}}:{{end}}: 
I0911 06:48:30.688] namespace "other" deleted
W0911 06:48:30.789] E0911 06:48:29.876700   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:48:30.790] E0911 06:48:29.983939   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:48:30.790] warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
W0911 06:48:30.879] E0911 06:48:30.878573   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:48:30.985] E0911 06:48:30.985168   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:48:31.880] E0911 06:48:31.879964   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:48:31.987] E0911 06:48:31.987401   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:48:32.882] E0911 06:48:32.881565   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:48:32.989] E0911 06:48:32.988885   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:48:33.120] I0911 06:48:33.119805   53012 horizontal.go:341] Horizontal Pod Autoscaler busybox0 has been deleted in namespace-1568184493-10784
W0911 06:48:33.124] I0911 06:48:33.123870   53012 horizontal.go:341] Horizontal Pod Autoscaler busybox1 has been deleted in namespace-1568184493-10784
W0911 06:48:33.883] E0911 06:48:33.883168   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:48:33.990] E0911 06:48:33.990323   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:48:34.886] E0911 06:48:34.886364   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:48:34.993] E0911 06:48:34.992621   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0911 06:48:35.784] +++ exit code: 0
I0911 06:48:35.814] Recording: run_secrets_test
I0911 06:48:35.814] Running command: run_secrets_test
I0911 06:48:35.838] 
I0911 06:48:35.841] +++ Running case: test-cmd.run_secrets_test 
I0911 06:48:35.843] +++ working dir: /go/src/k8s.io/kubernetes
... skipping 37 lines ...
I0911 06:48:36.089] metadata:
I0911 06:48:36.089]   creationTimestamp: null
I0911 06:48:36.089]   name: test
I0911 06:48:36.089] has not:example.com
I0911 06:48:36.183] core.sh:725: Successful get namespaces {{range.items}}{{ if eq .metadata.name \"test-secrets\" }}found{{end}}{{end}}:: :
I0911 06:48:36.265] namespace/test-secrets created
W0911 06:48:36.366] E0911 06:48:35.887679   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:48:36.367] E0911 06:48:35.993793   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:48:36.367] I0911 06:48:36.074338   69193 loader.go:375] Config loaded from file:  /tmp/tmp.m9fj6vVh1G/.kube/config
I0911 06:48:36.468] core.sh:729: Successful get namespaces/test-secrets {{.metadata.name}}: test-secrets
I0911 06:48:36.492] core.sh:733: Successful get secrets --namespace=test-secrets {{range.items}}{{.metadata.name}}:{{end}}: 
I0911 06:48:36.582] secret/test-secret created
I0911 06:48:36.668] core.sh:737: Successful get secret/test-secret --namespace=test-secrets {{.metadata.name}}: test-secret
I0911 06:48:36.758] core.sh:738: Successful get secret/test-secret --namespace=test-secrets {{.type}}: test-type
... skipping 9 lines ...
I0911 06:48:37.770] core.sh:767: Successful get secret/test-secret --namespace=test-secrets {{.type}}: kubernetes.io/tls
I0911 06:48:37.840] secret "test-secret" deleted
I0911 06:48:37.913] secret/test-secret created
I0911 06:48:38.014] core.sh:773: Successful get secret/test-secret --namespace=test-secrets {{.metadata.name}}: test-secret
I0911 06:48:38.113] core.sh:774: Successful get secret/test-secret --namespace=test-secrets {{.type}}: kubernetes.io/tls
I0911 06:48:38.198] secret "test-secret" deleted
W0911 06:48:38.299] E0911 06:48:36.889261   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:48:38.299] E0911 06:48:36.994981   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:48:38.300] E0911 06:48:37.890465   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:48:38.300] E0911 06:48:37.995933   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0911 06:48:38.401] secret/secret-string-data created
I0911 06:48:38.501] core.sh:796: Successful get secret/secret-string-data --namespace=test-secrets  {{.data}}: map[k1:djE= k2:djI=]
I0911 06:48:38.602] core.sh:797: Successful get secret/secret-string-data --namespace=test-secrets  {{.data}}: map[k1:djE= k2:djI=]
I0911 06:48:38.692] core.sh:798: Successful get secret/secret-string-data --namespace=test-secrets  {{.stringData}}: <no value>
I0911 06:48:38.769] secret "secret-string-data" deleted
I0911 06:48:38.862] core.sh:807: Successful get secrets --namespace=test-secrets {{range.items}}{{.metadata.name}}:{{end}}: 
I0911 06:48:39.055] secret "test-secret" deleted
I0911 06:48:39.133] namespace "test-secrets" deleted
W0911 06:48:39.234] E0911 06:48:38.891703   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:48:39.234] E0911 06:48:38.999081   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:48:39.235] I0911 06:48:39.177997   53012 namespace_controller.go:171] Namespace has been deleted my-namespace
W0911 06:48:39.672] I0911 06:48:39.671682   53012 namespace_controller.go:171] Namespace has been deleted kube-node-lease
W0911 06:48:39.689] I0911 06:48:39.689081   53012 namespace_controller.go:171] Namespace has been deleted namespace-1568184383-14929
W0911 06:48:39.692] I0911 06:48:39.691893   53012 namespace_controller.go:171] Namespace has been deleted namespace-1568184396-13246
W0911 06:48:39.700] I0911 06:48:39.699835   53012 namespace_controller.go:171] Namespace has been deleted namespace-1568184400-2061
W0911 06:48:39.715] I0911 06:48:39.714566   53012 namespace_controller.go:171] Namespace has been deleted namespace-1568184388-25069
W0911 06:48:39.732] I0911 06:48:39.731512   53012 namespace_controller.go:171] Namespace has been deleted namespace-1568184401-25870
W0911 06:48:39.742] I0911 06:48:39.742431   53012 namespace_controller.go:171] Namespace has been deleted namespace-1568184396-19732
W0911 06:48:39.745] I0911 06:48:39.744909   53012 namespace_controller.go:171] Namespace has been deleted namespace-1568184400-8069
W0911 06:48:39.747] I0911 06:48:39.747378   53012 namespace_controller.go:171] Namespace has been deleted namespace-1568184385-14364
W0911 06:48:39.849] I0911 06:48:39.849455   53012 namespace_controller.go:171] Namespace has been deleted namespace-1568184394-24338
W0911 06:48:39.894] E0911 06:48:39.893544   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:48:39.902] I0911 06:48:39.901739   53012 namespace_controller.go:171] Namespace has been deleted namespace-1568184410-3457
W0911 06:48:39.933] I0911 06:48:39.932536   53012 namespace_controller.go:171] Namespace has been deleted namespace-1568184411-8547
W0911 06:48:39.935] I0911 06:48:39.935316   53012 namespace_controller.go:171] Namespace has been deleted namespace-1568184421-28456
W0911 06:48:39.936] I0911 06:48:39.935323   53012 namespace_controller.go:171] Namespace has been deleted namespace-1568184422-24601
W0911 06:48:39.957] I0911 06:48:39.956587   53012 namespace_controller.go:171] Namespace has been deleted namespace-1568184425-2515
W0911 06:48:39.968] I0911 06:48:39.967460   53012 namespace_controller.go:171] Namespace has been deleted namespace-1568184424-30699
W0911 06:48:40.001] E0911 06:48:40.000585   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:48:40.001] I0911 06:48:40.000972   53012 namespace_controller.go:171] Namespace has been deleted namespace-1568184429-27795
W0911 06:48:40.017] I0911 06:48:40.016988   53012 namespace_controller.go:171] Namespace has been deleted namespace-1568184426-8467
W0911 06:48:40.018] I0911 06:48:40.018222   53012 namespace_controller.go:171] Namespace has been deleted namespace-1568184428-17646
W0911 06:48:40.104] I0911 06:48:40.104230   53012 namespace_controller.go:171] Namespace has been deleted namespace-1568184431-22060
W0911 06:48:40.141] I0911 06:48:40.141247   53012 namespace_controller.go:171] Namespace has been deleted namespace-1568184433-18424
W0911 06:48:40.217] I0911 06:48:40.216874   53012 namespace_controller.go:171] Namespace has been deleted namespace-1568184451-13872
... skipping 8 lines ...
W0911 06:48:40.365] I0911 06:48:40.364800   53012 namespace_controller.go:171] Namespace has been deleted namespace-1568184471-3815
W0911 06:48:40.396] I0911 06:48:40.394671   53012 namespace_controller.go:171] Namespace has been deleted namespace-1568184475-23703
W0911 06:48:40.414] I0911 06:48:40.413547   53012 namespace_controller.go:171] Namespace has been deleted namespace-1568184493-1000
W0911 06:48:40.414] I0911 06:48:40.413628   53012 namespace_controller.go:171] Namespace has been deleted namespace-1568184473-2771
W0911 06:48:40.480] I0911 06:48:40.479218   53012 namespace_controller.go:171] Namespace has been deleted namespace-1568184493-10784
W0911 06:48:40.765] I0911 06:48:40.764411   53012 namespace_controller.go:171] Namespace has been deleted other
W0911 06:48:40.895] E0911 06:48:40.894737   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:48:41.002] E0911 06:48:41.001952   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:48:41.896] E0911 06:48:41.896289   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:48:42.004] E0911 06:48:42.003479   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:48:42.900] E0911 06:48:42.899283   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:48:43.005] E0911 06:48:43.004996   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:48:43.904] E0911 06:48:43.902899   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:48:44.006] E0911 06:48:44.006285   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0911 06:48:44.253] +++ exit code: 0
I0911 06:48:44.295] Recording: run_configmap_tests
I0911 06:48:44.296] Running command: run_configmap_tests
I0911 06:48:44.326] 
I0911 06:48:44.329] +++ Running case: test-cmd.run_configmap_tests 
I0911 06:48:44.331] +++ working dir: /go/src/k8s.io/kubernetes
... skipping 13 lines ...
I0911 06:48:45.421] configmap/test-configmap created
I0911 06:48:45.495] configmap/test-binary-configmap created
I0911 06:48:45.584] core.sh:48: Successful get configmap/test-configmap --namespace=test-configmaps {{.metadata.name}}: test-configmap
I0911 06:48:45.666] core.sh:49: Successful get configmap/test-binary-configmap --namespace=test-configmaps {{.metadata.name}}: test-binary-configmap
I0911 06:48:45.967] configmap "test-configmap" deleted
I0911 06:48:46.065] configmap "test-binary-configmap" deleted
W0911 06:48:46.166] E0911 06:48:44.904398   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:48:46.167] E0911 06:48:45.007510   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:48:46.167] E0911 06:48:45.905720   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:48:46.167] E0911 06:48:46.008813   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0911 06:48:46.268] namespace "test-configmaps" deleted
W0911 06:48:46.907] E0911 06:48:46.906902   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:48:47.011] E0911 06:48:47.010665   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:48:47.908] E0911 06:48:47.908145   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:48:48.013] E0911 06:48:48.012371   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:48:48.910] E0911 06:48:48.909755   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:48:49.014] E0911 06:48:49.013657   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:48:49.228] I0911 06:48:49.227270   53012 namespace_controller.go:171] Namespace has been deleted test-secrets
W0911 06:48:49.911] E0911 06:48:49.911210   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:48:50.015] E0911 06:48:50.014975   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:48:50.916] E0911 06:48:50.915521   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:48:51.021] E0911 06:48:51.020032   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0911 06:48:51.301] +++ exit code: 0
I0911 06:48:51.335] Recording: run_client_config_tests
I0911 06:48:51.336] Running command: run_client_config_tests
I0911 06:48:51.356] 
I0911 06:48:51.358] +++ Running case: test-cmd.run_client_config_tests 
I0911 06:48:51.362] +++ working dir: /go/src/k8s.io/kubernetes
I0911 06:48:51.364] +++ command: run_client_config_tests
I0911 06:48:51.376] +++ [0911 06:48:51] Creating namespace namespace-1568184531-14833
I0911 06:48:51.445] namespace/namespace-1568184531-14833 created
I0911 06:48:51.533] Context "test" modified.
I0911 06:48:51.540] +++ [0911 06:48:51] Testing client config
I0911 06:48:51.628] Successful
I0911 06:48:51.628] message:error: stat missing: no such file or directory
I0911 06:48:51.628] has:missing: no such file or directory
I0911 06:48:51.722] Successful
I0911 06:48:51.723] message:error: stat missing: no such file or directory
I0911 06:48:51.723] has:missing: no such file or directory
I0911 06:48:51.808] Successful
I0911 06:48:51.809] message:error: stat missing: no such file or directory
I0911 06:48:51.809] has:missing: no such file or directory
I0911 06:48:51.893] Successful
I0911 06:48:51.894] message:Error in configuration: context was not found for specified context: missing-context
I0911 06:48:51.894] has:context was not found for specified context: missing-context
I0911 06:48:51.963] Successful
I0911 06:48:51.963] message:error: no server found for cluster "missing-cluster"
I0911 06:48:51.964] has:no server found for cluster "missing-cluster"
I0911 06:48:52.031] Successful
I0911 06:48:52.032] message:error: auth info "missing-user" does not exist
I0911 06:48:52.032] has:auth info "missing-user" does not exist
W0911 06:48:52.133] E0911 06:48:51.916942   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:48:52.133] E0911 06:48:52.021629   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0911 06:48:52.234] Successful
I0911 06:48:52.235] message:error: error loading config file "/tmp/newconfig.yaml": no kind "Config" is registered for version "v-1" in scheme "k8s.io/client-go/tools/clientcmd/api/latest/latest.go:50"
I0911 06:48:52.235] has:error loading config file
I0911 06:48:52.235] Successful
I0911 06:48:52.235] message:error: stat missing-config: no such file or directory
I0911 06:48:52.235] has:no such file or directory
I0911 06:48:52.246] +++ exit code: 0
I0911 06:48:52.275] Recording: run_service_accounts_tests
I0911 06:48:52.276] Running command: run_service_accounts_tests
I0911 06:48:52.297] 
I0911 06:48:52.300] +++ Running case: test-cmd.run_service_accounts_tests 
... skipping 7 lines ...
I0911 06:48:52.657] namespace/test-service-accounts created
I0911 06:48:52.748] core.sh:832: Successful get namespaces/test-service-accounts {{.metadata.name}}: test-service-accounts
I0911 06:48:52.818] serviceaccount/test-service-account created
I0911 06:48:52.904] core.sh:838: Successful get serviceaccount/test-service-account --namespace=test-service-accounts {{.metadata.name}}: test-service-account
I0911 06:48:52.984] serviceaccount "test-service-account" deleted
I0911 06:48:53.062] namespace "test-service-accounts" deleted
W0911 06:48:53.163] E0911 06:48:52.918624   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:48:53.164] E0911 06:48:53.023414   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:48:53.921] E0911 06:48:53.921049   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:48:54.025] E0911 06:48:54.024671   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:48:54.923] E0911 06:48:54.922349   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:48:55.026] E0911 06:48:55.026093   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:48:55.925] E0911 06:48:55.925197   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:48:56.027] E0911 06:48:56.027402   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:48:56.265] I0911 06:48:56.264988   53012 namespace_controller.go:171] Namespace has been deleted test-configmaps
W0911 06:48:56.927] E0911 06:48:56.926546   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:48:57.029] E0911 06:48:57.028687   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:48:57.928] E0911 06:48:57.928131   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:48:58.030] E0911 06:48:58.029905   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0911 06:48:58.224] +++ exit code: 0
I0911 06:48:58.269] Recording: run_job_tests
I0911 06:48:58.270] Running command: run_job_tests
I0911 06:48:58.291] 
I0911 06:48:58.293] +++ Running case: test-cmd.run_job_tests 
I0911 06:48:58.295] +++ working dir: /go/src/k8s.io/kubernetes
... skipping 14 lines ...
I0911 06:48:59.114] Labels:                        run=pi
I0911 06:48:59.114] Annotations:                   <none>
I0911 06:48:59.114] Schedule:                      59 23 31 2 *
I0911 06:48:59.114] Concurrency Policy:            Allow
I0911 06:48:59.114] Suspend:                       False
I0911 06:48:59.114] Successful Job History Limit:  3
I0911 06:48:59.115] Failed Job History Limit:      1
I0911 06:48:59.115] Starting Deadline Seconds:     <unset>
I0911 06:48:59.115] Selector:                      <unset>
I0911 06:48:59.115] Parallelism:                   <unset>
I0911 06:48:59.115] Completions:                   <unset>
I0911 06:48:59.115] Pod Template:
I0911 06:48:59.115]   Labels:  run=pi
... skipping 19 lines ...
I0911 06:48:59.194] Successful
I0911 06:48:59.195] message:job.batch/test-job
I0911 06:48:59.195] has:job.batch/test-job
I0911 06:48:59.273] batch.sh:48: Successful get jobs {{range.items}}{{.metadata.name}}{{end}}: 
I0911 06:48:59.353] job.batch/test-job created
W0911 06:48:59.455] kubectl run --generator=cronjob/v1beta1 is DEPRECATED and will be removed in a future version. Use kubectl run --generator=run-pod/v1 or kubectl create instead.
W0911 06:48:59.456] E0911 06:48:58.929675   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:48:59.457] E0911 06:48:59.031669   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:48:59.458] I0911 06:48:59.352352   53012 event.go:255] Event(v1.ObjectReference{Kind:"Job", Namespace:"test-jobs", Name:"test-job", UID:"a787bac0-1f34-4024-85ee-e77cebb41f73", APIVersion:"batch/v1", ResourceVersion:"1381", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test-job-4pstz
I0911 06:48:59.559] batch.sh:53: Successful get job/test-job --namespace=test-jobs {{.metadata.name}}: test-job
I0911 06:48:59.599] NAME       COMPLETIONS   DURATION   AGE
I0911 06:48:59.599] test-job   0/1           0s         0s
I0911 06:48:59.700] Name:           test-job
I0911 06:48:59.700] Namespace:      test-jobs
... skipping 3 lines ...
I0911 06:48:59.701]                 run=pi
I0911 06:48:59.702] Annotations:    cronjob.kubernetes.io/instantiate: manual
I0911 06:48:59.702] Controlled By:  CronJob/pi
I0911 06:48:59.702] Parallelism:    1
I0911 06:48:59.702] Completions:    1
I0911 06:48:59.702] Start Time:     Wed, 11 Sep 2019 06:48:59 +0000
I0911 06:48:59.703] Pods Statuses:  1 Running / 0 Succeeded / 0 Failed
I0911 06:48:59.703] Pod Template:
I0911 06:48:59.704]   Labels:  controller-uid=a787bac0-1f34-4024-85ee-e77cebb41f73
I0911 06:48:59.704]            job-name=test-job
I0911 06:48:59.704]            run=pi
I0911 06:48:59.704]   Containers:
I0911 06:48:59.705]    pi:
... skipping 15 lines ...
I0911 06:48:59.710]   Type    Reason            Age   From            Message
I0911 06:48:59.710]   ----    ------            ----  ----            -------
I0911 06:48:59.710]   Normal  SuccessfulCreate  0s    job-controller  Created pod: test-job-4pstz
I0911 06:48:59.781] job.batch "test-job" deleted
I0911 06:48:59.881] cronjob.batch "pi" deleted
I0911 06:48:59.978] namespace "test-jobs" deleted
W0911 06:49:00.079] E0911 06:48:59.930644   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:00.079] E0911 06:49:00.032821   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:00.932] E0911 06:49:00.932315   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:01.034] E0911 06:49:01.034045   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:01.934] E0911 06:49:01.933720   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:02.036] E0911 06:49:02.035749   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:02.935] E0911 06:49:02.934959   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:03.037] E0911 06:49:03.037199   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:03.185] I0911 06:49:03.184811   53012 namespace_controller.go:171] Namespace has been deleted test-service-accounts
W0911 06:49:03.937] E0911 06:49:03.936477   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:04.045] E0911 06:49:04.042531   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:04.938] E0911 06:49:04.938173   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:05.044] E0911 06:49:05.043581   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0911 06:49:05.145] +++ exit code: 0
I0911 06:49:05.145] Recording: run_create_job_tests
I0911 06:49:05.145] Running command: run_create_job_tests
I0911 06:49:05.150] 
I0911 06:49:05.152] +++ Running case: test-cmd.run_create_job_tests 
I0911 06:49:05.155] +++ working dir: /go/src/k8s.io/kubernetes
... skipping 6 lines ...
I0911 06:49:05.588] job.batch "test-job" deleted
I0911 06:49:05.675] job.batch/test-job-pi created
I0911 06:49:05.765] create.sh:92: Successful get job test-job-pi {{(index .spec.template.spec.containers 0).image}}: k8s.gcr.io/perl
I0911 06:49:05.887] job.batch "test-job-pi" deleted
W0911 06:49:05.989] I0911 06:49:05.425073   53012 event.go:255] Event(v1.ObjectReference{Kind:"Job", Namespace:"namespace-1568184545-12286", Name:"test-job", UID:"71918961-f25f-426e-81ff-30cc9e6d9710", APIVersion:"batch/v1", ResourceVersion:"1400", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test-job-nc44t
W0911 06:49:05.990] I0911 06:49:05.670290   53012 event.go:255] Event(v1.ObjectReference{Kind:"Job", Namespace:"namespace-1568184545-12286", Name:"test-job-pi", UID:"0bfb1987-fd68-4692-8b05-07202766e602", APIVersion:"batch/v1", ResourceVersion:"1407", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test-job-pi-p4nlm
W0911 06:49:05.990] E0911 06:49:05.939879   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:05.998] kubectl run --generator=cronjob/v1beta1 is DEPRECATED and will be removed in a future version. Use kubectl run --generator=run-pod/v1 or kubectl create instead.
W0911 06:49:06.047] E0911 06:49:06.046542   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:06.130] I0911 06:49:06.129677   53012 event.go:255] Event(v1.ObjectReference{Kind:"Job", Namespace:"namespace-1568184545-12286", Name:"my-pi", UID:"62252efc-d05b-42a3-86e7-19c9d11551cd", APIVersion:"batch/v1", ResourceVersion:"1415", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: my-pi-dbm5f
I0911 06:49:06.231] cronjob.batch/test-pi created
I0911 06:49:06.232] job.batch/my-pi created
I0911 06:49:06.232] Successful
I0911 06:49:06.233] message:[perl -Mbignum=bpi -wle print bpi(10)]
I0911 06:49:06.233] has:perl -Mbignum=bpi -wle print bpi(10)
... skipping 13 lines ...
I0911 06:49:06.765] core.sh:1415: Successful get podtemplates {{range.items}}{{.metadata.name}}:{{end}}: 
I0911 06:49:06.912] podtemplate/nginx created
I0911 06:49:07.003] core.sh:1419: Successful get podtemplates {{range.items}}{{.metadata.name}}:{{end}}: nginx:
I0911 06:49:07.078] NAME    CONTAINERS   IMAGES   POD LABELS
I0911 06:49:07.078] nginx   nginx        nginx    name=nginx
W0911 06:49:07.179] I0911 06:49:06.909662   49461 controller.go:606] quota admission added evaluator for: podtemplates
W0911 06:49:07.180] E0911 06:49:06.941322   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:07.180] E0911 06:49:07.048222   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0911 06:49:07.280] core.sh:1427: Successful get podtemplates {{range.items}}{{.metadata.name}}:{{end}}: nginx:
I0911 06:49:07.314] podtemplate "nginx" deleted
I0911 06:49:07.406] core.sh:1431: Successful get podtemplate {{range.items}}{{.metadata.name}}:{{end}}: 
I0911 06:49:07.419] +++ exit code: 0
I0911 06:49:07.450] Recording: run_service_tests
I0911 06:49:07.450] Running command: run_service_tests
... skipping 34 lines ...
I0911 06:49:08.211] Port:              <unset>  6379/TCP
I0911 06:49:08.211] TargetPort:        6379/TCP
I0911 06:49:08.211] Endpoints:         <none>
I0911 06:49:08.211] Session Affinity:  None
I0911 06:49:08.211] Events:            <none>
I0911 06:49:08.211] 
W0911 06:49:08.312] E0911 06:49:07.943138   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:08.313] E0911 06:49:08.049405   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0911 06:49:08.418] core.sh:868: Successful describe
I0911 06:49:08.418] Name:              redis-master
I0911 06:49:08.419] Namespace:         default
I0911 06:49:08.419] Labels:            app=redis
I0911 06:49:08.419]                    role=master
I0911 06:49:08.419]                    tier=backend
... skipping 209 lines ...
I0911 06:49:09.634]   selector:
I0911 06:49:09.635]     role: padawan
I0911 06:49:09.636]   sessionAffinity: None
I0911 06:49:09.636]   type: ClusterIP
I0911 06:49:09.637] status:
I0911 06:49:09.637]   loadBalancer: {}
W0911 06:49:09.738] E0911 06:49:08.944579   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:09.739] E0911 06:49:09.050698   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:09.739] error: you must specify resources by --filename when --local is set.
W0911 06:49:09.739] Example resource specifications include:
W0911 06:49:09.739]    '-f rsrc.yaml'
W0911 06:49:09.740]    '--filename=rsrc.json'
I0911 06:49:09.840] core.sh:898: Successful get services redis-master {{range.spec.selector}}{{.}}:{{end}}: redis:master:backend:
I0911 06:49:09.979] (Bcore.sh:905: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:
I0911 06:49:10.092] (Bservice "redis-master" deleted
W0911 06:49:10.197] E0911 06:49:09.946373   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:10.197] E0911 06:49:10.054215   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:10.198] I0911 06:49:10.070209   53012 namespace_controller.go:171] Namespace has been deleted test-jobs
I0911 06:49:10.298] core.sh:912: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
I0911 06:49:10.299] core.sh:916: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
I0911 06:49:10.444] service/redis-master created
I0911 06:49:10.537] core.sh:920: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:
I0911 06:49:10.651] core.sh:924: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:
... skipping 3 lines ...
I0911 06:49:11.188] core.sh:952: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:service-v1-test:
I0911 06:49:11.267] service "redis-master" deleted
I0911 06:49:11.359] service "service-v1-test" deleted
I0911 06:49:11.458] core.sh:960: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
I0911 06:49:11.543] core.sh:964: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
I0911 06:49:11.683] service/redis-master created
W0911 06:49:11.784] E0911 06:49:10.948612   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:11.784] E0911 06:49:11.055464   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0911 06:49:11.884] service/redis-slave created
I0911 06:49:11.914] core.sh:969: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:redis-slave:
I0911 06:49:11.996] Successful
I0911 06:49:11.996] message:NAME           RSRC
I0911 06:49:11.996] kubernetes     145
I0911 06:49:11.996] redis-master   1449
... skipping 29 lines ...
I0911 06:49:13.540] +++ [0911 06:49:13] Creating namespace namespace-1568184553-16403
I0911 06:49:13.608] namespace/namespace-1568184553-16403 created
I0911 06:49:13.677] Context "test" modified.
I0911 06:49:13.683] +++ [0911 06:49:13] Testing kubectl(v1:daemonsets)
I0911 06:49:13.778] apps.sh:30: Successful get daemonsets {{range.items}}{{.metadata.name}}:{{end}}: 
I0911 06:49:13.924] daemonset.apps/bind created
W0911 06:49:14.025] E0911 06:49:11.950142   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:14.026] E0911 06:49:12.056908   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:14.027] kubectl run --generator=deployment/apps.v1 is DEPRECATED and will be removed in a future version. Use kubectl run --generator=run-pod/v1 or kubectl create instead.
W0911 06:49:14.027] I0911 06:49:12.906646   53012 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"default", Name:"testmetadata", UID:"8ffb13ee-5189-44ea-87b0-8937d35cae79", APIVersion:"apps/v1", ResourceVersion:"1465", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set testmetadata-bd968f46 to 2
W0911 06:49:14.028] I0911 06:49:12.913051   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"default", Name:"testmetadata-bd968f46", UID:"18777d9f-6fdc-446e-9334-82ee79a4963f", APIVersion:"apps/v1", ResourceVersion:"1466", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: testmetadata-bd968f46-2qg5q
W0911 06:49:14.028] I0911 06:49:12.915369   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"default", Name:"testmetadata-bd968f46", UID:"18777d9f-6fdc-446e-9334-82ee79a4963f", APIVersion:"apps/v1", ResourceVersion:"1466", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: testmetadata-bd968f46-7wqgj
W0911 06:49:14.028] E0911 06:49:12.951210   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:14.029] E0911 06:49:13.058148   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:14.029] I0911 06:49:13.920481   49461 controller.go:606] quota admission added evaluator for: daemonsets.apps
W0911 06:49:14.029] I0911 06:49:13.931385   49461 controller.go:606] quota admission added evaluator for: controllerrevisions.apps
W0911 06:49:14.030] E0911 06:49:13.954792   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:14.060] E0911 06:49:14.059557   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0911 06:49:14.161] apps.sh:34: Successful get daemonsets bind {{.metadata.generation}}: 1
I0911 06:49:14.195] daemonset.apps/bind configured
I0911 06:49:14.289] apps.sh:37: Successful get daemonsets bind {{.metadata.generation}}: 1
I0911 06:49:14.370] daemonset.apps/bind image updated
I0911 06:49:14.458] apps.sh:40: Successful get daemonsets bind {{.metadata.generation}}: 2
I0911 06:49:14.542] daemonset.apps/bind env updated
... skipping 16 lines ...
I0911 06:49:15.326] +++ [0911 06:49:15] Testing kubectl(v1:daemonsets, v1:controllerrevisions)
I0911 06:49:15.419] apps.sh:66: Successful get daemonsets {{range.items}}{{.metadata.name}}:{{end}}: 
I0911 06:49:15.577] daemonset.apps/bind created
I0911 06:49:15.680] apps.sh:70: Successful get controllerrevisions {{range.items}}{{.metadata.annotations}}:{{end}}: map[deprecated.daemonset.template.generation:1 kubectl.kubernetes.io/last-applied-configuration:{"apiVersion":"apps/v1","kind":"DaemonSet","metadata":{"annotations":{"kubernetes.io/change-cause":"kubectl apply --filename=hack/testdata/rollingupdate-daemonset.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true"},"labels":{"service":"bind"},"name":"bind","namespace":"namespace-1568184555-20163"},"spec":{"selector":{"matchLabels":{"service":"bind"}},"template":{"metadata":{"labels":{"service":"bind"}},"spec":{"affinity":{"podAntiAffinity":{"requiredDuringSchedulingIgnoredDuringExecution":[{"labelSelector":{"matchExpressions":[{"key":"service","operator":"In","values":["bind"]}]},"namespaces":[],"topologyKey":"kubernetes.io/hostname"}]}},"containers":[{"image":"k8s.gcr.io/pause:2.0","name":"kubernetes-pause"}]}},"updateStrategy":{"rollingUpdate":{"maxUnavailable":"10%"},"type":"RollingUpdate"}}}
I0911 06:49:15.680]  kubernetes.io/change-cause:kubectl apply --filename=hack/testdata/rollingupdate-daemonset.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true]:
I0911 06:49:15.787] daemonset.apps/bind skipped rollback (current template already matches revision 1)
W0911 06:49:15.890] E0911 06:49:14.956153   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:15.892] E0911 06:49:15.061380   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:15.958] E0911 06:49:15.957768   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0911 06:49:16.059] apps.sh:73: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:2.0:
I0911 06:49:16.059] apps.sh:74: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
I0911 06:49:16.196] daemonset.apps/bind configured
I0911 06:49:16.291] apps.sh:77: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:latest:
I0911 06:49:16.379] apps.sh:78: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
I0911 06:49:16.469] apps.sh:79: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 2
... skipping 15 lines ...
I0911 06:49:16.853] apps.sh:84: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
I0911 06:49:16.935] apps.sh:85: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 2
I0911 06:49:17.032] daemonset.apps/bind rolled back
I0911 06:49:17.126] apps.sh:88: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:2.0:
I0911 06:49:17.219] apps.sh:89: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
I0911 06:49:17.334] Successful
I0911 06:49:17.334] message:error: unable to find specified revision 1000000 in history
I0911 06:49:17.335] has:unable to find specified revision
I0911 06:49:17.427] apps.sh:93: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:2.0:
I0911 06:49:17.523] apps.sh:94: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
I0911 06:49:17.635] daemonset.apps/bind rolled back
I0911 06:49:17.729] apps.sh:97: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:latest:
I0911 06:49:17.810] (Bapps.sh:98: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
... skipping 7 lines ...
I0911 06:49:18.148] +++ working dir: /go/src/k8s.io/kubernetes
I0911 06:49:18.148] +++ command: run_rc_tests
I0911 06:49:18.158] +++ [0911 06:49:18] Creating namespace namespace-1568184558-9196
I0911 06:49:18.232] namespace/namespace-1568184558-9196 created
I0911 06:49:18.304] Context "test" modified.
I0911 06:49:18.311] +++ [0911 06:49:18] Testing kubectl(v1:replicationcontrollers)
W0911 06:49:18.412] E0911 06:49:16.062725   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:18.413] E0911 06:49:16.959474   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:18.418] E0911 06:49:17.041396   53012 daemon_controller.go:302] namespace-1568184555-20163/bind failed with : error storing status for daemon set &v1.DaemonSet{TypeMeta:v1.TypeMeta{Kind:"", APIVersion:""}, ObjectMeta:v1.ObjectMeta{Name:"bind", GenerateName:"", Namespace:"namespace-1568184555-20163", SelfLink:"/apis/apps/v1/namespaces/namespace-1568184555-20163/daemonsets/bind", UID:"cee2e3e3-c834-4934-b1e7-a33f51218892", ResourceVersion:"1531", Generation:3, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63703781355, loc:(*time.Location)(0x7750000)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"service":"bind"}, Annotations:map[string]string{"deprecated.daemonset.template.generation":"3", "kubectl.kubernetes.io/last-applied-configuration":"{\"apiVersion\":\"apps/v1\",\"kind\":\"DaemonSet\",\"metadata\":{\"annotations\":{\"kubernetes.io/change-cause\":\"kubectl apply --filename=hack/testdata/rollingupdate-daemonset-rv2.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true\"},\"labels\":{\"service\":\"bind\"},\"name\":\"bind\",\"namespace\":\"namespace-1568184555-20163\"},\"spec\":{\"selector\":{\"matchLabels\":{\"service\":\"bind\"}},\"template\":{\"metadata\":{\"labels\":{\"service\":\"bind\"}},\"spec\":{\"affinity\":{\"podAntiAffinity\":{\"requiredDuringSchedulingIgnoredDuringExecution\":[{\"labelSelector\":{\"matchExpressions\":[{\"key\":\"service\",\"operator\":\"In\",\"values\":[\"bind\"]}]},\"namespaces\":[],\"topologyKey\":\"kubernetes.io/hostname\"}]}},\"containers\":[{\"image\":\"k8s.gcr.io/pause:latest\",\"name\":\"kubernetes-pause\"},{\"image\":\"k8s.gcr.io/nginx:test-cmd\",\"name\":\"app\"}]}},\"updateStrategy\":{\"rollingUpdate\":{\"maxUnavailable\":\"10%\"},\"type\":\"RollingUpdate\"}}}\n", "kubernetes.io/change-cause":"kubectl apply --filename=hack/testdata/rollingupdate-daemonset-rv2.yaml --record=true --server=http://127.0.0.1:8080 
--match-server-version=true"}, OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v1.DaemonSetSpec{Selector:(*v1.LabelSelector)(0xc001b41320), Template:v1.PodTemplateSpec{ObjectMeta:v1.ObjectMeta{Name:"", GenerateName:"", Namespace:"", SelfLink:"", UID:"", ResourceVersion:"", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:0, loc:(*time.Location)(nil)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"service":"bind"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v1.PodSpec{Volumes:[]v1.Volume(nil), InitContainers:[]v1.Container(nil), Containers:[]v1.Container{v1.Container{Name:"kubernetes-pause", Image:"k8s.gcr.io/pause:2.0", Command:[]string(nil), Args:[]string(nil), WorkingDir:"", Ports:[]v1.ContainerPort(nil), EnvFrom:[]v1.EnvFromSource(nil), Env:[]v1.EnvVar(nil), Resources:v1.ResourceRequirements{Limits:v1.ResourceList(nil), Requests:v1.ResourceList(nil)}, VolumeMounts:[]v1.VolumeMount(nil), VolumeDevices:[]v1.VolumeDevice(nil), LivenessProbe:(*v1.Probe)(nil), ReadinessProbe:(*v1.Probe)(nil), StartupProbe:(*v1.Probe)(nil), Lifecycle:(*v1.Lifecycle)(nil), TerminationMessagePath:"/dev/termination-log", TerminationMessagePolicy:"File", ImagePullPolicy:"IfNotPresent", SecurityContext:(*v1.SecurityContext)(nil), Stdin:false, StdinOnce:false, TTY:false}}, EphemeralContainers:[]v1.EphemeralContainer(nil), RestartPolicy:"Always", TerminationGracePeriodSeconds:(*int64)(0xc002b131b8), ActiveDeadlineSeconds:(*int64)(nil), DNSPolicy:"ClusterFirst", NodeSelector:map[string]string(nil), ServiceAccountName:"", DeprecatedServiceAccount:"", AutomountServiceAccountToken:(*bool)(nil), NodeName:"", HostNetwork:false, HostPID:false, HostIPC:false, ShareProcessNamespace:(*bool)(nil), 
SecurityContext:(*v1.PodSecurityContext)(0xc0029627e0), ImagePullSecrets:[]v1.LocalObjectReference(nil), Hostname:"", Subdomain:"", Affinity:(*v1.Affinity)(0xc001b41340), SchedulerName:"default-scheduler", Tolerations:[]v1.Toleration(nil), HostAliases:[]v1.HostAlias(nil), PriorityClassName:"", Priority:(*int32)(nil), DNSConfig:(*v1.PodDNSConfig)(nil), ReadinessGates:[]v1.PodReadinessGate(nil), RuntimeClassName:(*string)(nil), EnableServiceLinks:(*bool)(nil), PreemptionPolicy:(*v1.PreemptionPolicy)(nil), Overhead:v1.ResourceList(nil), TopologySpreadConstraints:[]v1.TopologySpreadConstraint(nil)}}, UpdateStrategy:v1.DaemonSetUpdateStrategy{Type:"RollingUpdate", RollingUpdate:(*v1.RollingUpdateDaemonSet)(0xc000c7ce60)}, MinReadySeconds:0, RevisionHistoryLimit:(*int32)(0xc002b1323c)}, Status:v1.DaemonSetStatus{CurrentNumberScheduled:0, NumberMisscheduled:0, DesiredNumberScheduled:0, NumberReady:0, ObservedGeneration:2, UpdatedNumberScheduled:0, NumberAvailable:0, NumberUnavailable:0, CollisionCount:(*int32)(nil), Conditions:[]v1.DaemonSetCondition(nil)}}: Operation cannot be fulfilled on daemonsets.apps "bind": the object has been modified; please apply your changes to the latest version and try again
W0911 06:49:18.419] E0911 06:49:17.063993   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:18.419] E0911 06:49:17.966742   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:18.419] E0911 06:49:18.065486   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0911 06:49:18.520] core.sh:1046: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
I0911 06:49:18.589] replicationcontroller/frontend created
I0911 06:49:18.685] replicationcontroller "frontend" deleted
I0911 06:49:18.783] core.sh:1051: Successful get pods -l "name=frontend" {{range.items}}{{.metadata.name}}:{{end}}: 
I0911 06:49:18.874] core.sh:1055: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
I0911 06:49:19.018] replicationcontroller/frontend created
... skipping 3 lines ...
I0911 06:49:19.255] Namespace:    namespace-1568184558-9196
I0911 06:49:19.255] Selector:     app=guestbook,tier=frontend
I0911 06:49:19.256] Labels:       app=guestbook
I0911 06:49:19.256]               tier=frontend
I0911 06:49:19.256] Annotations:  <none>
I0911 06:49:19.256] Replicas:     3 current / 3 desired
I0911 06:49:19.257] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0911 06:49:19.257] Pod Template:
I0911 06:49:19.257]   Labels:  app=guestbook
I0911 06:49:19.257]            tier=frontend
I0911 06:49:19.257]   Containers:
I0911 06:49:19.258]    php-redis:
I0911 06:49:19.258]     Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 17 lines ...
I0911 06:49:19.365] Namespace:    namespace-1568184558-9196
I0911 06:49:19.365] Selector:     app=guestbook,tier=frontend
I0911 06:49:19.366] Labels:       app=guestbook
I0911 06:49:19.366]               tier=frontend
I0911 06:49:19.366] Annotations:  <none>
I0911 06:49:19.367] Replicas:     3 current / 3 desired
I0911 06:49:19.367] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0911 06:49:19.368] Pod Template:
I0911 06:49:19.368]   Labels:  app=guestbook
I0911 06:49:19.369]            tier=frontend
I0911 06:49:19.369]   Containers:
I0911 06:49:19.369]    php-redis:
I0911 06:49:19.370]     Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 18 lines ...
I0911 06:49:19.474] Namespace:    namespace-1568184558-9196
I0911 06:49:19.475] Selector:     app=guestbook,tier=frontend
I0911 06:49:19.475] Labels:       app=guestbook
I0911 06:49:19.475]               tier=frontend
I0911 06:49:19.475] Annotations:  <none>
I0911 06:49:19.475] Replicas:     3 current / 3 desired
I0911 06:49:19.476] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0911 06:49:19.476] Pod Template:
I0911 06:49:19.476]   Labels:  app=guestbook
I0911 06:49:19.476]            tier=frontend
I0911 06:49:19.477]   Containers:
I0911 06:49:19.477]    php-redis:
I0911 06:49:19.477]     Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 12 lines ...
I0911 06:49:19.584] Namespace:    namespace-1568184558-9196
I0911 06:49:19.584] Selector:     app=guestbook,tier=frontend
I0911 06:49:19.584] Labels:       app=guestbook
I0911 06:49:19.585]               tier=frontend
I0911 06:49:19.585] Annotations:  <none>
I0911 06:49:19.585] Replicas:     3 current / 3 desired
I0911 06:49:19.586] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0911 06:49:19.586] Pod Template:
I0911 06:49:19.587]   Labels:  app=guestbook
I0911 06:49:19.587]            tier=frontend
I0911 06:49:19.587]   Containers:
I0911 06:49:19.588]    php-redis:
I0911 06:49:19.588]     Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 13 lines ...
I0911 06:49:19.593]   Normal  SuccessfulCreate  0s    replication-controller  Created pod: frontend-nqgfz
I0911 06:49:19.593]   Normal  SuccessfulCreate  0s    replication-controller  Created pod: frontend-fgstc
W0911 06:49:19.694] I0911 06:49:18.594771   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1568184558-9196", Name:"frontend", UID:"9d15af42-81d6-4c68-b307-dce62eae160d", APIVersion:"v1", ResourceVersion:"1543", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-dmmt2
W0911 06:49:19.694] I0911 06:49:18.598361   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1568184558-9196", Name:"frontend", UID:"9d15af42-81d6-4c68-b307-dce62eae160d", APIVersion:"v1", ResourceVersion:"1543", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-wffh9
W0911 06:49:19.695] I0911 06:49:18.598413   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1568184558-9196", Name:"frontend", UID:"9d15af42-81d6-4c68-b307-dce62eae160d", APIVersion:"v1", ResourceVersion:"1543", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-tkwtt
W0911 06:49:19.695] E0911 06:49:18.968319   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:19.695] I0911 06:49:19.021149   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1568184558-9196", Name:"frontend", UID:"46797ccf-ed70-4252-8ef7-eb00d0a0f906", APIVersion:"v1", ResourceVersion:"1559", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-lvshr
W0911 06:49:19.696] I0911 06:49:19.023432   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1568184558-9196", Name:"frontend", UID:"46797ccf-ed70-4252-8ef7-eb00d0a0f906", APIVersion:"v1", ResourceVersion:"1559", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-nqgfz
W0911 06:49:19.696] I0911 06:49:19.024488   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1568184558-9196", Name:"frontend", UID:"46797ccf-ed70-4252-8ef7-eb00d0a0f906", APIVersion:"v1", ResourceVersion:"1559", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-fgstc
W0911 06:49:19.696] E0911 06:49:19.066771   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0911 06:49:19.796] Successful describe rc:
I0911 06:49:19.797] Name:         frontend
I0911 06:49:19.797] Namespace:    namespace-1568184558-9196
I0911 06:49:19.797] Selector:     app=guestbook,tier=frontend
I0911 06:49:19.797] Labels:       app=guestbook
I0911 06:49:19.797]               tier=frontend
I0911 06:49:19.797] Annotations:  <none>
I0911 06:49:19.797] Replicas:     3 current / 3 desired
I0911 06:49:19.798] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0911 06:49:19.798] Pod Template:
I0911 06:49:19.798]   Labels:  app=guestbook
I0911 06:49:19.798]            tier=frontend
I0911 06:49:19.798]   Containers:
I0911 06:49:19.798]    php-redis:
I0911 06:49:19.798]     Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 17 lines ...
I0911 06:49:19.825] Namespace:    namespace-1568184558-9196
I0911 06:49:19.825] Selector:     app=guestbook,tier=frontend
I0911 06:49:19.825] Labels:       app=guestbook
I0911 06:49:19.825]               tier=frontend
I0911 06:49:19.826] Annotations:  <none>
I0911 06:49:19.826] Replicas:     3 current / 3 desired
I0911 06:49:19.826] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0911 06:49:19.826] Pod Template:
I0911 06:49:19.827]   Labels:  app=guestbook
I0911 06:49:19.827]            tier=frontend
I0911 06:49:19.828]   Containers:
I0911 06:49:19.828]    php-redis:
I0911 06:49:19.828]     Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 17 lines ...
I0911 06:49:19.943] Namespace:    namespace-1568184558-9196
I0911 06:49:19.943] Selector:     app=guestbook,tier=frontend
I0911 06:49:19.943] Labels:       app=guestbook
I0911 06:49:19.943]               tier=frontend
I0911 06:49:19.944] Annotations:  <none>
I0911 06:49:19.944] Replicas:     3 current / 3 desired
I0911 06:49:19.944] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0911 06:49:19.944] Pod Template:
I0911 06:49:19.944]   Labels:  app=guestbook
I0911 06:49:19.944]            tier=frontend
I0911 06:49:19.944]   Containers:
I0911 06:49:19.944]    php-redis:
I0911 06:49:19.944]     Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 11 lines ...
I0911 06:49:20.144] Namespace:    namespace-1568184558-9196
I0911 06:49:20.144] Selector:     app=guestbook,tier=frontend
I0911 06:49:20.144] Labels:       app=guestbook
I0911 06:49:20.145]               tier=frontend
I0911 06:49:20.145] Annotations:  <none>
I0911 06:49:20.146] Replicas:     3 current / 3 desired
I0911 06:49:20.146] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0911 06:49:20.146] Pod Template:
I0911 06:49:20.147]   Labels:  app=guestbook
I0911 06:49:20.147]            tier=frontend
I0911 06:49:20.147]   Containers:
I0911 06:49:20.147]    php-redis:
I0911 06:49:20.148]     Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 21 lines ...
I0911 06:49:20.846] replicationcontroller/frontend scaled
I0911 06:49:20.936] core.sh:1099: Successful get rc frontend {{.spec.replicas}}: 3
I0911 06:49:21.025] core.sh:1103: Successful get rc frontend {{.spec.replicas}}: 3
I0911 06:49:21.096] replicationcontroller/frontend scaled
I0911 06:49:21.185] core.sh:1107: Successful get rc frontend {{.spec.replicas}}: 2
I0911 06:49:21.268] replicationcontroller "frontend" deleted
W0911 06:49:21.368] E0911 06:49:19.969706   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:21.369] E0911 06:49:20.074339   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:21.369] I0911 06:49:20.360527   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1568184558-9196", Name:"frontend", UID:"46797ccf-ed70-4252-8ef7-eb00d0a0f906", APIVersion:"v1", ResourceVersion:"1569", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: frontend-lvshr
W0911 06:49:21.369] error: Expected replicas to be 3, was 2
W0911 06:49:21.370] I0911 06:49:20.849835   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1568184558-9196", Name:"frontend", UID:"46797ccf-ed70-4252-8ef7-eb00d0a0f906", APIVersion:"v1", ResourceVersion:"1575", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-qbxrg
W0911 06:49:21.370] E0911 06:49:20.971221   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:21.370] E0911 06:49:21.076331   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:21.370] I0911 06:49:21.102142   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1568184558-9196", Name:"frontend", UID:"46797ccf-ed70-4252-8ef7-eb00d0a0f906", APIVersion:"v1", ResourceVersion:"1580", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: frontend-qbxrg
W0911 06:49:21.421] I0911 06:49:21.420999   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1568184558-9196", Name:"redis-master", UID:"708b627e-3703-4431-a383-8280a1f0d043", APIVersion:"v1", ResourceVersion:"1591", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-master-zmgvp
I0911 06:49:21.522] replicationcontroller/redis-master created
I0911 06:49:21.583] replicationcontroller/redis-slave created
I0911 06:49:21.663] replicationcontroller/redis-master scaled
I0911 06:49:21.666] replicationcontroller/redis-slave scaled
... skipping 5 lines ...
W0911 06:49:21.997] I0911 06:49:21.587480   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1568184558-9196", Name:"redis-slave", UID:"1d212157-d1ac-4120-a0ad-677388f558e1", APIVersion:"v1", ResourceVersion:"1596", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-slave-vcwt4
W0911 06:49:21.998] I0911 06:49:21.666906   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1568184558-9196", Name:"redis-master", UID:"708b627e-3703-4431-a383-8280a1f0d043", APIVersion:"v1", ResourceVersion:"1603", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-master-949ks
W0911 06:49:21.998] I0911 06:49:21.669669   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1568184558-9196", Name:"redis-master", UID:"708b627e-3703-4431-a383-8280a1f0d043", APIVersion:"v1", ResourceVersion:"1603", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-master-45plj
W0911 06:49:21.999] I0911 06:49:21.670870   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1568184558-9196", Name:"redis-slave", UID:"1d212157-d1ac-4120-a0ad-677388f558e1", APIVersion:"v1", ResourceVersion:"1605", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-slave-8f92j
W0911 06:49:21.999] I0911 06:49:21.670948   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1568184558-9196", Name:"redis-master", UID:"708b627e-3703-4431-a383-8280a1f0d043", APIVersion:"v1", ResourceVersion:"1603", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-master-l65n6
W0911 06:49:21.999] I0911 06:49:21.672349   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1568184558-9196", Name:"redis-slave", UID:"1d212157-d1ac-4120-a0ad-677388f558e1", APIVersion:"v1", ResourceVersion:"1605", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-slave-hqcvv
W0911 06:49:21.999] E0911 06:49:21.977590   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:22.078] E0911 06:49:22.077730   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:22.087] I0911 06:49:22.087230   53012 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1568184558-9196", Name:"nginx-deployment", UID:"8c4b037f-a774-438a-8eb4-0608b1a6deae", APIVersion:"apps/v1", ResourceVersion:"1621", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-6986c7bc94 to 3
W0911 06:49:22.091] I0911 06:49:22.090534   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184558-9196", Name:"nginx-deployment-6986c7bc94", UID:"456023c5-5531-4267-8f37-9f978691443f", APIVersion:"apps/v1", ResourceVersion:"1622", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-2sdtf
W0911 06:49:22.094] I0911 06:49:22.093772   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184558-9196", Name:"nginx-deployment-6986c7bc94", UID:"456023c5-5531-4267-8f37-9f978691443f", APIVersion:"apps/v1", ResourceVersion:"1622", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-99dpt
W0911 06:49:22.094] I0911 06:49:22.093918   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184558-9196", Name:"nginx-deployment-6986c7bc94", UID:"456023c5-5531-4267-8f37-9f978691443f", APIVersion:"apps/v1", ResourceVersion:"1622", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-mpgjb
W0911 06:49:22.175] I0911 06:49:22.175180   53012 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1568184558-9196", Name:"nginx-deployment", UID:"8c4b037f-a774-438a-8eb4-0608b1a6deae", APIVersion:"apps/v1", ResourceVersion:"1651", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-6986c7bc94 to 1
W0911 06:49:22.182] I0911 06:49:22.181431   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184558-9196", Name:"nginx-deployment-6986c7bc94", UID:"456023c5-5531-4267-8f37-9f978691443f", APIVersion:"apps/v1", ResourceVersion:"1652", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-6986c7bc94-mpgjb
... skipping 4 lines ...
I0911 06:49:22.327] deployment.apps "nginx-deployment" deleted
I0911 06:49:22.418] Successful
I0911 06:49:22.418] message:service/expose-test-deployment exposed
I0911 06:49:22.419] has:service/expose-test-deployment exposed
I0911 06:49:22.490] service "expose-test-deployment" deleted
I0911 06:49:22.569] Successful
I0911 06:49:22.569] message:error: couldn't retrieve selectors via --selector flag or introspection: invalid deployment: no selectors, therefore cannot be exposed
I0911 06:49:22.569] See 'kubectl expose -h' for help and examples
I0911 06:49:22.570] has:invalid deployment: no selectors
I0911 06:49:22.698] deployment.apps/nginx-deployment created
I0911 06:49:22.786] core.sh:1146: Successful get deployment nginx-deployment {{.spec.replicas}}: 3
I0911 06:49:22.867] service/nginx-deployment exposed
I0911 06:49:22.953] core.sh:1150: Successful get service nginx-deployment {{(index .spec.ports 0).port}}: 80
I0911 06:49:23.029] deployment.apps "nginx-deployment" deleted
I0911 06:49:23.038] service "nginx-deployment" deleted
W0911 06:49:23.139] I0911 06:49:22.700963   53012 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1568184558-9196", Name:"nginx-deployment", UID:"d9941dd9-28d0-4f51-8e61-927f5d371c51", APIVersion:"apps/v1", ResourceVersion:"1676", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-6986c7bc94 to 3
W0911 06:49:23.140] I0911 06:49:22.704211   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184558-9196", Name:"nginx-deployment-6986c7bc94", UID:"b00dcca1-3753-47a9-9aea-0b2c25ea7757", APIVersion:"apps/v1", ResourceVersion:"1677", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-gdlbw
W0911 06:49:23.140] I0911 06:49:22.706249   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184558-9196", Name:"nginx-deployment-6986c7bc94", UID:"b00dcca1-3753-47a9-9aea-0b2c25ea7757", APIVersion:"apps/v1", ResourceVersion:"1677", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-htj4n
W0911 06:49:23.141] I0911 06:49:22.708261   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184558-9196", Name:"nginx-deployment-6986c7bc94", UID:"b00dcca1-3753-47a9-9aea-0b2c25ea7757", APIVersion:"apps/v1", ResourceVersion:"1677", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-mnvwj
W0911 06:49:23.141] E0911 06:49:22.981859   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:23.141] E0911 06:49:23.079454   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:23.178] I0911 06:49:23.177362   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1568184558-9196", Name:"frontend", UID:"367dde27-5b37-4f06-bf1b-62fabcc04ab2", APIVersion:"v1", ResourceVersion:"1704", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-dx67z
W0911 06:49:23.181] I0911 06:49:23.179715   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1568184558-9196", Name:"frontend", UID:"367dde27-5b37-4f06-bf1b-62fabcc04ab2", APIVersion:"v1", ResourceVersion:"1704", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-pnfd4
W0911 06:49:23.181] I0911 06:49:23.179747   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1568184558-9196", Name:"frontend", UID:"367dde27-5b37-4f06-bf1b-62fabcc04ab2", APIVersion:"v1", ResourceVersion:"1704", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-svz6j
I0911 06:49:23.282] replicationcontroller/frontend created
I0911 06:49:23.282] core.sh:1157: Successful get rc frontend {{.spec.replicas}}: 3
I0911 06:49:23.343] service/frontend exposed
... skipping 11 lines ...
I0911 06:49:24.420] service "frontend" deleted
I0911 06:49:24.426] service "frontend-2" deleted
I0911 06:49:24.431] service "frontend-3" deleted
I0911 06:49:24.444] service "frontend-4" deleted
I0911 06:49:24.451] service "frontend-5" deleted
I0911 06:49:24.539] Successful
I0911 06:49:24.540] message:error: cannot expose a Node
I0911 06:49:24.540] has:cannot expose
I0911 06:49:24.626] Successful
I0911 06:49:24.627] message:The Service "invalid-large-service-name-that-has-more-than-sixty-three-characters" is invalid: metadata.name: Invalid value: "invalid-large-service-name-that-has-more-than-sixty-three-characters": must be no more than 63 characters
I0911 06:49:24.627] has:metadata.name: Invalid value
I0911 06:49:24.709] Successful
I0911 06:49:24.709] message:service/kubernetes-serve-hostname-testing-sixty-three-characters-in-len exposed
... skipping 7 lines ...
I0911 06:49:25.114] service "etcd-server" deleted
I0911 06:49:25.210] core.sh:1215: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: frontend:
I0911 06:49:25.289] replicationcontroller "frontend" deleted
I0911 06:49:25.374] core.sh:1219: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
I0911 06:49:25.451] (Bcore.sh:1223: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
I0911 06:49:25.588] (Breplicationcontroller/frontend created
W0911 06:49:25.689] E0911 06:49:23.983055   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:25.690] E0911 06:49:24.081008   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:25.690] E0911 06:49:24.984182   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:25.690] E0911 06:49:25.082772   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:25.691] I0911 06:49:25.591619   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1568184558-9196", Name:"frontend", UID:"8d831ba0-09d4-4709-8a85-36c0f841b313", APIVersion:"v1", ResourceVersion:"1767", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-xwdtj
W0911 06:49:25.692] I0911 06:49:25.595406   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1568184558-9196", Name:"frontend", UID:"8d831ba0-09d4-4709-8a85-36c0f841b313", APIVersion:"v1", ResourceVersion:"1767", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-7fx6z
W0911 06:49:25.692] I0911 06:49:25.597510   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1568184558-9196", Name:"frontend", UID:"8d831ba0-09d4-4709-8a85-36c0f841b313", APIVersion:"v1", ResourceVersion:"1767", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-57zkp
W0911 06:49:25.768] I0911 06:49:25.767645   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1568184558-9196", Name:"redis-slave", UID:"67dcecf2-2197-471a-ade8-f3763e01aec1", APIVersion:"v1", ResourceVersion:"1776", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-slave-smgtd
W0911 06:49:25.770] I0911 06:49:25.770320   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1568184558-9196", Name:"redis-slave", UID:"67dcecf2-2197-471a-ade8-f3763e01aec1", APIVersion:"v1", ResourceVersion:"1776", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-slave-mdtxc
I0911 06:49:25.871] replicationcontroller/redis-slave created
... skipping 8 lines ...
I0911 06:49:26.522] horizontalpodautoscaler.autoscaling/frontend autoscaled
I0911 06:49:26.604] core.sh:1246: Successful get hpa frontend {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 1 2 70
I0911 06:49:26.676] horizontalpodautoscaler.autoscaling "frontend" deleted
I0911 06:49:26.757] horizontalpodautoscaler.autoscaling/frontend autoscaled
I0911 06:49:26.842] core.sh:1250: Successful get hpa frontend {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 2 3 80
I0911 06:49:26.914] horizontalpodautoscaler.autoscaling "frontend" deleted
W0911 06:49:27.015] E0911 06:49:25.985451   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:27.016] E0911 06:49:26.083964   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:27.016] I0911 06:49:26.352004   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1568184558-9196", Name:"frontend", UID:"e614057f-58d1-462a-b57b-829eaf20f79e", APIVersion:"v1", ResourceVersion:"1796", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-5rxxf
W0911 06:49:27.016] I0911 06:49:26.354837   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1568184558-9196", Name:"frontend", UID:"e614057f-58d1-462a-b57b-829eaf20f79e", APIVersion:"v1", ResourceVersion:"1796", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-c2zr2
W0911 06:49:27.017] I0911 06:49:26.355746   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1568184558-9196", Name:"frontend", UID:"e614057f-58d1-462a-b57b-829eaf20f79e", APIVersion:"v1", ResourceVersion:"1796", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-ltrg9
W0911 06:49:27.017] Error: required flag(s) "max" not set
W0911 06:49:27.017] 
W0911 06:49:27.017] 
W0911 06:49:27.018] Examples:
W0911 06:49:27.018]   # Auto scale a deployment "foo", with the number of pods between 2 and 10, no target CPU utilization specified so a default autoscaling policy will be used:
W0911 06:49:27.018]   kubectl autoscale deployment foo --min=2 --max=10
W0911 06:49:27.018]   
... skipping 18 lines ...
W0911 06:49:27.025] 
W0911 06:49:27.025] Usage:
W0911 06:49:27.025]   kubectl autoscale (-f FILENAME | TYPE NAME | TYPE/NAME) [--min=MINPODS] --max=MAXPODS [--cpu-percent=CPU] [options]
W0911 06:49:27.026] 
W0911 06:49:27.026] Use "kubectl options" for a list of global command-line options (applies to all commands).
W0911 06:49:27.026] 
W0911 06:49:27.026] E0911 06:49:26.986887   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:27.086] E0911 06:49:27.085472   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0911 06:49:27.186] replicationcontroller "frontend" deleted
I0911 06:49:27.190] core.sh:1259: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
I0911 06:49:27.267] apiVersion: apps/v1
I0911 06:49:27.268] kind: Deployment
I0911 06:49:27.268] metadata:
I0911 06:49:27.268]   creationTimestamp: null
... skipping 24 lines ...
I0911 06:49:27.270]           limits:
I0911 06:49:27.270]             cpu: 300m
I0911 06:49:27.270]           requests:
I0911 06:49:27.270]             cpu: 300m
I0911 06:49:27.270]       terminationGracePeriodSeconds: 0
I0911 06:49:27.270] status: {}
W0911 06:49:27.371] Error from server (NotFound): deployments.apps "nginx-deployment-resources" not found
I0911 06:49:27.480] deployment.apps/nginx-deployment-resources created
I0911 06:49:27.575] core.sh:1265: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx-deployment-resources:
I0911 06:49:27.654] (Bcore.sh:1266: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
I0911 06:49:27.729] (Bcore.sh:1267: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
I0911 06:49:27.814] deployment.apps/nginx-deployment-resources resource requirements updated
I0911 06:49:27.902] core.sh:1270: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).resources.limits.cpu}}:{{end}}: 100m:
... skipping 85 lines ...
W0911 06:49:28.923] I0911 06:49:27.483362   53012 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1568184558-9196", Name:"nginx-deployment-resources", UID:"0907d555-57a3-41cf-b084-15f73abb1638", APIVersion:"apps/v1", ResourceVersion:"1816", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-resources-67f8cfff5 to 3
W0911 06:49:28.924] I0911 06:49:27.486260   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184558-9196", Name:"nginx-deployment-resources-67f8cfff5", UID:"2c207434-d051-49cc-928c-8030b9beb9de", APIVersion:"apps/v1", ResourceVersion:"1817", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-67f8cfff5-wgfjk
W0911 06:49:28.925] I0911 06:49:27.488268   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184558-9196", Name:"nginx-deployment-resources-67f8cfff5", UID:"2c207434-d051-49cc-928c-8030b9beb9de", APIVersion:"apps/v1", ResourceVersion:"1817", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-67f8cfff5-pcrmq
W0911 06:49:28.926] I0911 06:49:27.490512   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184558-9196", Name:"nginx-deployment-resources-67f8cfff5", UID:"2c207434-d051-49cc-928c-8030b9beb9de", APIVersion:"apps/v1", ResourceVersion:"1817", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-67f8cfff5-t5p72
W0911 06:49:28.926] I0911 06:49:27.817889   53012 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1568184558-9196", Name:"nginx-deployment-resources", UID:"0907d555-57a3-41cf-b084-15f73abb1638", APIVersion:"apps/v1", ResourceVersion:"1830", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-resources-55c547f795 to 1
W0911 06:49:28.927] I0911 06:49:27.820601   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184558-9196", Name:"nginx-deployment-resources-55c547f795", UID:"3acfb7f1-15cc-4e6b-bbb3-a4b666565f3c", APIVersion:"apps/v1", ResourceVersion:"1831", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-55c547f795-qmjkz
W0911 06:49:28.927] E0911 06:49:27.988477   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:28.928] error: unable to find container named redis
W0911 06:49:28.928] E0911 06:49:28.086900   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:28.929] I0911 06:49:28.184401   53012 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1568184558-9196", Name:"nginx-deployment-resources", UID:"0907d555-57a3-41cf-b084-15f73abb1638", APIVersion:"apps/v1", ResourceVersion:"1840", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-resources-55c547f795 to 0
W0911 06:49:28.930] I0911 06:49:28.190234   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184558-9196", Name:"nginx-deployment-resources-55c547f795", UID:"3acfb7f1-15cc-4e6b-bbb3-a4b666565f3c", APIVersion:"apps/v1", ResourceVersion:"1844", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-resources-55c547f795-qmjkz
W0911 06:49:28.930] I0911 06:49:28.192577   53012 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1568184558-9196", Name:"nginx-deployment-resources", UID:"0907d555-57a3-41cf-b084-15f73abb1638", APIVersion:"apps/v1", ResourceVersion:"1843", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-resources-6d86564b45 to 1
W0911 06:49:28.930] I0911 06:49:28.197250   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184558-9196", Name:"nginx-deployment-resources-6d86564b45", UID:"b17369f0-ab30-40d4-9028-49c19db4f7f5", APIVersion:"apps/v1", ResourceVersion:"1848", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-6d86564b45-m8xp5
W0911 06:49:28.931] I0911 06:49:28.460812   53012 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1568184558-9196", Name:"nginx-deployment-resources", UID:"0907d555-57a3-41cf-b084-15f73abb1638", APIVersion:"apps/v1", ResourceVersion:"1861", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-resources-67f8cfff5 to 2
W0911 06:49:28.931] I0911 06:49:28.464739   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184558-9196", Name:"nginx-deployment-resources-67f8cfff5", UID:"2c207434-d051-49cc-928c-8030b9beb9de", APIVersion:"apps/v1", ResourceVersion:"1865", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-resources-67f8cfff5-t5p72
W0911 06:49:28.932] I0911 06:49:28.466476   53012 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1568184558-9196", Name:"nginx-deployment-resources", UID:"0907d555-57a3-41cf-b084-15f73abb1638", APIVersion:"apps/v1", ResourceVersion:"1863", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-resources-6c478d4fdb to 1
W0911 06:49:28.932] I0911 06:49:28.468731   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184558-9196", Name:"nginx-deployment-resources-6c478d4fdb", UID:"236821bc-5170-4c4e-be0b-45fd1759c46d", APIVersion:"apps/v1", ResourceVersion:"1869", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-6c478d4fdb-zc7t7
W0911 06:49:28.932] error: you must specify resources by --filename when --local is set.
W0911 06:49:28.932] Example resource specifications include:
W0911 06:49:28.933]    '-f rsrc.yaml'
W0911 06:49:28.933]    '--filename=rsrc.json'
W0911 06:49:28.991] E0911 06:49:28.990515   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:29.089] E0911 06:49:29.088276   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0911 06:49:29.189] core.sh:1286: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).resources.limits.cpu}}:{{end}}: 200m:
I0911 06:49:29.190] core.sh:1287: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.limits.cpu}}:{{end}}: 300m:
I0911 06:49:29.190] core.sh:1288: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.requests.cpu}}:{{end}}: 300m:
I0911 06:49:29.224] deployment.apps "nginx-deployment-resources" deleted
I0911 06:49:29.244] +++ exit code: 0
I0911 06:49:29.276] Recording: run_deployment_tests
... skipping 24 lines ...
I0911 06:49:30.163] message:apps/v1
I0911 06:49:30.163] has:apps/v1
W0911 06:49:30.263] I0911 06:49:29.530531   53012 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1568184569-16670", Name:"test-nginx-extensions", UID:"937caad8-d624-430a-862a-d7f88a95f55b", APIVersion:"apps/v1", ResourceVersion:"1897", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set test-nginx-extensions-5559c76db7 to 1
W0911 06:49:30.264] I0911 06:49:29.534361   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184569-16670", Name:"test-nginx-extensions-5559c76db7", UID:"3e7d6ec0-522d-4320-b9ef-7e90990b6341", APIVersion:"apps/v1", ResourceVersion:"1898", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test-nginx-extensions-5559c76db7-25x6b
W0911 06:49:30.265] I0911 06:49:29.931466   53012 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1568184569-16670", Name:"test-nginx-apps", UID:"20f00a31-4991-49be-a5e2-391c517efeaa", APIVersion:"apps/v1", ResourceVersion:"1911", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set test-nginx-apps-79b9bd9585 to 1
W0911 06:49:30.265] I0911 06:49:29.933726   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184569-16670", Name:"test-nginx-apps-79b9bd9585", UID:"5a6598ac-2639-4a52-b5d7-a8cbcfb9b3ce", APIVersion:"apps/v1", ResourceVersion:"1912", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test-nginx-apps-79b9bd9585-fhlzj
W0911 06:49:30.266] E0911 06:49:29.991786   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:30.266] E0911 06:49:30.089498   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0911 06:49:30.366] Successful describe rs:
I0911 06:49:30.367] Name:           test-nginx-apps-79b9bd9585
I0911 06:49:30.367] Namespace:      namespace-1568184569-16670
I0911 06:49:30.367] Selector:       app=test-nginx-apps,pod-template-hash=79b9bd9585
I0911 06:49:30.367] Labels:         app=test-nginx-apps
I0911 06:49:30.367]                 pod-template-hash=79b9bd9585
I0911 06:49:30.368] Annotations:    deployment.kubernetes.io/desired-replicas: 1
I0911 06:49:30.368]                 deployment.kubernetes.io/max-replicas: 2
I0911 06:49:30.368]                 deployment.kubernetes.io/revision: 1
I0911 06:49:30.368] Controlled By:  Deployment/test-nginx-apps
I0911 06:49:30.368] Replicas:       1 current / 1 desired
I0911 06:49:30.368] Pods Status:    0 Running / 1 Waiting / 0 Succeeded / 0 Failed
I0911 06:49:30.369] Pod Template:
I0911 06:49:30.369]   Labels:  app=test-nginx-apps
I0911 06:49:30.369]            pod-template-hash=79b9bd9585
I0911 06:49:30.369]   Containers:
I0911 06:49:30.369]    nginx:
I0911 06:49:30.369]     Image:        k8s.gcr.io/nginx:test-cmd
... skipping 42 lines ...
I0911 06:49:31.350] apps.sh:235: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
I0911 06:49:31.485] deployment.apps/nginx-deployment created
I0911 06:49:31.580] apps.sh:239: Successful get rs {{range.items}}{{.spec.replicas}}{{end}}: 3
I0911 06:49:31.672] deployment.apps "nginx-deployment" deleted
W0911 06:49:31.772] I0911 06:49:30.706511   53012 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1568184569-16670", Name:"nginx-with-command", UID:"58d64951-d1c5-4fa0-a836-d8efff251d39", APIVersion:"apps/v1", ResourceVersion:"1926", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-with-command-757c6f58dd to 1
W0911 06:49:31.773] I0911 06:49:30.709048   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184569-16670", Name:"nginx-with-command-757c6f58dd", UID:"cf3d5b4f-172c-4ed4-abc6-cc9a604c8a68", APIVersion:"apps/v1", ResourceVersion:"1927", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-with-command-757c6f58dd-zl2fl
W0911 06:49:31.773] E0911 06:49:30.993458   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:31.774] E0911 06:49:31.090855   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:31.774] I0911 06:49:31.107899   53012 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1568184569-16670", Name:"deployment-with-unixuserid", UID:"9523856c-febd-430c-a944-8e0981fc7e47", APIVersion:"apps/v1", ResourceVersion:"1940", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set deployment-with-unixuserid-8fcdfc94f to 1
W0911 06:49:31.775] I0911 06:49:31.111214   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184569-16670", Name:"deployment-with-unixuserid-8fcdfc94f", UID:"4444143d-7df5-4402-8e7e-a579f5f934c4", APIVersion:"apps/v1", ResourceVersion:"1941", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: deployment-with-unixuserid-8fcdfc94f-tnmsr
W0911 06:49:31.775] I0911 06:49:31.487851   53012 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1568184569-16670", Name:"nginx-deployment", UID:"b1127edc-fb7f-4cb7-8585-aa4e091c72bd", APIVersion:"apps/v1", ResourceVersion:"1954", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-6986c7bc94 to 3
W0911 06:49:31.775] I0911 06:49:31.490314   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184569-16670", Name:"nginx-deployment-6986c7bc94", UID:"1a56dd98-7654-49a4-8d5f-e581d12af15c", APIVersion:"apps/v1", ResourceVersion:"1955", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-2wzdb
W0911 06:49:31.776] I0911 06:49:31.493107   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184569-16670", Name:"nginx-deployment-6986c7bc94", UID:"1a56dd98-7654-49a4-8d5f-e581d12af15c", APIVersion:"apps/v1", ResourceVersion:"1955", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-7jscz
W0911 06:49:31.776] I0911 06:49:31.493350   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184569-16670", Name:"nginx-deployment-6986c7bc94", UID:"1a56dd98-7654-49a4-8d5f-e581d12af15c", APIVersion:"apps/v1", ResourceVersion:"1955", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-jx9t6
I0911 06:49:31.876] apps.sh:242: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
I0911 06:49:31.877] apps.sh:246: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
I0911 06:49:31.944] apps.sh:247: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
I0911 06:49:32.014] (Bdeployment.apps/nginx-deployment created
I0911 06:49:32.105] apps.sh:251: Successful get rs {{range.items}}{{.spec.replicas}}{{end}}: 1
I0911 06:49:32.181] deployment.apps "nginx-deployment" deleted
W0911 06:49:32.282] E0911 06:49:31.994860   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:32.283] I0911 06:49:32.017951   53012 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1568184569-16670", Name:"nginx-deployment", UID:"6af24580-5445-45b6-9f79-faaf52bbfd54", APIVersion:"apps/v1", ResourceVersion:"1976", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-7f6fc565b9 to 1
W0911 06:49:32.283] I0911 06:49:32.021829   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184569-16670", Name:"nginx-deployment-7f6fc565b9", UID:"238ba043-42fb-4047-89fb-5e31944d4308", APIVersion:"apps/v1", ResourceVersion:"1977", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-7f6fc565b9-9pgv5
W0911 06:49:32.284] E0911 06:49:32.092058   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0911 06:49:32.384] apps.sh:256: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
I0911 06:49:32.426] apps.sh:257: Successful get rs {{range.items}}{{.spec.replicas}}{{end}}: 1
I0911 06:49:32.584] replicaset.apps "nginx-deployment-7f6fc565b9" deleted
I0911 06:49:32.671] apps.sh:265: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
I0911 06:49:32.806] (Bdeployment.apps/nginx-deployment created
I0911 06:49:32.896] apps.sh:268: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx-deployment:
... skipping 13 lines ...
I0911 06:49:34.194] apps.sh:293: Successful get deployment.apps {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
I0911 06:49:34.313] deployment.apps/nginx rolled back
W0911 06:49:34.414] I0911 06:49:32.809177   53012 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1568184569-16670", Name:"nginx-deployment", UID:"ba135286-e9ee-4da9-99ef-e5580c05c4fe", APIVersion:"apps/v1", ResourceVersion:"1995", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-6986c7bc94 to 3
W0911 06:49:34.415] I0911 06:49:32.813004   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184569-16670", Name:"nginx-deployment-6986c7bc94", UID:"47781cd5-dc91-4577-9d20-83889964631b", APIVersion:"apps/v1", ResourceVersion:"1996", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-r29c9
W0911 06:49:34.415] I0911 06:49:32.815663   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184569-16670", Name:"nginx-deployment-6986c7bc94", UID:"47781cd5-dc91-4577-9d20-83889964631b", APIVersion:"apps/v1", ResourceVersion:"1996", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-t5swj
W0911 06:49:34.416] I0911 06:49:32.815948   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184569-16670", Name:"nginx-deployment-6986c7bc94", UID:"47781cd5-dc91-4577-9d20-83889964631b", APIVersion:"apps/v1", ResourceVersion:"1996", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-d99w6
W0911 06:49:34.416] E0911 06:49:32.996417   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:34.416] E0911 06:49:33.093281   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:34.417] I0911 06:49:33.439972   53012 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1568184569-16670", Name:"nginx", UID:"6cc2dc7c-421b-43e7-89be-5284ec55b572", APIVersion:"apps/v1", ResourceVersion:"2019", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-f87d999f7 to 3
W0911 06:49:34.417] I0911 06:49:33.442846   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184569-16670", Name:"nginx-f87d999f7", UID:"d558e7dd-15ba-49c4-b585-6d7586e24db4", APIVersion:"apps/v1", ResourceVersion:"2020", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-f87d999f7-v67rx
W0911 06:49:34.417] I0911 06:49:33.445639   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184569-16670", Name:"nginx-f87d999f7", UID:"d558e7dd-15ba-49c4-b585-6d7586e24db4", APIVersion:"apps/v1", ResourceVersion:"2020", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-f87d999f7-6wvfh
W0911 06:49:34.418] I0911 06:49:33.446001   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184569-16670", Name:"nginx-f87d999f7", UID:"d558e7dd-15ba-49c4-b585-6d7586e24db4", APIVersion:"apps/v1", ResourceVersion:"2020", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-f87d999f7-p2ndr
W0911 06:49:34.418] Warning: kubectl apply should be used on resource created by either kubectl create --save-config or kubectl apply
W0911 06:49:34.418] I0911 06:49:33.916901   53012 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1568184569-16670", Name:"nginx", UID:"6cc2dc7c-421b-43e7-89be-5284ec55b572", APIVersion:"apps/v1", ResourceVersion:"2033", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-78487f9fd7 to 1
W0911 06:49:34.419] I0911 06:49:33.918677   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184569-16670", Name:"nginx-78487f9fd7", UID:"2614098a-c049-4e4f-b78d-31cf181e87c6", APIVersion:"apps/v1", ResourceVersion:"2034", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-78487f9fd7-jm6dj
W0911 06:49:34.419] E0911 06:49:33.997654   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:34.419] E0911 06:49:34.094587   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:35.000] E0911 06:49:34.999398   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:35.096] E0911 06:49:35.095831   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0911 06:49:35.398] apps.sh:297: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
I0911 06:49:35.580] apps.sh:300: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
I0911 06:49:35.664] deployment.apps/nginx rolled back
W0911 06:49:35.765] error: unable to find specified revision 1000000 in history
W0911 06:49:36.003] E0911 06:49:36.002070   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:36.097] E0911 06:49:36.097204   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0911 06:49:36.760] apps.sh:304: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
I0911 06:49:36.852] deployment.apps/nginx paused
W0911 06:49:36.953] error: you cannot rollback a paused deployment; resume it first with 'kubectl rollout resume deployment/nginx' and try again
W0911 06:49:37.005] E0911 06:49:37.004543   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:37.019] error: deployments.apps "nginx" can't restart paused deployment (run rollout resume first)
W0911 06:49:37.099] E0911 06:49:37.098590   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0911 06:49:37.199] deployment.apps/nginx resumed
I0911 06:49:37.214] deployment.apps/nginx rolled back
I0911 06:49:37.389]     deployment.kubernetes.io/revision-history: 1,3
W0911 06:49:37.570] error: desired revision (3) is different from the running revision (5)
I0911 06:49:37.670] deployment.apps/nginx restarted
W0911 06:49:37.772] I0911 06:49:37.689338   53012 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1568184569-16670", Name:"nginx", UID:"6cc2dc7c-421b-43e7-89be-5284ec55b572", APIVersion:"apps/v1", ResourceVersion:"2064", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-78487f9fd7 to 0
W0911 06:49:37.773] I0911 06:49:37.697691   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184569-16670", Name:"nginx-78487f9fd7", UID:"2614098a-c049-4e4f-b78d-31cf181e87c6", APIVersion:"apps/v1", ResourceVersion:"2068", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-78487f9fd7-jm6dj
W0911 06:49:37.773] I0911 06:49:37.701938   53012 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1568184569-16670", Name:"nginx", UID:"6cc2dc7c-421b-43e7-89be-5284ec55b572", APIVersion:"apps/v1", ResourceVersion:"2067", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-5fc89989db to 1
W0911 06:49:37.773] I0911 06:49:37.707774   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184569-16670", Name:"nginx-5fc89989db", UID:"e8b515a9-1d60-416c-aa9d-ab34174ab8dd", APIVersion:"apps/v1", ResourceVersion:"2072", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-5fc89989db-htr7j
W0911 06:49:38.006] E0911 06:49:38.005871   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:38.100] E0911 06:49:38.099830   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0911 06:49:38.871] Successful
I0911 06:49:38.871] message:apiVersion: apps/v1
I0911 06:49:38.872] kind: ReplicaSet
I0911 06:49:38.872] metadata:
I0911 06:49:38.872]   annotations:
I0911 06:49:38.872]     deployment.kubernetes.io/desired-replicas: "3"
... skipping 48 lines ...
I0911 06:49:38.876]       terminationGracePeriodSeconds: 30
I0911 06:49:38.876] status:
I0911 06:49:38.876]   fullyLabeledReplicas: 1
I0911 06:49:38.876]   observedGeneration: 2
I0911 06:49:38.876]   replicas: 1
I0911 06:49:38.876] has:deployment.kubernetes.io/revision: "6"
W0911 06:49:39.007] E0911 06:49:39.007064   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:39.022] I0911 06:49:39.022113   53012 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1568184569-16670", Name:"nginx2", UID:"a8b26d32-b891-4466-95e7-9539a402f189", APIVersion:"apps/v1", ResourceVersion:"2085", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx2-57b7865cd9 to 3
W0911 06:49:39.024] I0911 06:49:39.024099   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184569-16670", Name:"nginx2-57b7865cd9", UID:"780d8c1a-a7d4-4d77-91b3-15439c75acd7", APIVersion:"apps/v1", ResourceVersion:"2086", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx2-57b7865cd9-c9tj4
W0911 06:49:39.026] I0911 06:49:39.025865   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184569-16670", Name:"nginx2-57b7865cd9", UID:"780d8c1a-a7d4-4d77-91b3-15439c75acd7", APIVersion:"apps/v1", ResourceVersion:"2086", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx2-57b7865cd9-6jlfl
W0911 06:49:39.027] I0911 06:49:39.026240   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184569-16670", Name:"nginx2-57b7865cd9", UID:"780d8c1a-a7d4-4d77-91b3-15439c75acd7", APIVersion:"apps/v1", ResourceVersion:"2086", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx2-57b7865cd9-9pf9v
W0911 06:49:39.102] E0911 06:49:39.101833   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0911 06:49:39.203] deployment.apps/nginx2 created
I0911 06:49:39.203] deployment.apps "nginx2" deleted
I0911 06:49:39.203] deployment.apps "nginx" deleted
I0911 06:49:39.294] apps.sh:334: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
I0911 06:49:39.473] deployment.apps/nginx-deployment created
W0911 06:49:39.574] I0911 06:49:39.476456   53012 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1568184569-16670", Name:"nginx-deployment", UID:"f66671b6-b9d4-4985-80a8-6fb454841a09", APIVersion:"apps/v1", ResourceVersion:"2119", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-598d4d68b4 to 3
... skipping 21 lines ...
I0911 06:49:41.439] apps.sh:365: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
I0911 06:49:41.508] (Bdeployment.apps "nginx-deployment" deleted
I0911 06:49:41.599] apps.sh:371: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
I0911 06:49:41.733] deployment.apps/nginx-deployment created
W0911 06:49:41.834] I0911 06:49:39.850995   53012 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1568184569-16670", Name:"nginx-deployment", UID:"f66671b6-b9d4-4985-80a8-6fb454841a09", APIVersion:"apps/v1", ResourceVersion:"2133", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-59df9b5f5b to 1
W0911 06:49:41.835] I0911 06:49:39.853828   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184569-16670", Name:"nginx-deployment-59df9b5f5b", UID:"e1121b0c-33f8-47b3-8b0b-df4673437194", APIVersion:"apps/v1", ResourceVersion:"2134", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-59df9b5f5b-4ddgf
W0911 06:49:41.836] E0911 06:49:40.008331   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:41.836] E0911 06:49:40.103778   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:41.836] error: unable to find container named "redis"
W0911 06:49:41.837] I0911 06:49:40.967227   53012 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1568184569-16670", Name:"nginx-deployment", UID:"f66671b6-b9d4-4985-80a8-6fb454841a09", APIVersion:"apps/v1", ResourceVersion:"2152", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-598d4d68b4 to 2
W0911 06:49:41.837] I0911 06:49:40.972664   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184569-16670", Name:"nginx-deployment-598d4d68b4", UID:"eeadd3ff-3156-4763-ba4c-3611a1fadc1e", APIVersion:"apps/v1", ResourceVersion:"2156", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-598d4d68b4-jdcdb
W0911 06:49:41.838] I0911 06:49:40.973443   53012 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1568184569-16670", Name:"nginx-deployment", UID:"f66671b6-b9d4-4985-80a8-6fb454841a09", APIVersion:"apps/v1", ResourceVersion:"2155", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-7d758dbc54 to 1
W0911 06:49:41.838] I0911 06:49:40.980697   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184569-16670", Name:"nginx-deployment-7d758dbc54", UID:"58416b61-6b5c-44e3-8b3b-142c20bf915f", APIVersion:"apps/v1", ResourceVersion:"2160", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-7d758dbc54-bcfkv
W0911 06:49:41.839] E0911 06:49:41.009574   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:41.839] E0911 06:49:41.107859   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:41.839] I0911 06:49:41.521419   53012 horizontal.go:341] Horizontal Pod Autoscaler frontend has been deleted in namespace-1568184558-9196
W0911 06:49:41.840] I0911 06:49:41.737395   53012 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1568184569-16670", Name:"nginx-deployment", UID:"d8e533f3-0309-457d-8fa5-33d821ac0e46", APIVersion:"apps/v1", ResourceVersion:"2185", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-598d4d68b4 to 3
W0911 06:49:41.840] I0911 06:49:41.740894   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184569-16670", Name:"nginx-deployment-598d4d68b4", UID:"b800d9cc-b898-4dc0-91b5-3e325ede937c", APIVersion:"apps/v1", ResourceVersion:"2186", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-598d4d68b4-j4nhh
W0911 06:49:41.841] I0911 06:49:41.742880   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184569-16670", Name:"nginx-deployment-598d4d68b4", UID:"b800d9cc-b898-4dc0-91b5-3e325ede937c", APIVersion:"apps/v1", ResourceVersion:"2186", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-598d4d68b4-b8xpg
W0911 06:49:41.841] I0911 06:49:41.743772   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184569-16670", Name:"nginx-deployment-598d4d68b4", UID:"b800d9cc-b898-4dc0-91b5-3e325ede937c", APIVersion:"apps/v1", ResourceVersion:"2186", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-598d4d68b4-6mld7
I0911 06:49:41.942] configmap/test-set-env-config created
... skipping 2 lines ...
I0911 06:49:42.206] apps.sh:378: Successful get configmaps/test-set-env-config {{.metadata.name}}: test-set-env-config
I0911 06:49:42.289] apps.sh:379: Successful get secret {{range.items}}{{.metadata.name}}:{{end}}: test-set-env-secret:
I0911 06:49:42.389] (Bdeployment.apps/nginx-deployment env updated
I0911 06:49:42.485] apps.sh:383: Successful get deploy nginx-deployment {{ (index (index .spec.template.spec.containers 0).env 0).name}}: KEY_2
I0911 06:49:42.573] apps.sh:385: Successful get deploy nginx-deployment {{ len (index .spec.template.spec.containers 0).env }}: 1
I0911 06:49:42.672] (Bdeployment.apps/nginx-deployment env updated
W0911 06:49:42.773] E0911 06:49:42.010918   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:42.774] E0911 06:49:42.109152   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:42.774] I0911 06:49:42.392955   53012 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1568184569-16670", Name:"nginx-deployment", UID:"d8e533f3-0309-457d-8fa5-33d821ac0e46", APIVersion:"apps/v1", ResourceVersion:"2202", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-6b9f7756b4 to 1
W0911 06:49:42.775] I0911 06:49:42.397648   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184569-16670", Name:"nginx-deployment-6b9f7756b4", UID:"f353e90d-61a1-4bea-b141-1f11a0a04d60", APIVersion:"apps/v1", ResourceVersion:"2203", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6b9f7756b4-b8qh5
W0911 06:49:42.775] I0911 06:49:42.682162   53012 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1568184569-16670", Name:"nginx-deployment", UID:"d8e533f3-0309-457d-8fa5-33d821ac0e46", APIVersion:"apps/v1", ResourceVersion:"2212", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-598d4d68b4 to 2
W0911 06:49:42.776] I0911 06:49:42.686538   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184569-16670", Name:"nginx-deployment-598d4d68b4", UID:"b800d9cc-b898-4dc0-91b5-3e325ede937c", APIVersion:"apps/v1", ResourceVersion:"2216", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-598d4d68b4-b8xpg
W0911 06:49:42.776] I0911 06:49:42.688666   53012 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1568184569-16670", Name:"nginx-deployment", UID:"d8e533f3-0309-457d-8fa5-33d821ac0e46", APIVersion:"apps/v1", ResourceVersion:"2214", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-754bf964c8 to 1
W0911 06:49:42.777] I0911 06:49:42.695670   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184569-16670", Name:"nginx-deployment-754bf964c8", UID:"8c7f0322-dd57-43f3-aae0-86ea80143b5f", APIVersion:"apps/v1", ResourceVersion:"2220", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-754bf964c8-qd4bt
... skipping 5 lines ...
W0911 06:49:43.085] I0911 06:49:42.908219   53012 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1568184569-16670", Name:"nginx-deployment", UID:"d8e533f3-0309-457d-8fa5-33d821ac0e46", APIVersion:"apps/v1", ResourceVersion:"2236", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-c6d5c5c7b to 1
W0911 06:49:43.085] I0911 06:49:42.911717   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184569-16670", Name:"nginx-deployment-c6d5c5c7b", UID:"41e2bae9-ae62-48c5-adb8-6560afef7ce8", APIVersion:"apps/v1", ResourceVersion:"2240", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-c6d5c5c7b-fk7vc
W0911 06:49:43.086] I0911 06:49:42.992456   53012 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1568184569-16670", Name:"nginx-deployment", UID:"d8e533f3-0309-457d-8fa5-33d821ac0e46", APIVersion:"apps/v1", ResourceVersion:"2254", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-598d4d68b4 to 0
W0911 06:49:43.086] I0911 06:49:42.998132   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184569-16670", Name:"nginx-deployment-598d4d68b4", UID:"b800d9cc-b898-4dc0-91b5-3e325ede937c", APIVersion:"apps/v1", ResourceVersion:"2258", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-598d4d68b4-6mld7
W0911 06:49:43.087] I0911 06:49:42.999462   53012 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1568184569-16670", Name:"nginx-deployment", UID:"d8e533f3-0309-457d-8fa5-33d821ac0e46", APIVersion:"apps/v1", ResourceVersion:"2257", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-5958f7687 to 1
W0911 06:49:43.088] I0911 06:49:43.002541   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184569-16670", Name:"nginx-deployment-5958f7687", UID:"64d0e557-afc1-4de8-816e-e17ba84267df", APIVersion:"apps/v1", ResourceVersion:"2262", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-5958f7687-kshgn
W0911 06:49:43.088] E0911 06:49:43.011685   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:43.096] I0911 06:49:43.095654   53012 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1568184569-16670", Name:"nginx-deployment", UID:"d8e533f3-0309-457d-8fa5-33d821ac0e46", APIVersion:"apps/v1", ResourceVersion:"2275", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-5958f7687 to 0
W0911 06:49:43.111] E0911 06:49:43.110848   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0911 06:49:43.212] deployment.apps/nginx-deployment env updated
I0911 06:49:43.213] deployment.apps/nginx-deployment env updated
I0911 06:49:43.266] deployment.apps/nginx-deployment env updated
I0911 06:49:43.346] deployment.apps "nginx-deployment" deleted
I0911 06:49:43.434] configmap "test-set-env-config" deleted
I0911 06:49:43.520] secret "test-set-env-secret" deleted
... skipping 17 lines ...
I0911 06:49:44.367] replicaset.apps/frontend-no-cascade created
I0911 06:49:44.465] apps.sh:527: Successful get pods -l "tier=frontend" {{range.items}}{{(index .spec.containers 0).name}}:{{end}}: php-redis:php-redis:php-redis:
I0911 06:49:44.467] +++ [0911 06:49:44] Deleting rs
W0911 06:49:44.567] I0911 06:49:43.293955   53012 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1568184569-16670", Name:"nginx-deployment", UID:"d8e533f3-0309-457d-8fa5-33d821ac0e46", APIVersion:"apps/v1", ResourceVersion:"2280", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-d74969475 to 1
W0911 06:49:44.568] I0911 06:49:43.333595   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184569-16670", Name:"nginx-deployment-5958f7687", UID:"64d0e557-afc1-4de8-816e-e17ba84267df", APIVersion:"apps/v1", ResourceVersion:"2278", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-5958f7687-kshgn
W0911 06:49:44.568] I0911 06:49:43.380930   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184569-16670", Name:"nginx-deployment-d74969475", UID:"fe98f55c-f0f4-4742-b358-1a6f9a3fc80e", APIVersion:"apps/v1", ResourceVersion:"2283", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-d74969475-dgp7r
W0911 06:49:44.569] E0911 06:49:43.578791   53012 replica_set.go:450] Sync "namespace-1568184569-16670/nginx-deployment-5958f7687" failed with replicasets.apps "nginx-deployment-5958f7687" not found
W0911 06:49:44.569] E0911 06:49:43.728972   53012 replica_set.go:450] Sync "namespace-1568184569-16670/nginx-deployment-d74969475" failed with replicasets.apps "nginx-deployment-d74969475" not found
W0911 06:49:44.569] E0911 06:49:43.778753   53012 replica_set.go:450] Sync "namespace-1568184569-16670/nginx-deployment-868b664cb5" failed with replicasets.apps "nginx-deployment-868b664cb5" not found
W0911 06:49:44.569] I0911 06:49:43.989125   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184583-28530", Name:"frontend", UID:"3561a055-4694-43ef-9b14-d0feeb88d288", APIVersion:"apps/v1", ResourceVersion:"2312", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-wvzzj
W0911 06:49:44.570] I0911 06:49:43.990616   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184583-28530", Name:"frontend", UID:"3561a055-4694-43ef-9b14-d0feeb88d288", APIVersion:"apps/v1", ResourceVersion:"2312", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-sjmw7
W0911 06:49:44.570] I0911 06:49:43.991825   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184583-28530", Name:"frontend", UID:"3561a055-4694-43ef-9b14-d0feeb88d288", APIVersion:"apps/v1", ResourceVersion:"2312", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-rpjfr
W0911 06:49:44.570] E0911 06:49:44.013135   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:44.571] E0911 06:49:44.112490   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:44.571] E0911 06:49:44.178858   53012 replica_set.go:450] Sync "namespace-1568184583-28530/frontend" failed with replicasets.apps "frontend" not found
W0911 06:49:44.572] I0911 06:49:44.370651   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184583-28530", Name:"frontend-no-cascade", UID:"1c250cdc-1f73-4ac8-95c8-6b699505320f", APIVersion:"apps/v1", ResourceVersion:"2327", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-no-cascade-fc9kz
W0911 06:49:44.572] I0911 06:49:44.372758   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184583-28530", Name:"frontend-no-cascade", UID:"1c250cdc-1f73-4ac8-95c8-6b699505320f", APIVersion:"apps/v1", ResourceVersion:"2327", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-no-cascade-69mwz
W0911 06:49:44.573] I0911 06:49:44.379175   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184583-28530", Name:"frontend-no-cascade", UID:"1c250cdc-1f73-4ac8-95c8-6b699505320f", APIVersion:"apps/v1", ResourceVersion:"2327", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-no-cascade-8qv8w
W0911 06:49:44.629] E0911 06:49:44.629057   53012 replica_set.go:450] Sync "namespace-1568184583-28530/frontend-no-cascade" failed with replicasets.apps "frontend-no-cascade" not found
I0911 06:49:44.730] replicaset.apps "frontend-no-cascade" deleted
I0911 06:49:44.730] apps.sh:531: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
I0911 06:49:44.757] apps.sh:533: Successful get pods -l "tier=frontend" {{range.items}}{{(index .spec.containers 0).name}}:{{end}}: php-redis:php-redis:php-redis:
I0911 06:49:44.831] pod "frontend-no-cascade-69mwz" deleted
I0911 06:49:44.837] pod "frontend-no-cascade-8qv8w" deleted
I0911 06:49:44.843] pod "frontend-no-cascade-fc9kz" deleted
... skipping 6 lines ...
I0911 06:49:45.383] Namespace:    namespace-1568184583-28530
I0911 06:49:45.383] Selector:     app=guestbook,tier=frontend
I0911 06:49:45.383] Labels:       app=guestbook
I0911 06:49:45.383]               tier=frontend
I0911 06:49:45.383] Annotations:  <none>
I0911 06:49:45.384] Replicas:     3 current / 3 desired
I0911 06:49:45.384] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0911 06:49:45.384] Pod Template:
I0911 06:49:45.384]   Labels:  app=guestbook
I0911 06:49:45.384]            tier=frontend
I0911 06:49:45.384]   Containers:
I0911 06:49:45.384]    php-redis:
I0911 06:49:45.384]     Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 17 lines ...
I0911 06:49:45.503] Namespace:    namespace-1568184583-28530
I0911 06:49:45.503] Selector:     app=guestbook,tier=frontend
I0911 06:49:45.503] Labels:       app=guestbook
I0911 06:49:45.503]               tier=frontend
I0911 06:49:45.504] Annotations:  <none>
I0911 06:49:45.504] Replicas:     3 current / 3 desired
I0911 06:49:45.504] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0911 06:49:45.504] Pod Template:
I0911 06:49:45.504]   Labels:  app=guestbook
I0911 06:49:45.504]            tier=frontend
I0911 06:49:45.504]   Containers:
I0911 06:49:45.504]    php-redis:
I0911 06:49:45.504]     Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 18 lines ...
I0911 06:49:45.610] Namespace:    namespace-1568184583-28530
I0911 06:49:45.610] Selector:     app=guestbook,tier=frontend
I0911 06:49:45.610] Labels:       app=guestbook
I0911 06:49:45.611]               tier=frontend
I0911 06:49:45.611] Annotations:  <none>
I0911 06:49:45.611] Replicas:     3 current / 3 desired
I0911 06:49:45.611] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0911 06:49:45.611] Pod Template:
I0911 06:49:45.611]   Labels:  app=guestbook
I0911 06:49:45.611]            tier=frontend
I0911 06:49:45.611]   Containers:
I0911 06:49:45.612]    php-redis:
I0911 06:49:45.612]     Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 12 lines ...
I0911 06:49:45.708] Namespace:    namespace-1568184583-28530
I0911 06:49:45.708] Selector:     app=guestbook,tier=frontend
I0911 06:49:45.708] Labels:       app=guestbook
I0911 06:49:45.708]               tier=frontend
I0911 06:49:45.708] Annotations:  <none>
I0911 06:49:45.708] Replicas:     3 current / 3 desired
I0911 06:49:45.708] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0911 06:49:45.709] Pod Template:
I0911 06:49:45.709]   Labels:  app=guestbook
I0911 06:49:45.709]            tier=frontend
I0911 06:49:45.709]   Containers:
I0911 06:49:45.709]    php-redis:
I0911 06:49:45.709]     Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 10 lines ...
I0911 06:49:45.710]   Type    Reason            Age   From                   Message
I0911 06:49:45.710]   ----    ------            ----  ----                   -------
I0911 06:49:45.710]   Normal  SuccessfulCreate  0s    replicaset-controller  Created pod: frontend-m4vc4
I0911 06:49:45.710]   Normal  SuccessfulCreate  0s    replicaset-controller  Created pod: frontend-49nq4
I0911 06:49:45.710]   Normal  SuccessfulCreate  0s    replicaset-controller  Created pod: frontend-6v7cj
I0911 06:49:45.710]
W0911 06:49:45.811] E0911 06:49:45.014600   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:45.811] E0911 06:49:45.113687   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:45.812] I0911 06:49:45.151970   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184583-28530", Name:"frontend", UID:"085e3def-c31d-4e82-a7e2-0dc5f85f1892", APIVersion:"apps/v1", ResourceVersion:"2347", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-m4vc4
W0911 06:49:45.812] I0911 06:49:45.154239   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184583-28530", Name:"frontend", UID:"085e3def-c31d-4e82-a7e2-0dc5f85f1892", APIVersion:"apps/v1", ResourceVersion:"2347", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-49nq4
W0911 06:49:45.812] I0911 06:49:45.154912   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184583-28530", Name:"frontend", UID:"085e3def-c31d-4e82-a7e2-0dc5f85f1892", APIVersion:"apps/v1", ResourceVersion:"2347", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-6v7cj
I0911 06:49:45.913] Successful describe rs:
I0911 06:49:45.913] Name:         frontend
I0911 06:49:45.913] Namespace:    namespace-1568184583-28530
I0911 06:49:45.913] Selector:     app=guestbook,tier=frontend
I0911 06:49:45.913] Labels:       app=guestbook
I0911 06:49:45.913]               tier=frontend
I0911 06:49:45.913] Annotations:  <none>
I0911 06:49:45.913] Replicas:     3 current / 3 desired
I0911 06:49:45.914] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0911 06:49:45.914] Pod Template:
I0911 06:49:45.914]   Labels:  app=guestbook
I0911 06:49:45.914]            tier=frontend
I0911 06:49:45.914]   Containers:
I0911 06:49:45.914]    php-redis:
I0911 06:49:45.914]     Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 17 lines ...
I0911 06:49:45.934] Namespace:    namespace-1568184583-28530
I0911 06:49:45.934] Selector:     app=guestbook,tier=frontend
I0911 06:49:45.934] Labels:       app=guestbook
I0911 06:49:45.935]               tier=frontend
I0911 06:49:45.935] Annotations:  <none>
I0911 06:49:45.935] Replicas:     3 current / 3 desired
I0911 06:49:45.935] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0911 06:49:45.935] Pod Template:
I0911 06:49:45.935]   Labels:  app=guestbook
I0911 06:49:45.935]            tier=frontend
I0911 06:49:45.936]   Containers:
I0911 06:49:45.936]    php-redis:
I0911 06:49:45.936]     Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 17 lines ...
I0911 06:49:46.034] Namespace:    namespace-1568184583-28530
I0911 06:49:46.034] Selector:     app=guestbook,tier=frontend
I0911 06:49:46.034] Labels:       app=guestbook
I0911 06:49:46.035]               tier=frontend
I0911 06:49:46.035] Annotations:  <none>
I0911 06:49:46.035] Replicas:     3 current / 3 desired
I0911 06:49:46.035] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0911 06:49:46.035] Pod Template:
I0911 06:49:46.036]   Labels:  app=guestbook
I0911 06:49:46.036]            tier=frontend
I0911 06:49:46.036]   Containers:
I0911 06:49:46.036]    php-redis:
I0911 06:49:46.037]     Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 11 lines ...
I0911 06:49:46.148] Namespace:    namespace-1568184583-28530
I0911 06:49:46.148] Selector:     app=guestbook,tier=frontend
I0911 06:49:46.148] Labels:       app=guestbook
I0911 06:49:46.149]               tier=frontend
I0911 06:49:46.149] Annotations:  <none>
I0911 06:49:46.149] Replicas:     3 current / 3 desired
I0911 06:49:46.149] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0911 06:49:46.149] Pod Template:
I0911 06:49:46.149]   Labels:  app=guestbook
I0911 06:49:46.149]            tier=frontend
I0911 06:49:46.149]   Containers:
I0911 06:49:46.149]    php-redis:
I0911 06:49:46.149]     Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 99 lines ...
I0911 06:49:46.305] Tolerations:           <none>
I0911 06:49:46.305] Events:                <none>
I0911 06:49:46.371] apps.sh:566: Successful get rs frontend {{.spec.replicas}}: 3
I0911 06:49:46.455] replicaset.apps/frontend scaled
I0911 06:49:46.546] apps.sh:570: Successful get rs frontend {{.spec.replicas}}: 2
I0911 06:49:46.681] deployment.apps/scale-1 created
W0911 06:49:46.782] E0911 06:49:46.015773   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:46.782] E0911 06:49:46.114969   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:46.783] I0911 06:49:46.459995   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184583-28530", Name:"frontend", UID:"085e3def-c31d-4e82-a7e2-0dc5f85f1892", APIVersion:"apps/v1", ResourceVersion:"2357", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: frontend-m4vc4
W0911 06:49:46.783] I0911 06:49:46.684386   53012 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1568184583-28530", Name:"scale-1", UID:"511333ce-3476-4056-b887-6520f6b34a67", APIVersion:"apps/v1", ResourceVersion:"2363", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set scale-1-5c5565bcd9 to 1
W0911 06:49:46.784] I0911 06:49:46.686586   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184583-28530", Name:"scale-1-5c5565bcd9", UID:"a504dbbb-3952-42e7-8335-1aa69782194c", APIVersion:"apps/v1", ResourceVersion:"2364", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-1-5c5565bcd9-tpw7r
W0911 06:49:46.820] I0911 06:49:46.819936   53012 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1568184583-28530", Name:"scale-2", UID:"98d8a50d-4cb5-4560-85f2-04d145db154c", APIVersion:"apps/v1", ResourceVersion:"2373", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set scale-2-5c5565bcd9 to 1
W0911 06:49:46.823] I0911 06:49:46.823161   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184583-28530", Name:"scale-2-5c5565bcd9", UID:"1c25bbdc-8a60-4b83-b77e-8e4b689017d8", APIVersion:"apps/v1", ResourceVersion:"2374", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-2-5c5565bcd9-bgjdt
I0911 06:49:46.924] deployment.apps/scale-2 created
... skipping 15 lines ...
I0911 06:49:47.986] replicaset.apps "frontend" deleted
I0911 06:49:48.077] deployment.apps "scale-1" deleted
I0911 06:49:48.080] deployment.apps "scale-2" deleted
I0911 06:49:48.085] deployment.apps "scale-3" deleted
W0911 06:49:48.186] I0911 06:49:46.962546   53012 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1568184583-28530", Name:"scale-3", UID:"dec1a201-6b38-405e-906d-c07574270e39", APIVersion:"apps/v1", ResourceVersion:"2383", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set scale-3-5c5565bcd9 to 1
W0911 06:49:48.187] I0911 06:49:46.965407   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184583-28530", Name:"scale-3-5c5565bcd9", UID:"f3ae5d26-1f72-4d6a-89be-8ad2e127625b", APIVersion:"apps/v1", ResourceVersion:"2384", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-3-5c5565bcd9-qc42s
W0911 06:49:48.187] E0911 06:49:47.017172   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:48.187] E0911 06:49:47.117709   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:48.188] I0911 06:49:47.289742   53012 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1568184583-28530", Name:"scale-1", UID:"511333ce-3476-4056-b887-6520f6b34a67", APIVersion:"apps/v1", ResourceVersion:"2393", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set scale-1-5c5565bcd9 to 2
W0911 06:49:48.188] I0911 06:49:47.293504   53012 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1568184583-28530", Name:"scale-2", UID:"98d8a50d-4cb5-4560-85f2-04d145db154c", APIVersion:"apps/v1", ResourceVersion:"2395", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set scale-2-5c5565bcd9 to 2
W0911 06:49:48.188] I0911 06:49:47.293834   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184583-28530", Name:"scale-1-5c5565bcd9", UID:"a504dbbb-3952-42e7-8335-1aa69782194c", APIVersion:"apps/v1", ResourceVersion:"2394", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-1-5c5565bcd9-kh8g5
W0911 06:49:48.189] I0911 06:49:47.295525   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184583-28530", Name:"scale-2-5c5565bcd9", UID:"1c25bbdc-8a60-4b83-b77e-8e4b689017d8", APIVersion:"apps/v1", ResourceVersion:"2398", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-2-5c5565bcd9-vvvfm
W0911 06:49:48.189] I0911 06:49:47.622831   53012 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1568184583-28530", Name:"scale-1", UID:"511333ce-3476-4056-b887-6520f6b34a67", APIVersion:"apps/v1", ResourceVersion:"2413", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set scale-1-5c5565bcd9 to 3
W0911 06:49:48.189] I0911 06:49:47.626083   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184583-28530", Name:"scale-1-5c5565bcd9", UID:"a504dbbb-3952-42e7-8335-1aa69782194c", APIVersion:"apps/v1", ResourceVersion:"2414", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-1-5c5565bcd9-hc86g
W0911 06:49:48.190] I0911 06:49:47.632784   53012 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1568184583-28530", Name:"scale-2", UID:"98d8a50d-4cb5-4560-85f2-04d145db154c", APIVersion:"apps/v1", ResourceVersion:"2415", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set scale-2-5c5565bcd9 to 3
W0911 06:49:48.190] I0911 06:49:47.635630   53012 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1568184583-28530", Name:"scale-3", UID:"dec1a201-6b38-405e-906d-c07574270e39", APIVersion:"apps/v1", ResourceVersion:"2421", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set scale-3-5c5565bcd9 to 3
W0911 06:49:48.190] I0911 06:49:47.636789   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184583-28530", Name:"scale-2-5c5565bcd9", UID:"1c25bbdc-8a60-4b83-b77e-8e4b689017d8", APIVersion:"apps/v1", ResourceVersion:"2423", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-2-5c5565bcd9-cqhg6
W0911 06:49:48.191] I0911 06:49:47.637678   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184583-28530", Name:"scale-3-5c5565bcd9", UID:"f3ae5d26-1f72-4d6a-89be-8ad2e127625b", APIVersion:"apps/v1", ResourceVersion:"2425", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-3-5c5565bcd9-6j8v4
W0911 06:49:48.191] I0911 06:49:47.640519   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184583-28530", Name:"scale-3-5c5565bcd9", UID:"f3ae5d26-1f72-4d6a-89be-8ad2e127625b", APIVersion:"apps/v1", ResourceVersion:"2425", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-3-5c5565bcd9-qwmzg
W0911 06:49:48.191] I0911 06:49:47.969029   53012 horizontal.go:341] Horizontal Pod Autoscaler nginx-deployment has been deleted in namespace-1568184569-16670
W0911 06:49:48.192] E0911 06:49:48.018346   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:48.192] E0911 06:49:48.119402   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:48.259] I0911 06:49:48.259013   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184583-28530", Name:"frontend", UID:"1c7bc6ed-65e1-49ba-9dd0-0c369b449fe9", APIVersion:"apps/v1", ResourceVersion:"2474", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-8jlbq
W0911 06:49:48.262] I0911 06:49:48.261773   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184583-28530", Name:"frontend", UID:"1c7bc6ed-65e1-49ba-9dd0-0c369b449fe9", APIVersion:"apps/v1", ResourceVersion:"2474", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-lkml5
W0911 06:49:48.263] I0911 06:49:48.262002   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184583-28530", Name:"frontend", UID:"1c7bc6ed-65e1-49ba-9dd0-0c369b449fe9", APIVersion:"apps/v1", ResourceVersion:"2474", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-ldrfv
I0911 06:49:48.363] replicaset.apps/frontend created
I0911 06:49:48.364] apps.sh:596: Successful get rs frontend {{.spec.replicas}}: 3
I0911 06:49:48.442] service/frontend exposed
... skipping 11 lines ...
I0911 06:49:49.385] apps.sh:616: Successful get rs frontend {{.metadata.generation}}: 4
I0911 06:49:49.468] apps.sh:620: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: frontend:
I0911 06:49:49.579] replicaset.apps "frontend" deleted
I0911 06:49:49.676] apps.sh:624: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
I0911 06:49:49.753] apps.sh:628: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
I0911 06:49:49.894] replicaset.apps/frontend created
W0911 06:49:49.995] E0911 06:49:49.019423   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:49.996] E0911 06:49:49.120470   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:49.996] I0911 06:49:49.897321   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184583-28530", Name:"frontend", UID:"21ace343-177b-45f4-aa0c-25d8a5e01741", APIVersion:"apps/v1", ResourceVersion:"2509", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-rd7sg
W0911 06:49:49.997] I0911 06:49:49.899944   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184583-28530", Name:"frontend", UID:"21ace343-177b-45f4-aa0c-25d8a5e01741", APIVersion:"apps/v1", ResourceVersion:"2509", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-fs4vd
W0911 06:49:49.997] I0911 06:49:49.901061   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184583-28530", Name:"frontend", UID:"21ace343-177b-45f4-aa0c-25d8a5e01741", APIVersion:"apps/v1", ResourceVersion:"2509", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-fmgfh
W0911 06:49:50.021] E0911 06:49:50.020953   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:50.043] I0911 06:49:50.042562   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184583-28530", Name:"redis-slave", UID:"98b9d684-6e4d-4024-b1a3-940300e64c07", APIVersion:"apps/v1", ResourceVersion:"2518", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-slave-nlqk7
W0911 06:49:50.047] I0911 06:49:50.046619   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184583-28530", Name:"redis-slave", UID:"98b9d684-6e4d-4024-b1a3-940300e64c07", APIVersion:"apps/v1", ResourceVersion:"2518", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-slave-bk9rh
W0911 06:49:50.122] E0911 06:49:50.121764   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0911 06:49:50.223] replicaset.apps/redis-slave created
I0911 06:49:50.223] apps.sh:633: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: frontend:redis-slave:
I0911 06:49:50.224] apps.sh:637: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: frontend:redis-slave:
I0911 06:49:50.294] replicaset.apps "frontend" deleted
I0911 06:49:50.297] replicaset.apps "redis-slave" deleted
I0911 06:49:50.398] apps.sh:641: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
... skipping 4 lines ...
I0911 06:49:50.880] apps.sh:652: Successful get hpa frontend {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 1 2 70
I0911 06:49:50.955] horizontalpodautoscaler.autoscaling "frontend" deleted
I0911 06:49:51.033] horizontalpodautoscaler.autoscaling/frontend autoscaled
W0911 06:49:51.134] I0911 06:49:50.625745   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184583-28530", Name:"frontend", UID:"ac60d7c1-4ac1-4e66-a0f3-cb17c327317c", APIVersion:"apps/v1", ResourceVersion:"2538", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-sclpw
W0911 06:49:51.136] I0911 06:49:50.629314   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184583-28530", Name:"frontend", UID:"ac60d7c1-4ac1-4e66-a0f3-cb17c327317c", APIVersion:"apps/v1", ResourceVersion:"2538", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-lfnmd
W0911 06:49:51.137] I0911 06:49:50.629600   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184583-28530", Name:"frontend", UID:"ac60d7c1-4ac1-4e66-a0f3-cb17c327317c", APIVersion:"apps/v1", ResourceVersion:"2538", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-rb77g
W0911 06:49:51.138] E0911 06:49:51.022209   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:51.138] E0911 06:49:51.124594   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0911 06:49:51.239] apps.sh:656: Successful get hpa frontend {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 2 3 80
I0911 06:49:51.239] horizontalpodautoscaler.autoscaling "frontend" deleted
W0911 06:49:51.340] Error: required flag(s) "max" not set
W0911 06:49:51.341] 
W0911 06:49:51.341] 
W0911 06:49:51.341] Examples:
W0911 06:49:51.341]   # Auto scale a deployment "foo", with the number of pods between 2 and 10, no target CPU utilization specified so a default autoscaling policy will be used:
W0911 06:49:51.341]   kubectl autoscale deployment foo --min=2 --max=10
W0911 06:49:51.342]   
... skipping 42 lines ...
I0911 06:49:52.237] apps.sh:482: Successful get statefulset nginx {{.status.observedGeneration}}: 2
I0911 06:49:52.399] statefulset.apps/nginx restarted
I0911 06:49:52.485] apps.sh:490: Successful get statefulset nginx {{.status.observedGeneration}}: 3
I0911 06:49:52.560] statefulset.apps "nginx" deleted
I0911 06:49:52.654] Waiting for pods: , found nginx-0
W0911 06:49:52.754] I0911 06:49:51.809575   49461 controller.go:606] quota admission added evaluator for: statefulsets.apps
W0911 06:49:52.755] E0911 06:49:52.023527   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:52.755] I0911 06:49:52.069602   53012 event.go:255] Event(v1.ObjectReference{Kind:"StatefulSet", Namespace:"namespace-1568184591-29264", Name:"nginx", UID:"90b485ea-949e-43fa-a074-06e3d8ade846", APIVersion:"apps/v1", ResourceVersion:"2563", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' create Pod nginx-0 in StatefulSet nginx successful
W0911 06:49:52.755] E0911 06:49:52.126413   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:52.755] I0911 06:49:52.558888   53012 stateful_set.go:420] StatefulSet has been deleted namespace-1568184591-29264/nginx
W0911 06:49:53.029] E0911 06:49:53.027798   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:53.129] E0911 06:49:53.128539   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0911 06:49:53.774] +++ exit code: 0
I0911 06:49:53.805] Recording: run_statefulset_history_tests
I0911 06:49:53.805] Running command: run_statefulset_history_tests
I0911 06:49:53.825] 
I0911 06:49:53.827] +++ Running case: test-cmd.run_statefulset_history_tests 
I0911 06:49:53.830] +++ working dir: /go/src/k8s.io/kubernetes
I0911 06:49:53.832] +++ command: run_statefulset_history_tests
I0911 06:49:53.848] +++ [0911 06:49:53] Creating namespace namespace-1568184593-21788
I0911 06:49:53.929] namespace/namespace-1568184593-21788 created
I0911 06:49:54.003] Context "test" modified.
I0911 06:49:54.010] +++ [0911 06:49:54] Testing kubectl(v1:statefulsets, v1:controllerrevisions)
I0911 06:49:54.097] apps.sh:418: Successful get statefulset {{range.items}}{{.metadata.name}}:{{end}}: 
I0911 06:49:54.248] statefulset.apps/nginx created
W0911 06:49:54.349] E0911 06:49:54.029023   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:54.349] E0911 06:49:54.129855   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0911 06:49:54.451] apps.sh:422: Successful get controllerrevisions {{range.items}}{{.metadata.annotations}}:{{end}}: map[kubectl.kubernetes.io/last-applied-configuration:{"apiVersion":"apps/v1","kind":"StatefulSet","metadata":{"annotations":{"kubernetes.io/change-cause":"kubectl apply --filename=hack/testdata/rollingupdate-statefulset.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true"},"labels":{"app":"nginx-statefulset"},"name":"nginx","namespace":"namespace-1568184593-21788"},"spec":{"replicas":0,"selector":{"matchLabels":{"app":"nginx-statefulset"}},"serviceName":"nginx","template":{"metadata":{"labels":{"app":"nginx-statefulset"}},"spec":{"containers":[{"command":["sh","-c","while true; do sleep 1; done"],"image":"k8s.gcr.io/nginx-slim:0.7","name":"nginx","ports":[{"containerPort":80,"name":"web"}]}],"terminationGracePeriodSeconds":5}},"updateStrategy":{"type":"RollingUpdate"}}}
I0911 06:49:54.451]  kubernetes.io/change-cause:kubectl apply --filename=hack/testdata/rollingupdate-statefulset.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true]:
I0911 06:49:54.452] statefulset.apps/nginx skipped rollback (current template already matches revision 1)
I0911 06:49:54.548] apps.sh:425: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.7:
I0911 06:49:54.636] apps.sh:426: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
I0911 06:49:54.796] statefulset.apps/nginx configured
... skipping 15 lines ...
I0911 06:49:55.274]       -c
I0911 06:49:55.274]       while true; do sleep 1; done
I0911 06:49:55.275]     Environment:	<none>
I0911 06:49:55.275]     Mounts:	<none>
I0911 06:49:55.275]   Volumes:	<none>
I0911 06:49:55.275]  (dry run)
W0911 06:49:55.376] E0911 06:49:55.030338   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:55.378] E0911 06:49:55.131441   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0911 06:49:55.479] apps.sh:435: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.8:
I0911 06:49:55.537] apps.sh:436: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/pause:2.0:
I0911 06:49:55.627] apps.sh:437: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 2
I0911 06:49:55.725] statefulset.apps/nginx rolled back
I0911 06:49:55.811] apps.sh:440: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.7:
I0911 06:49:55.900] apps.sh:441: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
I0911 06:49:56.005] Successful
I0911 06:49:56.005] message:error: unable to find specified revision 1000000 in history
I0911 06:49:56.005] has:unable to find specified revision
I0911 06:49:56.092] apps.sh:445: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.7:
I0911 06:49:56.175] apps.sh:446: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
I0911 06:49:56.276] statefulset.apps/nginx rolled back
I0911 06:49:56.368] apps.sh:449: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.8:
I0911 06:49:56.456] apps.sh:450: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/pause:2.0:
... skipping 7 lines ...
I0911 06:49:56.781] +++ working dir: /go/src/k8s.io/kubernetes
I0911 06:49:56.783] +++ command: run_lists_tests
I0911 06:49:56.794] +++ [0911 06:49:56] Creating namespace namespace-1568184596-16075
I0911 06:49:56.858] namespace/namespace-1568184596-16075 created
I0911 06:49:56.933] Context "test" modified.
I0911 06:49:56.942] +++ [0911 06:49:56] Testing kubectl(v1:lists)
W0911 06:49:57.043] E0911 06:49:56.031521   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:57.044] E0911 06:49:56.132988   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:57.044] I0911 06:49:56.621780   53012 stateful_set.go:420] StatefulSet has been deleted namespace-1568184593-21788/nginx
W0911 06:49:57.045] E0911 06:49:57.033637   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:57.134] I0911 06:49:57.133926   53012 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1568184596-16075", Name:"list-deployment-test", UID:"03df3b97-9900-4359-9476-f504f4eab418", APIVersion:"apps/v1", ResourceVersion:"2601", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set list-deployment-test-7cd8c5ff6d to 1
W0911 06:49:57.135] E0911 06:49:57.133979   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:57.141] I0911 06:49:57.140857   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184596-16075", Name:"list-deployment-test-7cd8c5ff6d", UID:"ef992e75-d5dd-4663-a7ee-c74bb915cea1", APIVersion:"apps/v1", ResourceVersion:"2602", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: list-deployment-test-7cd8c5ff6d-nnrbh
I0911 06:49:57.242] service/list-service-test created
I0911 06:49:57.243] deployment.apps/list-deployment-test created
I0911 06:49:57.243] service "list-service-test" deleted
I0911 06:49:57.243] deployment.apps "list-deployment-test" deleted
I0911 06:49:57.255] +++ exit code: 0
... skipping 17 lines ...
I0911 06:49:58.107] NAME           TYPE        CLUSTER-IP   EXTERNAL-IP   PORT(S)   AGE
I0911 06:49:58.107] service/mock   ClusterIP   10.0.0.29    <none>        99/TCP    1s
I0911 06:49:58.108] 
I0911 06:49:58.108] NAME                         DESIRED   CURRENT   READY   AGE
I0911 06:49:58.108] replicationcontroller/mock   1         1         0       1s
W0911 06:49:58.209] I0911 06:49:57.834503   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1568184597-11324", Name:"mock", UID:"c59ab25c-120f-4bdc-9836-508b8e5e4a67", APIVersion:"v1", ResourceVersion:"2623", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-hxfmb
W0911 06:49:58.209] E0911 06:49:58.035011   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:49:58.209] E0911 06:49:58.139992   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0911 06:49:58.310] Name:              mock
I0911 06:49:58.310] Namespace:         namespace-1568184597-11324
I0911 06:49:58.310] Labels:            app=mock
I0911 06:49:58.310] Annotations:       <none>
I0911 06:49:58.310] Selector:          app=mock
I0911 06:49:58.310] Type:              ClusterIP
... skipping 8 lines ...
I0911 06:49:58.311] Name:         mock
I0911 06:49:58.311] Namespace:    namespace-1568184597-11324
I0911 06:49:58.311] Selector:     app=mock
I0911 06:49:58.311] Labels:       app=mock
I0911 06:49:58.311] Annotations:  <none>
I0911 06:49:58.312] Replicas:     1 current / 1 desired
I0911 06:49:58.312] Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
I0911 06:49:58.312] Pod Template:
I0911 06:49:58.312]   Labels:  app=mock
I0911 06:49:58.312]   Containers:
I0911 06:49:58.312]    mock-container:
I0911 06:49:58.312]     Image:        k8s.gcr.io/pause:2.0
I0911 06:49:58.312]     Port:         9949/TCP
... skipping 35 lines ...
I0911 06:50:00.138] generic-resources.sh:80: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: mock:
I0911 06:50:00.232] NAME           TYPE        CLUSTER-IP   EXTERNAL-IP   PORT(S)   AGE
I0911 06:50:00.233] service/mock   ClusterIP   10.0.0.115   <none>        99/TCP    1s
I0911 06:50:00.233] 
I0911 06:50:00.233] NAME                         DESIRED   CURRENT   READY   AGE
I0911 06:50:00.233] replicationcontroller/mock   1         1         0       1s
W0911 06:50:00.334] E0911 06:49:59.036434   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:50:00.334] E0911 06:49:59.141337   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:50:00.334] I0911 06:49:59.973127   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1568184597-11324", Name:"mock", UID:"383b7ae3-aac2-4d48-8797-8efb9fe88227", APIVersion:"v1", ResourceVersion:"2661", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-drshw
W0911 06:50:00.335] E0911 06:50:00.038245   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:50:00.335] E0911 06:50:00.144182   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0911 06:50:00.435] Name:              mock
I0911 06:50:00.435] Namespace:         namespace-1568184597-11324
I0911 06:50:00.436] Labels:            app=mock
I0911 06:50:00.436] Annotations:       <none>
I0911 06:50:00.436] Selector:          app=mock
I0911 06:50:00.436] Type:              ClusterIP
... skipping 8 lines ...
I0911 06:50:00.437] Name:         mock
I0911 06:50:00.438] Namespace:    namespace-1568184597-11324
I0911 06:50:00.438] Selector:     app=mock
I0911 06:50:00.438] Labels:       app=mock
I0911 06:50:00.438] Annotations:  <none>
I0911 06:50:00.438] Replicas:     1 current / 1 desired
I0911 06:50:00.439] Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
I0911 06:50:00.439] Pod Template:
I0911 06:50:00.439]   Labels:  app=mock
I0911 06:50:00.439]   Containers:
I0911 06:50:00.439]    mock-container:
I0911 06:50:00.439]     Image:        k8s.gcr.io/pause:2.0
I0911 06:50:00.439]     Port:         9949/TCP
... skipping 35 lines ...
I0911 06:50:02.147] NAME           TYPE        CLUSTER-IP   EXTERNAL-IP   PORT(S)   AGE
I0911 06:50:02.147] service/mock   ClusterIP   10.0.0.35    <none>        99/TCP    1s
I0911 06:50:02.148] 
I0911 06:50:02.148] NAME                         DESIRED   CURRENT   READY   AGE
I0911 06:50:02.148] replicationcontroller/mock   1         1         0       1s
W0911 06:50:02.249] I0911 06:50:00.575387   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1568184597-11324", Name:"mock", UID:"57080743-7251-4d69-9b21-d90311df6d2c", APIVersion:"v1", ResourceVersion:"2675", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-fsdjz
W0911 06:50:02.249] E0911 06:50:01.039489   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:50:02.249] E0911 06:50:01.145073   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:50:02.250] I0911 06:50:01.911555   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1568184597-11324", Name:"mock", UID:"a5fc79ff-5eae-4c53-803d-6b9e0cd03150", APIVersion:"v1", ResourceVersion:"2698", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-7qgr6
W0911 06:50:02.250] E0911 06:50:02.040888   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:50:02.250] E0911 06:50:02.146101   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0911 06:50:02.351] Name:              mock
I0911 06:50:02.351] Namespace:         namespace-1568184597-11324
I0911 06:50:02.351] Labels:            app=mock
I0911 06:50:02.352] Annotations:       <none>
I0911 06:50:02.352] Selector:          app=mock
I0911 06:50:02.352] Type:              ClusterIP
... skipping 8 lines ...
I0911 06:50:02.353] Name:         mock
I0911 06:50:02.353] Namespace:    namespace-1568184597-11324
I0911 06:50:02.353] Selector:     app=mock
I0911 06:50:02.353] Labels:       app=mock
I0911 06:50:02.354] Annotations:  <none>
I0911 06:50:02.354] Replicas:     1 current / 1 desired
I0911 06:50:02.354] Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
I0911 06:50:02.354] Pod Template:
I0911 06:50:02.354]   Labels:  app=mock
I0911 06:50:02.354]   Containers:
I0911 06:50:02.354]    mock-container:
I0911 06:50:02.355]     Image:        k8s.gcr.io/pause:2.0
I0911 06:50:02.355]     Port:         9949/TCP
... skipping 32 lines ...
I0911 06:50:03.854] replicationcontroller/mock2 created
I0911 06:50:03.942] generic-resources.sh:78: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: mock:mock2:
I0911 06:50:04.009] NAME    DESIRED   CURRENT   READY   AGE
I0911 06:50:04.009] mock    1         1         0       1s
I0911 06:50:04.009] mock2   1         1         0       1s
W0911 06:50:04.110] I0911 06:50:02.513672   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1568184597-11324", Name:"mock", UID:"b4c806bd-da20-4a2e-be8d-aa2b4807bfb5", APIVersion:"v1", ResourceVersion:"2713", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-tvlz7
W0911 06:50:04.111] E0911 06:50:03.042067   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:50:04.111] E0911 06:50:03.147257   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:50:04.111] I0911 06:50:03.855354   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1568184597-11324", Name:"mock", UID:"9585b109-b6d8-4ccb-9db0-a1f327f3c243", APIVersion:"v1", ResourceVersion:"2732", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-5g2w8
W0911 06:50:04.112] I0911 06:50:03.858050   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1568184597-11324", Name:"mock2", UID:"73dd5190-0cc4-4049-897d-27522dff2706", APIVersion:"v1", ResourceVersion:"2733", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock2-lhrtg
W0911 06:50:04.112] E0911 06:50:04.043462   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:50:04.149] E0911 06:50:04.148553   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0911 06:50:04.249] Name:         mock
I0911 06:50:04.249] Namespace:    namespace-1568184597-11324
I0911 06:50:04.250] Selector:     app=mock
I0911 06:50:04.250] Labels:       app=mock
I0911 06:50:04.250]               status=replaced
I0911 06:50:04.250] Annotations:  <none>
I0911 06:50:04.250] Replicas:     1 current / 1 desired
I0911 06:50:04.250] Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
I0911 06:50:04.250] Pod Template:
I0911 06:50:04.250]   Labels:  app=mock
I0911 06:50:04.250]   Containers:
I0911 06:50:04.250]    mock-container:
I0911 06:50:04.250]     Image:        k8s.gcr.io/pause:2.0
I0911 06:50:04.251]     Port:         9949/TCP
... skipping 11 lines ...
I0911 06:50:04.252] Namespace:    namespace-1568184597-11324
I0911 06:50:04.252] Selector:     app=mock2
I0911 06:50:04.252] Labels:       app=mock2
I0911 06:50:04.252]               status=replaced
I0911 06:50:04.252] Annotations:  <none>
I0911 06:50:04.252] Replicas:     1 current / 1 desired
I0911 06:50:04.252] Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
I0911 06:50:04.252] Pod Template:
I0911 06:50:04.252]   Labels:  app=mock2
I0911 06:50:04.252]   Containers:
I0911 06:50:04.252]    mock-container:
I0911 06:50:04.252]     Image:        k8s.gcr.io/pause:2.0
I0911 06:50:04.252]     Port:         9949/TCP
... skipping 33 lines ...
I0911 06:50:05.772] generic-resources.sh:70: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: mock:mock2:
I0911 06:50:05.842] (BNAME    TYPE        CLUSTER-IP   EXTERNAL-IP   PORT(S)   AGE
I0911 06:50:05.843] mock    ClusterIP   10.0.0.212   <none>        99/TCP    0s
I0911 06:50:05.843] mock2   ClusterIP   10.0.0.60    <none>        99/TCP    0s
W0911 06:50:05.943] I0911 06:50:04.342640   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1568184597-11324", Name:"mock", UID:"aa4930d4-d33b-44fb-b287-3889cffa6c10", APIVersion:"v1", ResourceVersion:"2748", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-7h469
W0911 06:50:05.944] I0911 06:50:04.343086   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1568184597-11324", Name:"mock2", UID:"9ff388b9-234d-4ce2-8d0f-94284d7ad7b9", APIVersion:"v1", ResourceVersion:"2749", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock2-9w2fm
W0911 06:50:05.944] E0911 06:50:05.044575   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:50:05.944] E0911 06:50:05.150126   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:50:05.944] I0911 06:50:05.790863   53012 horizontal.go:341] Horizontal Pod Autoscaler frontend has been deleted in namespace-1568184583-28530
I0911 06:50:06.045] Name:              mock
I0911 06:50:06.045] Namespace:         namespace-1568184597-11324
I0911 06:50:06.045] Labels:            app=mock
I0911 06:50:06.045] Annotations:       <none>
I0911 06:50:06.045] Selector:          app=mock
... skipping 60 lines ...
I0911 06:50:08.636] Context "test" modified.
I0911 06:50:08.642] +++ [0911 06:50:08] Testing persistent volumes
I0911 06:50:08.721] storage.sh:30: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: 
I0911 06:50:08.857] (Bpersistentvolume/pv0001 created
I0911 06:50:08.942] storage.sh:33: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: pv0001:
I0911 06:50:09.008] (Bpersistentvolume "pv0001" deleted
W0911 06:50:09.109] E0911 06:50:06.048175   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:50:09.109] E0911 06:50:06.151152   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:50:09.110] E0911 06:50:07.049661   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:50:09.110] E0911 06:50:07.152361   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:50:09.110] I0911 06:50:07.824122   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1568184597-11324", Name:"mock", UID:"bef730bf-aac7-47b0-a72e-1f34395d2ed8", APIVersion:"v1", ResourceVersion:"2810", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-5vnvd
W0911 06:50:09.111] E0911 06:50:08.051132   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:50:09.111] E0911 06:50:08.154577   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:50:09.111] E0911 06:50:09.052387   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:50:09.156] E0911 06:50:09.155701   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0911 06:50:09.256] persistentvolume/pv0002 created
I0911 06:50:09.257] storage.sh:36: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: pv0002:
I0911 06:50:09.294] (Bpersistentvolume "pv0002" deleted
I0911 06:50:09.430] persistentvolume/pv0003 created
I0911 06:50:09.518] storage.sh:39: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: pv0003:
I0911 06:50:09.597] (Bpersistentvolume "pv0003" deleted
... skipping 21 lines ...
I0911 06:50:10.262] Context "test" modified.
I0911 06:50:10.268] +++ [0911 06:50:10] Testing persistent volumes claims
I0911 06:50:10.351] storage.sh:64: Successful get pvc {{range.items}}{{.metadata.name}}:{{end}}: 
I0911 06:50:10.499] (Bpersistentvolumeclaim/myclaim-1 created
I0911 06:50:10.583] storage.sh:67: Successful get pvc {{range.items}}{{.metadata.name}}:{{end}}: myclaim-1:
I0911 06:50:10.649] (Bpersistentvolumeclaim "myclaim-1" deleted
W0911 06:50:10.750] E0911 06:50:10.053623   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:50:10.750] E0911 06:50:10.157254   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:50:10.751] I0911 06:50:10.500475   53012 event.go:255] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1568184610-14130", Name:"myclaim-1", UID:"63f358f4-a1d9-42e3-9989-c599d961fdf5", APIVersion:"v1", ResourceVersion:"2847", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
W0911 06:50:10.751] I0911 06:50:10.503156   53012 event.go:255] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1568184610-14130", Name:"myclaim-1", UID:"63f358f4-a1d9-42e3-9989-c599d961fdf5", APIVersion:"v1", ResourceVersion:"2849", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
W0911 06:50:10.751] I0911 06:50:10.648567   53012 event.go:255] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1568184610-14130", Name:"myclaim-1", UID:"63f358f4-a1d9-42e3-9989-c599d961fdf5", APIVersion:"v1", ResourceVersion:"2851", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
W0911 06:50:10.790] I0911 06:50:10.789316   53012 event.go:255] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1568184610-14130", Name:"myclaim-2", UID:"39f97618-472b-48c7-8ca5-14f94095bec1", APIVersion:"v1", ResourceVersion:"2854", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
W0911 06:50:10.792] I0911 06:50:10.791524   53012 event.go:255] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1568184610-14130", Name:"myclaim-2", UID:"39f97618-472b-48c7-8ca5-14f94095bec1", APIVersion:"v1", ResourceVersion:"2856", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
I0911 06:50:10.892] persistentvolumeclaim/myclaim-2 created
I0911 06:50:10.892] storage.sh:71: Successful get pvc {{range.items}}{{.metadata.name}}:{{end}}: myclaim-2:
I0911 06:50:10.941] (Bpersistentvolumeclaim "myclaim-2" deleted
W0911 06:50:11.042] I0911 06:50:10.941099   53012 event.go:255] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1568184610-14130", Name:"myclaim-2", UID:"39f97618-472b-48c7-8ca5-14f94095bec1", APIVersion:"v1", ResourceVersion:"2858", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
W0911 06:50:11.055] E0911 06:50:11.054821   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:50:11.084] I0911 06:50:11.083404   53012 event.go:255] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1568184610-14130", Name:"myclaim-3", UID:"0e9f3431-e4ff-4a2b-88b7-fd8a017e3023", APIVersion:"v1", ResourceVersion:"2861", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
W0911 06:50:11.086] I0911 06:50:11.085956   53012 event.go:255] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1568184610-14130", Name:"myclaim-3", UID:"0e9f3431-e4ff-4a2b-88b7-fd8a017e3023", APIVersion:"v1", ResourceVersion:"2863", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
W0911 06:50:11.158] E0911 06:50:11.158333   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:50:11.244] I0911 06:50:11.243944   53012 event.go:255] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1568184610-14130", Name:"myclaim-3", UID:"0e9f3431-e4ff-4a2b-88b7-fd8a017e3023", APIVersion:"v1", ResourceVersion:"2865", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
I0911 06:50:11.345] persistentvolumeclaim/myclaim-3 created
I0911 06:50:11.346] storage.sh:75: Successful get pvc {{range.items}}{{.metadata.name}}:{{end}}: myclaim-3:
I0911 06:50:11.346] (Bpersistentvolumeclaim "myclaim-3" deleted
I0911 06:50:11.346] storage.sh:78: Successful get pvc {{range.items}}{{.metadata.name}}:{{end}}: 
I0911 06:50:11.347] (B+++ exit code: 0
... skipping 186 lines ...
I0911 06:50:12.614]   --------           --------  ------
I0911 06:50:12.614]   cpu                0 (0%)    0 (0%)
I0911 06:50:12.614]   memory             0 (0%)    0 (0%)
I0911 06:50:12.614]   ephemeral-storage  0 (0%)    0 (0%)
I0911 06:50:12.614] Events:              <none>
I0911 06:50:12.614] (B
W0911 06:50:12.715] E0911 06:50:12.056025   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:50:12.715] E0911 06:50:12.159574   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0911 06:50:12.816] Successful describe nodes:
I0911 06:50:12.816] Name:               127.0.0.1
I0911 06:50:12.816] Roles:              <none>
I0911 06:50:12.816] Labels:             <none>
I0911 06:50:12.816] Annotations:        node.alpha.kubernetes.io/ttl: 0
I0911 06:50:12.816] CreationTimestamp:  Wed, 11 Sep 2019 06:46:21 +0000
... skipping 233 lines ...
I0911 06:50:14.116] yes
I0911 06:50:14.117] has:the server doesn't have a resource type
I0911 06:50:14.187] Successful
I0911 06:50:14.188] message:yes
I0911 06:50:14.188] has:yes
I0911 06:50:14.260] Successful
I0911 06:50:14.260] message:error: --subresource can not be used with NonResourceURL
I0911 06:50:14.261] has:subresource can not be used with NonResourceURL
I0911 06:50:14.332] Successful
I0911 06:50:14.407] Successful
I0911 06:50:14.407] message:yes
I0911 06:50:14.407] 0
I0911 06:50:14.408] has:0
... skipping 27 lines ...
I0911 06:50:14.962] role.rbac.authorization.k8s.io/testing-R reconciled
I0911 06:50:15.047] legacy-script.sh:797: Successful get rolebindings -n some-other-random -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}: testing-RB:
I0911 06:50:15.134] (Blegacy-script.sh:798: Successful get roles -n some-other-random -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}: testing-R:
I0911 06:50:15.225] (Blegacy-script.sh:799: Successful get clusterrolebindings -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}: testing-CRB:
I0911 06:50:15.313] (Blegacy-script.sh:800: Successful get clusterroles -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}: testing-CR:
I0911 06:50:15.392] (BSuccessful
I0911 06:50:15.393] message:error: only rbac.authorization.k8s.io/v1 is supported: not *v1beta1.ClusterRole
I0911 06:50:15.393] has:only rbac.authorization.k8s.io/v1 is supported
I0911 06:50:15.486] rolebinding.rbac.authorization.k8s.io "testing-RB" deleted
I0911 06:50:15.495] role.rbac.authorization.k8s.io "testing-R" deleted
I0911 06:50:15.509] clusterrole.rbac.authorization.k8s.io "testing-CR" deleted
I0911 06:50:15.521] clusterrolebinding.rbac.authorization.k8s.io "testing-CRB" deleted
I0911 06:50:15.537] Recording: run_retrieve_multiple_tests
... skipping 13 lines ...
I0911 06:50:15.827] +++ working dir: /go/src/k8s.io/kubernetes
I0911 06:50:15.830] +++ command: run_resource_aliasing_tests
I0911 06:50:15.840] +++ [0911 06:50:15] Creating namespace namespace-1568184615-20871
I0911 06:50:15.915] namespace/namespace-1568184615-20871 created
I0911 06:50:15.988] Context "test" modified.
I0911 06:50:15.994] +++ [0911 06:50:15] Testing resource aliasing
W0911 06:50:16.095] E0911 06:50:13.057464   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:50:16.095] E0911 06:50:13.160762   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:50:16.095]   % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
W0911 06:50:16.096]                                  Dload  Upload   Total   Spent    Left  Speed
W0911 06:50:16.096] 
100   826  100   524  100   302   127k  75500 --:--:-- --:--:-- --:--:--  201k
W0911 06:50:16.096]   % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
W0911 06:50:16.096]                                  Dload  Upload   Total   Spent    Left  Speed
W0911 06:50:16.096] 
100   818  100   520  100   298   101k  59600 --:--:-- --:--:-- --:--:--  159k
W0911 06:50:16.096] E0911 06:50:14.058762   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:50:16.097] E0911 06:50:14.161977   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:50:16.097] 	reconciliation required create
W0911 06:50:16.097] 	missing rules added:
W0911 06:50:16.097] 		{Verbs:[create delete deletecollection get list patch update watch] APIGroups:[] Resources:[pods] ResourceNames:[] NonResourceURLs:[]}
W0911 06:50:16.097] 	reconciliation required create
W0911 06:50:16.097] 	missing subjects added:
W0911 06:50:16.097] 		{Kind:Group APIGroup:rbac.authorization.k8s.io Name:system:masters Namespace:}
W0911 06:50:16.097] 	reconciliation required create
W0911 06:50:16.097] 	missing subjects added:
W0911 06:50:16.098] 		{Kind:Group APIGroup:rbac.authorization.k8s.io Name:system:masters Namespace:}
W0911 06:50:16.098] 	reconciliation required create
W0911 06:50:16.098] 	missing rules added:
W0911 06:50:16.098] 		{Verbs:[get list watch] APIGroups:[] Resources:[configmaps] ResourceNames:[] NonResourceURLs:[]}
W0911 06:50:16.098] E0911 06:50:15.059908   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:50:16.098] E0911 06:50:15.163513   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:50:16.098] warning: deleting cluster-scoped resources, not scoped to the provided namespace
W0911 06:50:16.099] E0911 06:50:16.061032   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:50:16.136] I0911 06:50:16.135627   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1568184615-20871", Name:"cassandra", UID:"0df01b8e-e089-4fdc-bc73-093e9d133095", APIVersion:"v1", ResourceVersion:"2889", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: cassandra-sjlnj
W0911 06:50:16.139] I0911 06:50:16.138707   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1568184615-20871", Name:"cassandra", UID:"0df01b8e-e089-4fdc-bc73-093e9d133095", APIVersion:"v1", ResourceVersion:"2889", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: cassandra-wdmzf
W0911 06:50:16.165] E0911 06:50:16.164682   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0911 06:50:16.265] replicationcontroller/cassandra created
I0911 06:50:16.266] service/cassandra created
I0911 06:50:16.374] discovery.sh:89: Successful get all -l'app=cassandra' {{range.items}}{{range .metadata.labels}}{{.}}:{{end}}{{end}}: cassandra:cassandra:cassandra:cassandra:
I0911 06:50:16.457] (Bpod "cassandra-sjlnj" deleted
I0911 06:50:16.462] pod "cassandra-wdmzf" deleted
I0911 06:50:16.473] replicationcontroller "cassandra" deleted
... skipping 79 lines ...
I0911 06:50:16.998] 
I0911 06:50:16.998] FIELD:    message <string>
I0911 06:50:16.998] 
I0911 06:50:16.998] DESCRIPTION:
I0911 06:50:16.998]      A human readable message indicating details about why the pod is in this
I0911 06:50:16.998]      condition.
W0911 06:50:17.099] E0911 06:50:17.062442   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:50:17.166] E0911 06:50:17.165960   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0911 06:50:17.267] KIND:     CronJob
I0911 06:50:17.268] VERSION:  batch/v1beta1
I0911 06:50:17.268] 
I0911 06:50:17.268] DESCRIPTION:
I0911 06:50:17.268]      CronJob represents the configuration of a single cron job.
I0911 06:50:17.268] 
... skipping 81 lines ...
I0911 06:50:18.220] (Bpod "valid-pod" force deleted
I0911 06:50:18.309] get.sh:287: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0911 06:50:18.417] (Bget.sh:292: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0911 06:50:18.561] (Bpod/sorted-pod1 created
W0911 06:50:18.662] No resources found in namespace-1568184615-20871 namespace.
W0911 06:50:18.663] No resources found in namespace-1568184615-20871 namespace.
W0911 06:50:18.665] E0911 06:50:18.063515   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:50:18.668] E0911 06:50:18.167098   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:50:18.668] warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
I0911 06:50:18.769] get.sh:296: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: sorted-pod1:
I0911 06:50:18.819] (Bpod/sorted-pod2 created
I0911 06:50:18.899] get.sh:300: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: sorted-pod1:sorted-pod2:
I0911 06:50:19.046] (Bpod/sorted-pod3 created
W0911 06:50:19.147] E0911 06:50:19.064732   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:50:19.169] E0911 06:50:19.168697   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0911 06:50:19.270] get.sh:304: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: sorted-pod1:sorted-pod2:sorted-pod3:
I0911 06:50:19.278] (BSuccessful
I0911 06:50:19.279] message:sorted-pod1:sorted-pod2:sorted-pod3:
I0911 06:50:19.279] has:sorted-pod1:sorted-pod2:sorted-pod3:
I0911 06:50:19.355] Successful
I0911 06:50:19.355] message:sorted-pod3:sorted-pod2:sorted-pod1:
... skipping 170 lines ...
I0911 06:50:20.830] namespace-1568184610-14130   default   0         10s
I0911 06:50:20.830] namespace-1568184615-20871   default   0         5s
I0911 06:50:20.830] some-other-random            default   0         6s
I0911 06:50:20.830] has:all-ns-test-2
I0911 06:50:20.895] namespace "all-ns-test-1" deleted
W0911 06:50:20.996] warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
W0911 06:50:20.996] E0911 06:50:20.066144   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:50:20.996] E0911 06:50:20.169885   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:50:21.068] E0911 06:50:21.067512   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:50:21.171] E0911 06:50:21.171134   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:50:22.069] E0911 06:50:22.069077   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:50:22.172] E0911 06:50:22.172344   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:50:23.071] E0911 06:50:23.070388   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:50:23.174] E0911 06:50:23.173798   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:50:24.072] E0911 06:50:24.071726   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:50:24.175] E0911 06:50:24.174834   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:50:25.073] E0911 06:50:25.073153   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:50:25.176] E0911 06:50:25.176212   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0911 06:50:26.066] namespace "all-ns-test-2" deleted
W0911 06:50:26.166] E0911 06:50:26.074184   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:50:26.178] E0911 06:50:26.177474   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:50:27.076] E0911 06:50:27.075524   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:50:27.179] E0911 06:50:27.178840   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:50:28.077] E0911 06:50:28.076874   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:50:28.180] E0911 06:50:28.180123   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:50:29.078] E0911 06:50:29.078088   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:50:29.182] E0911 06:50:29.181438   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:50:30.079] E0911 06:50:30.079376   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:50:30.187] E0911 06:50:30.186779   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:50:30.987] I0911 06:50:30.987171   53012 namespace_controller.go:171] Namespace has been deleted all-ns-test-1
W0911 06:50:31.080] E0911 06:50:31.080095   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:50:31.189] E0911 06:50:31.188257   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0911 06:50:31.290] get.sh:380: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
I0911 06:50:31.325] (Bpod "valid-pod" force deleted
I0911 06:50:31.411] get.sh:384: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0911 06:50:31.498] (Bget.sh:388: Successful get nodes {{range.items}}{{.metadata.name}}:{{end}}: 127.0.0.1:
I0911 06:50:31.576] (BSuccessful
I0911 06:50:31.577] message:NAME        STATUS     ROLES    AGE     VERSION
... skipping 77 lines ...
I0911 06:50:32.397] message:valid-pod:
I0911 06:50:32.397] has:valid-pod:
I0911 06:50:32.478] Successful
I0911 06:50:32.479] message:valid-pod:
I0911 06:50:32.479] has:valid-pod:
W0911 06:50:32.579] warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
W0911 06:50:32.579] E0911 06:50:32.081319   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:50:32.580] E0911 06:50:32.189505   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0911 06:50:32.680] Successful
I0911 06:50:32.680] message:valid-pod:
I0911 06:50:32.680] has:valid-pod:
I0911 06:50:32.810] Successful
I0911 06:50:32.810] message:valid-pod:
I0911 06:50:32.810] has:valid-pod:
... skipping 12 lines ...
I0911 06:50:33.191] Successful
I0911 06:50:33.192] message:127.0.0.1:
I0911 06:50:33.192] has:127.0.0.1:
I0911 06:50:33.263] node/127.0.0.1 untainted
W0911 06:50:33.364] kubectl convert is DEPRECATED and will be removed in a future version.
W0911 06:50:33.364] In order to convert, kubectl apply the object to the cluster, then kubectl get at the desired version.
W0911 06:50:33.365] E0911 06:50:33.082677   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:50:33.365] kubectl run --generator=job/v1 is DEPRECATED and will be removed in a future version. Use kubectl run --generator=run-pod/v1 or kubectl create instead.
W0911 06:50:33.365] E0911 06:50:33.190537   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:50:33.408] I0911 06:50:33.407505   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1568184631-100", Name:"cassandra", UID:"2b2585d9-d980-4a2e-8240-2fdcde9e7399", APIVersion:"v1", ResourceVersion:"2961", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: cassandra-lqjr5
W0911 06:50:33.410] I0911 06:50:33.409558   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1568184631-100", Name:"cassandra", UID:"2b2585d9-d980-4a2e-8240-2fdcde9e7399", APIVersion:"v1", ResourceVersion:"2961", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: cassandra-vptw5
I0911 06:50:33.510] replicationcontroller/cassandra created
I0911 06:50:33.511] Successful
I0911 06:50:33.511] message:cassandra:
I0911 06:50:33.512] has:cassandra:
... skipping 144 lines ...
I0911 06:50:36.802] Running command: run_certificates_tests
I0911 06:50:36.823] 
I0911 06:50:36.825] +++ Running case: test-cmd.run_certificates_tests 
I0911 06:50:36.828] +++ working dir: /go/src/k8s.io/kubernetes
I0911 06:50:36.830] +++ command: run_certificates_tests
I0911 06:50:36.838] +++ [0911 06:50:36] Testing certificates
W0911 06:50:36.939] E0911 06:50:34.083979   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:50:36.940] E0911 06:50:34.192033   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:50:36.940] E0911 06:50:35.085563   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:50:36.940] E0911 06:50:35.193446   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:50:36.940] E0911 06:50:36.087070   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:50:36.941] I0911 06:50:36.144922   53012 namespace_controller.go:171] Namespace has been deleted all-ns-test-2
W0911 06:50:36.941] E0911 06:50:36.194948   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:50:36.941] I0911 06:50:36.394475   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1568184631-100", Name:"cassandra", UID:"2b2585d9-d980-4a2e-8240-2fdcde9e7399", APIVersion:"v1", ResourceVersion:"2967", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: cassandra-ndd79
W0911 06:50:36.941] I0911 06:50:36.398957   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1568184631-100", Name:"cassandra", UID:"2b2585d9-d980-4a2e-8240-2fdcde9e7399", APIVersion:"v1", ResourceVersion:"2967", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: cassandra-lptwz
W0911 06:50:36.942] I0911 06:50:36.406029   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184631-100", Name:"deploy-74bcc58696", UID:"d818861b-3b88-44bc-aaeb-630356fc4285", APIVersion:"apps/v1", ResourceVersion:"2978", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: deploy-74bcc58696-h9ngl
I0911 06:50:37.042] certificatesigningrequest.certificates.k8s.io/foo created
I0911 06:50:37.048] certificate.sh:29: Successful get csr/foo {{range.status.conditions}}{{.type}}{{end}}: 
I0911 06:50:37.127] (Bcertificatesigningrequest.certificates.k8s.io/foo approved
... skipping 39 lines ...
I0911 06:50:37.209]         "resourceVersion": "",
I0911 06:50:37.209]         "selfLink": ""
I0911 06:50:37.209]     }
I0911 06:50:37.209] }
I0911 06:50:37.286] certificate.sh:32: Successful get csr/foo {{range.status.conditions}}{{.type}}{{end}}: Approved
I0911 06:50:37.353] (Bcertificatesigningrequest.certificates.k8s.io "foo" deleted
W0911 06:50:37.455] E0911 06:50:37.088704   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:50:37.456] E0911 06:50:37.196261   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0911 06:50:37.557] certificate.sh:34: Successful get csr {{range.items}}{{.metadata.name}}{{end}}: 
I0911 06:50:37.623] (Bcertificatesigningrequest.certificates.k8s.io/foo created
I0911 06:50:37.710] certificate.sh:37: Successful get csr/foo {{range.status.conditions}}{{.type}}{{end}}: 
I0911 06:50:37.787] (Bcertificatesigningrequest.certificates.k8s.io/foo approved
I0911 06:50:37.862] {
I0911 06:50:37.863]     "apiVersion": "v1",
... skipping 146 lines ...
I0911 06:50:39.518] +++ Running case: test-cmd.run_cluster_management_tests 
I0911 06:50:39.520] +++ working dir: /go/src/k8s.io/kubernetes
I0911 06:50:39.523] +++ command: run_cluster_management_tests
I0911 06:50:39.532] +++ [0911 06:50:39] Testing cluster-management commands
I0911 06:50:39.612] node-management.sh:27: Successful get nodes {{range.items}}{{.metadata.name}}:{{end}}: 127.0.0.1:
I0911 06:50:39.738] (Bpod/test-pod-1 created
W0911 06:50:39.839] E0911 06:50:38.094586   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:50:39.840] E0911 06:50:38.197703   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:50:39.840] E0911 06:50:39.097210   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:50:39.840] E0911 06:50:39.199418   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0911 06:50:39.941] pod/test-pod-2 created
I0911 06:50:39.958] node-management.sh:76: Successful get nodes 127.0.0.1 {{range .spec.taints}}{{if eq .key \"dedicated\"}}{{.key}}={{.value}}:{{.effect}}{{end}}{{end}}: 
I0911 06:50:40.028] (Bnode/127.0.0.1 tainted
I0911 06:50:40.112] node-management.sh:79: Successful get nodes 127.0.0.1 {{range .spec.taints}}{{if eq .key \"dedicated\"}}{{.key}}={{.value}}:{{.effect}}{{end}}{{end}}: dedicated=foo:PreferNoSchedule
I0911 06:50:40.183] (Bnode/127.0.0.1 untainted
I0911 06:50:40.267] node-management.sh:83: Successful get nodes 127.0.0.1 {{range .spec.taints}}{{if eq .key \"dedicated\"}}{{.key}}={{.value}}:{{.effect}}{{end}}{{end}}: 
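The `node-management.sh` taint checks above render a go-template that keeps only taints whose key is `dedicated`, roughly `kubectl get nodes 127.0.0.1 -o go-template --template='{{range .spec.taints}}{{if eq .key "dedicated"}}{{.key}}={{.value}}:{{.effect}}{{end}}{{end}}'`. A cluster-free shell sketch of that filter — the taint list below is illustrative, not taken from this run:

```shell
# Cluster-free emulation of the {{if eq .key "dedicated"}} taint filter
# used by the node-management.sh checks. Input is a space-separated list
# of key=value:effect strings (illustrative data).
filter_dedicated() {
  for taint in $1; do
    case ${taint} in
      dedicated=*) printf '%s' "${taint}" ;;
    esac
  done
}

filter_dedicated "dedicated=foo:PreferNoSchedule other=bar:NoSchedule"
```

After `kubectl taint nodes 127.0.0.1 dedicated=foo:PreferNoSchedule` the template renders the taint; after untainting (`dedicated-`) it renders the empty string, matching the empty expected values in the log.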
... skipping 18 lines ...
I0911 06:50:41.603] message:node/127.0.0.1 already uncordoned (dry run)
I0911 06:50:41.604] has:already uncordoned
I0911 06:50:41.683] node-management.sh:119: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
I0911 06:50:41.751] (Bnode/127.0.0.1 labeled
I0911 06:50:41.836] node-management.sh:124: Successful get nodes 127.0.0.1 {{.metadata.labels.test}}: label
I0911 06:50:41.906] (BSuccessful
I0911 06:50:41.906] message:error: cannot specify both a node name and a --selector option
I0911 06:50:41.906] See 'kubectl drain -h' for help and examples
I0911 06:50:41.906] has:cannot specify both a node name
I0911 06:50:41.966] Successful
I0911 06:50:41.967] message:error: USAGE: cordon NODE [flags]
I0911 06:50:41.967] See 'kubectl cordon -h' for help and examples
I0911 06:50:41.967] has:error\: USAGE\: cordon NODE
I0911 06:50:42.032] node/127.0.0.1 already uncordoned
I0911 06:50:42.109] Successful
I0911 06:50:42.110] message:error: You must provide one or more resources by argument or filename.
I0911 06:50:42.110] Example resource specifications include:
I0911 06:50:42.110]    '-f rsrc.yaml'
I0911 06:50:42.110]    '--filename=rsrc.json'
I0911 06:50:42.110]    '<resource> <name>'
I0911 06:50:42.110]    '<resource>'
I0911 06:50:42.110] has:must provide one or more resources
... skipping 15 lines ...
I0911 06:50:42.529] Successful
I0911 06:50:42.530] message:The following compatible plugins are available:
I0911 06:50:42.531] 
I0911 06:50:42.531] test/fixtures/pkg/kubectl/plugins/version/kubectl-version
I0911 06:50:42.531]   - warning: kubectl-version overwrites existing command: "kubectl version"
I0911 06:50:42.531] 
I0911 06:50:42.532] error: one plugin warning was found
I0911 06:50:42.532] has:kubectl-version overwrites existing command: "kubectl version"
I0911 06:50:42.608] Successful
I0911 06:50:42.608] message:The following compatible plugins are available:
I0911 06:50:42.608] 
I0911 06:50:42.608] test/fixtures/pkg/kubectl/plugins/kubectl-foo
I0911 06:50:42.609] test/fixtures/pkg/kubectl/plugins/foo/kubectl-foo
I0911 06:50:42.609]   - warning: test/fixtures/pkg/kubectl/plugins/foo/kubectl-foo is overshadowed by a similarly named plugin: test/fixtures/pkg/kubectl/plugins/kubectl-foo
I0911 06:50:42.609] 
I0911 06:50:42.609] error: one plugin warning was found
I0911 06:50:42.609] has:test/fixtures/pkg/kubectl/plugins/foo/kubectl-foo is overshadowed by a similarly named plugin
I0911 06:50:42.674] Successful
I0911 06:50:42.674] message:The following compatible plugins are available:
I0911 06:50:42.674] 
I0911 06:50:42.674] test/fixtures/pkg/kubectl/plugins/kubectl-foo
I0911 06:50:42.674] has:plugins are available
I0911 06:50:42.749] Successful
I0911 06:50:42.750] message:Unable read directory "test/fixtures/pkg/kubectl/plugins/empty" from your PATH: open test/fixtures/pkg/kubectl/plugins/empty: no such file or directory. Skipping...
I0911 06:50:42.750] error: unable to find any kubectl plugins in your PATH
I0911 06:50:42.750] has:unable to find any kubectl plugins in your PATH
I0911 06:50:42.812] Successful
I0911 06:50:42.813] message:I am plugin foo
I0911 06:50:42.813] has:plugin foo
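The plugin cases above exercise kubectl's discovery rule: any executable on `PATH` named `kubectl-<name>` is surfaced as the subcommand `kubectl <name>`, and duplicates later on `PATH` are reported as overshadowed. A minimal sketch of a plugin like the `kubectl-foo` fixture (directory and body are illustrative):

```shell
# Minimal kubectl plugin: an executable named kubectl-<name> on PATH.
# The directory and script body are illustrative stand-ins for the
# test/fixtures/pkg/kubectl/plugins/kubectl-foo fixture.
plugindir=$(mktemp -d)
printf '#!/bin/sh\necho "I am plugin foo"\n' > "${plugindir}/kubectl-foo"
chmod +x "${plugindir}/kubectl-foo"
PATH="${plugindir}:${PATH}"
kubectl-foo
```

With a real kubectl on `PATH`, the same binary would be invocable as `kubectl foo`, which is what produces the `I am plugin foo` message in the log.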
I0911 06:50:42.887] Successful
I0911 06:50:42.887] message:I am plugin bar called with args test/fixtures/pkg/kubectl/plugins/bar/kubectl-bar arg1
... skipping 12 lines ...
I0911 06:50:43.044] 
I0911 06:50:43.047] +++ Running case: test-cmd.run_impersonation_tests 
I0911 06:50:43.049] +++ working dir: /go/src/k8s.io/kubernetes
I0911 06:50:43.052] +++ command: run_impersonation_tests
I0911 06:50:43.060] +++ [0911 06:50:43] Testing impersonation
I0911 06:50:43.130] Successful
I0911 06:50:43.130] message:error: requesting groups or user-extra for  without impersonating a user
I0911 06:50:43.130] has:without impersonating a user
W0911 06:50:43.231] E0911 06:50:40.099043   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:50:43.231] E0911 06:50:40.200548   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:50:43.231] E0911 06:50:41.099825   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:50:43.232] E0911 06:50:41.201717   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:50:43.232] E0911 06:50:42.101038   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:50:43.232] E0911 06:50:42.203771   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:50:43.232] E0911 06:50:43.102221   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:50:43.232] E0911 06:50:43.204837   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0911 06:50:43.337] certificatesigningrequest.certificates.k8s.io/foo created
I0911 06:50:43.425] authorization.sh:68: Successful get csr/foo {{.spec.username}}: user1
I0911 06:50:43.505] (Bauthorization.sh:69: Successful get csr/foo {{range .spec.groups}}{{.}}{{end}}: system:authenticated
I0911 06:50:43.579] (Bcertificatesigningrequest.certificates.k8s.io "foo" deleted
I0911 06:50:43.736] certificatesigningrequest.certificates.k8s.io/foo created
I0911 06:50:43.821] authorization.sh:74: Successful get csr/foo {{len .spec.groups}}: 3
... skipping 8 lines ...
I0911 06:50:44.051] +++ command: run_wait_tests
I0911 06:50:44.060] +++ [0911 06:50:44] Testing kubectl wait
I0911 06:50:44.064] +++ [0911 06:50:44] Creating namespace namespace-1568184644-1182
I0911 06:50:44.133] namespace/namespace-1568184644-1182 created
I0911 06:50:44.197] Context "test" modified.
I0911 06:50:44.274] deployment.apps/test-1 created
W0911 06:50:44.375] E0911 06:50:44.103549   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:50:44.376] E0911 06:50:44.206065   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:50:44.376] I0911 06:50:44.281844   53012 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1568184644-1182", Name:"test-1", UID:"ae07cc62-b9af-47c8-95b3-50ffce569fc8", APIVersion:"apps/v1", ResourceVersion:"3057", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set test-1-6d98955cc9 to 1
W0911 06:50:44.377] I0911 06:50:44.297136   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184644-1182", Name:"test-1-6d98955cc9", UID:"81a5a5a3-87e0-4ec5-a1c9-f522b7fe9789", APIVersion:"apps/v1", ResourceVersion:"3058", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test-1-6d98955cc9-4x72f
W0911 06:50:44.391] I0911 06:50:44.390252   53012 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1568184644-1182", Name:"test-2", UID:"993a4a29-857d-4c28-982c-b9bc0002a251", APIVersion:"apps/v1", ResourceVersion:"3067", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set test-2-65897ff84d to 1
W0911 06:50:44.394] I0911 06:50:44.393557   53012 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1568184644-1182", Name:"test-2-65897ff84d", UID:"f0ff5c3d-aa4f-4ddb-a5e2-52e5d814a173", APIVersion:"apps/v1", ResourceVersion:"3068", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test-2-65897ff84d-j2hpk
I0911 06:50:44.495] deployment.apps/test-2 created
I0911 06:50:44.495] wait.sh:36: Successful get deployments {{range .items}}{{.metadata.name}},{{end}}: test-1,test-2,
... skipping 5 lines ...
I0911 06:50:46.581] has:test-1 condition met
I0911 06:50:46.583] Successful
I0911 06:50:46.583] message:deployment.apps/test-1 condition met
I0911 06:50:46.583] deployment.apps/test-2 condition met
I0911 06:50:46.583] has:test-2 condition met
I0911 06:50:46.595] +++ exit code: 0
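The `kubectl wait` cases above (`kubectl wait --for=condition=available deployment/test-1` and friends, per `wait.sh`) are, at heart, a poll-until-true loop with a deadline. A generic shell sketch of that shape — function name, retry count, and interval are illustrative, not kubectl's actual implementation (which watches the API server rather than polling a command):

```shell
# Generic poll-until-true loop in the spirit of `kubectl wait`:
# retry a predicate command up to N times, short sleep between attempts.
wait_for() {
  tries=$1; shift
  while [ "${tries}" -gt 0 ]; do
    if "$@"; then
      return 0
    fi
    tries=$((tries - 1))
    sleep 0.1
  done
  return 1
}

wait_for 3 test -d /tmp && echo "condition met"
```

The two `condition met` messages in the log are this loop succeeding once per deployment before the overall timeout.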
W0911 06:50:46.696] E0911 06:50:45.104835   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:50:46.696] E0911 06:50:45.207479   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:50:46.697] E0911 06:50:46.106966   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:50:46.697] E0911 06:50:46.208881   53012 reflector.go:123] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0911 06:50:46.701] warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
W0911 06:50:46.781] warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
W0911 06:50:46.797] I0911 06:50:46.797102   49461 secure_serving.go:167] Stopped listening on 127.0.0.1:8080
W0911 06:50:46.798] I0911 06:50:46.797203   49461 controller.go:182] Shutting down kubernetes service endpoint reconciler
W0911 06:50:46.800] W0911 06:50:46.797435   49461 reflector.go:299] k8s.io/client-go/informers/factory.go:134: watch of *v1.Namespace ended with: very short watch: k8s.io/client-go/informers/factory.go:134: Unexpected watch close - watch lasted less than a second and no items received
W0911 06:50:46.800] I0911 06:50:46.797589   49461 controller.go:122] Shutting down OpenAPI controller
... skipping 13 lines ...
W0911 06:50:46.806] I0911 06:50:46.798027   49461 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
... skipping 5 lines ...
W0911 06:50:46.808] W0911 06:50:46.798878   49461 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0911 06:50:46.809] I0911 06:50:46.798912   49461 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
W0911 06:50:46.809] I0911 06:50:46.799017   49461 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
W0911 06:50:46.809] W0911 06:50:46.799063   49461 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0911 06:50:46.810] I0911 06:50:46.799106   49461 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
W0911 06:50:46.810] I0911 06:50:46.799137   49461 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
W0911 06:50:46.810] I0911 06:50:46.799177   49461 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
W0911 06:50:46.811] I0911 06:50:46.799201   49461 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
W0911 06:50:46.811] W0911 06:50:46.799246   49461 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0911 06:50:46.811] I0911 06:50:46.799271   49461 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
W0911 06:50:46.812] I0911 06:50:46.799282   49461 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
W0911 06:50:46.812] I0911 06:50:46.799387   49461 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
W0911 06:50:46.812] I0911 06:50:46.799406   49461 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
W0911 06:50:46.813] I0911 06:50:46.799426   49461 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
W0911 06:50:46.813] I0911 06:50:46.799501   49461 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
W0911 06:50:46.813] I0911 06:50:46.799507   49461 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
W0911 06:50:46.814] W0911 06:50:46.799553   49461 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0911 06:50:46.814] W0911 06:50:46.799632   49461 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0911 06:50:46.814] I0911 06:50:46.799762   49461 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
W0911 06:50:46.815] I0911 06:50:46.799881   49461 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
W0911 06:50:46.815] I0911 06:50:46.799894   49461 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
W0911 06:50:46.815] W0911 06:50:46.799938   49461 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0911 06:50:46.816] I0911 06:50:46.799986   49461 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
W0911 06:50:46.816] I0911 06:50:46.800062   49461 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
W0911 06:50:46.817] W0911 06:50:46.799557   49461 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0911 06:50:46.817] W0911 06:50:46.800107   49461 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0911 06:50:46.818] W0911 06:50:46.800356   49461 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0911 06:50:46.818] I0911 06:50:46.800358   49461 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
W0911 06:50:46.819] I0911 06:50:46.800430   49461 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
W0911 06:50:46.819] I0911 06:50:46.800475   49461 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
W0911 06:50:46.819] I0911 06:50:46.800502   49461 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
W0911 06:50:46.820] I0911 06:50:46.800585   49461 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
W0911 06:50:46.820] I0911 06:50:46.800593   49461 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
... skipping 7 lines ...
W0911 06:50:46.823] I0911 06:50:46.801275   49461 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
W0911 06:50:46.823] I0911 06:50:46.801435   49461 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
W0911 06:50:46.824] I0911 06:50:46.801480   49461 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
W0911 06:50:46.824] I0911 06:50:46.801521   49461 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
W0911 06:50:46.825] I0911 06:50:46.801650   49461 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
W0911 06:50:46.825] I0911 06:50:46.801681   49461 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
W0911 06:50:46.825] W0911 06:50:46.801707   49461 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0911 06:50:46.826] W0911 06:50:46.801771   49461 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0911 06:50:46.826] W0911 06:50:46.801781   49461 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0911 06:50:46.827] W0911 06:50:46.801807   49461 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0911 06:50:46.827] W0911 06:50:46.801828   49461 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0911 06:50:46.828] W0911 06:50:46.801834   49461 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0911 06:50:46.828] W0911 06:50:46.801859   49461 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0911 06:50:46.828] W0911 06:50:46.801885   49461 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0911 06:50:46.829] W0911 06:50:46.801891   49461 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0911 06:50:46.829] W0911 06:50:46.801909   49461 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0911 06:50:46.830] W0911 06:50:46.801938   49461 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0911 06:50:46.830] W0911 06:50:46.801948   49461 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0911 06:50:46.830] W0911 06:50:46.801956   49461 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0911 06:50:46.831] W0911 06:50:46.801987   49461 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0911 06:50:46.832] W0911 06:50:46.802000   49461 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0911 06:50:46.832] W0911 06:50:46.802016   49461 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0911 06:50:46.833] W0911 06:50:46.802022   49461 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
... skipping 13 lines ...
W0911 06:50:46.839] E0911 06:50:46.802240   49461 controller.go:185] rpc error: code = Unavailable desc = transport is closing
... skipping 2 lines ...
W0911 06:50:46.853] + make test-integration
I0911 06:50:46.953] No resources found
I0911 06:50:46.953] No resources found
I0911 06:50:46.954] +++ [0911 06:50:46] TESTS PASSED
I0911 06:50:46.954] junit report dir: /workspace/artifacts
I0911 06:50:46.954] +++ [0911 06:50:46] Clean up complete
... skipping 323 lines ...
I0911 07:02:55.604]     synthetic_master_test.go:755: UPDATE_NODE_APISERVER is not set
I0911 07:02:55.604] 
I0911 07:02:55.604] === SKIP: test/integration/scheduler_perf TestSchedule100Node3KPods (0.00s)
I0911 07:02:55.604]     scheduler_test.go:73: Skipping because we want to run short tests
I0911 07:02:55.604] 
I0911 07:02:55.604] 
I0911 07:02:55.605] === Failed
I0911 07:02:55.605] === FAIL: test/integration/examples TestAggregatedAPIServer (11.86s)
I0911 07:02:55.605] I0911 06:55:01.883936  107510 serving.go:312] Generated self-signed cert (/tmp/test-integration-apiserver464142136/apiserver.crt, /tmp/test-integration-apiserver464142136/apiserver.key)
I0911 07:02:55.605] I0911 06:55:01.883979  107510 server.go:623] external host was not specified, using 172.17.0.2
I0911 07:02:55.605] W0911 06:55:02.360080  107510 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
I0911 07:02:55.606] W0911 06:55:02.360133  107510 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
I0911 07:02:55.606] W0911 06:55:02.360146  107510 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
I0911 07:02:55.606] W0911 06:55:02.360655  107510 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
... skipping 200 lines ...
I0911 07:02:55.643]     apiserver_test.go:453: {"kind":"APIGroupList","groups":[{"name":"wardle.k8s.io","versions":[{"groupVersion":"wardle.k8s.io/v1beta1","version":"v1beta1"},{"groupVersion":"wardle.k8s.io/v1alpha1","version":"v1alpha1"}],"preferredVersion":{"groupVersion":"wardle.k8s.io/v1beta1","version":"v1beta1"},"serverAddressByClientCIDRs":[{"clientCIDR":"0.0.0.0/0","serverAddress":":35253"}]}]}
I0911 07:02:55.643]         
I0911 07:02:55.644]     apiserver_test.go:482: {"kind":"APIGroup","apiVersion":"v1","name":"wardle.k8s.io","versions":[{"groupVersion":"wardle.k8s.io/v1beta1","version":"v1beta1"},{"groupVersion":"wardle.k8s.io/v1alpha1","version":"v1alpha1"}],"preferredVersion":{"groupVersion":"wardle.k8s.io/v1beta1","version":"v1beta1"}}
I0911 07:02:55.644]         
I0911 07:02:55.644]     apiserver_test.go:500: {"kind":"APIResourceList","apiVersion":"v1","groupVersion":"wardle.k8s.io/v1alpha1","resources":[{"name":"fischers","singularName":"","namespaced":false,"kind":"Fischer","verbs":["create","delete","deletecollection","get","list","patch","update","watch"],"storageVersionHash":"u0hTAhBTXHw="},{"name":"flunders","singularName":"","namespaced":true,"kind":"Flunder","verbs":["create","delete","deletecollection","get","list","patch","update","watch"],"storageVersionHash":"k36Bkt6yJrQ="}]}
I0911 07:02:55.644]         
I0911 07:02:55.644]     apiserver_test.go:382: Discovery call expected to return failed unavailable service
I0911 07:02:55.645]     apiserver_test.go:374: Discovery call didn't return expected error: <nil>
I0911 07:02:55.645] 
I0911 07:02:55.645] 
I0911 07:02:55.645] DONE 2745 tests, 4 skipped, 1 failure in 6.174s
I0911 07:02:55.645] +++ [0911 07:02:55] Saved JUnit XML test report to /workspace/artifacts/junit_d965d8661547eb73cabe6d94d5550ec333e4c0fa_20190911-065055.xml
I0911 07:02:55.657] +++ [0911 07:02:55] Cleaning up etcd
W0911 07:02:55.758] make[1]: *** [Makefile:185: test] Error 1
W0911 07:02:55.758] !!! [0911 07:02:55] Call tree:
W0911 07:02:55.758] !!! [0911 07:02:55]  1: hack/make-rules/test-integration.sh:89 runTests(...)
I0911 07:02:55.873] +++ [0911 07:02:55] Integration test cleanup complete
W0911 07:02:55.974] make: *** [Makefile:204: test-integration] Error 1
W0911 07:02:57.361] Traceback (most recent call last):
W0911 07:02:57.361]   File "/workspace/./test-infra/jenkins/../scenarios/kubernetes_verify.py", line 178, in <module>
W0911 07:02:57.361]     ARGS.exclude_typecheck, ARGS.exclude_godep)
W0911 07:02:57.361]   File "/workspace/./test-infra/jenkins/../scenarios/kubernetes_verify.py", line 140, in main
W0911 07:02:57.362]     check(*cmd)
W0911 07:02:57.362]   File "/workspace/./test-infra/jenkins/../scenarios/kubernetes_verify.py", line 48, in check
W0911 07:02:57.362]     subprocess.check_call(cmd)
W0911 07:02:57.362]   File "/usr/lib/python2.7/subprocess.py", line 186, in check_call
W0911 07:02:57.365]     raise CalledProcessError(retcode, cmd)
W0911 07:02:57.366] subprocess.CalledProcessError: Command '('docker', 'run', '--rm=true', '--privileged=true', '-v', '/var/run/docker.sock:/var/run/docker.sock', '-v', '/etc/localtime:/etc/localtime:ro', '-v', '/workspace/k8s.io/kubernetes:/go/src/k8s.io/kubernetes', '-v', '/workspace/k8s.io/:/workspace/k8s.io/', '-v', '/workspace/_artifacts:/workspace/artifacts', '-e', 'KUBE_FORCE_VERIFY_CHECKS=y', '-e', 'KUBE_VERIFY_GIT_BRANCH=master', '-e', 'EXCLUDE_TYPECHECK=n', '-e', 'EXCLUDE_GODEP=n', '-e', 'REPO_DIR=/workspace/k8s.io/kubernetes', '--tmpfs', '/tmp:exec,mode=1777', 'gcr.io/k8s-testimages/kubekins-test:1.14-v20190817-cc05229', 'bash', '-c', 'cd kubernetes && ./hack/jenkins/test-dockerized.sh')' returned non-zero exit status 2
E0911 07:02:57.370] Command failed
I0911 07:02:57.371] process 490 exited with code 1 after 25.7m
E0911 07:02:57.371] FAIL: ci-kubernetes-integration-master
I0911 07:02:57.371] Call:  gcloud auth activate-service-account --key-file=/etc/service-account/service-account.json
W0911 07:02:57.870] Activated service account credentials for: [pr-kubekins@kubernetes-jenkins-pull.iam.gserviceaccount.com]
I0911 07:02:57.920] process 112545 exited with code 0 after 0.0m
I0911 07:02:57.921] Call:  gcloud config get-value account
I0911 07:02:58.244] process 112557 exited with code 0 after 0.0m
I0911 07:02:58.244] Will upload results to gs://kubernetes-jenkins/logs using pr-kubekins@kubernetes-jenkins-pull.iam.gserviceaccount.com
I0911 07:02:58.244] Upload result and artifacts...
I0911 07:02:58.244] Gubernator results at https://gubernator.k8s.io/build/kubernetes-jenkins/logs/ci-kubernetes-integration-master/1171673756569964544
I0911 07:02:58.245] Call:  gsutil ls gs://kubernetes-jenkins/logs/ci-kubernetes-integration-master/1171673756569964544/artifacts
W0911 07:02:59.318] CommandException: One or more URLs matched no objects.
E0911 07:02:59.433] Command failed
I0911 07:02:59.433] process 112569 exited with code 1 after 0.0m
W0911 07:02:59.433] Remote dir gs://kubernetes-jenkins/logs/ci-kubernetes-integration-master/1171673756569964544/artifacts not exist yet
I0911 07:02:59.434] Call:  gsutil -m -q -o GSUtil:use_magicfile=True cp -r -c -z log,txt,xml /workspace/_artifacts gs://kubernetes-jenkins/logs/ci-kubernetes-integration-master/1171673756569964544/artifacts
I0911 07:03:05.075] process 112711 exited with code 0 after 0.1m
W0911 07:03:05.076] metadata path /workspace/_artifacts/metadata.json does not exist
W0911 07:03:05.076] metadata not found or invalid, init with empty metadata
... skipping 15 lines ...