PR: wojtek-t: [WIP][DO NOT REVIEW] Deprecate SelfLink field
Result: FAILURE
Tests: 1 failed / 2448 succeeded
Started: 2019-08-14 13:02
Elapsed: 25m6s
Revision:
Builder: gke-prow-ssd-pool-1a225945-n4mt
Refs: master:34791349, 80640:c74c008c
pod: b55cf81c-be93-11e9-8f48-b2e5472b16c0
infra-commit: 6e5b38c23
repo: k8s.io/kubernetes
repo-commit: 0005b941c6c50bd9e5659f788d2538286a8e189d
repos: {u'k8s.io/kubernetes': u'master:34791349d656a9f8e45b7093012e29ad08782ffa,80640:c74c008c6fd55c87769e005ea4a288cc52fe5958'}

Test Failures


k8s.io/kubernetes/test/integration/master TestEmptyList 3.55s

go test -v k8s.io/kubernetes/test/integration/master -run TestEmptyList$
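
This is an apiserver integration test, so it expects an etcd instance reachable at http://127.0.0.1:2379 (the storagebackend.Config ServerList in the log below points there). A minimal local-repro sketch, assuming a kubernetes checkout on GOPATH and an etcd v3 binary on PATH; the etcd flags are standard etcd options, not taken from this job:

# start a throwaway etcd for the integration test (assumption: etcd v3 on PATH)
etcd --listen-client-urls http://127.0.0.1:2379 --advertise-client-urls http://127.0.0.1:2379 &

# re-run only the failing test, mirroring the command above
go test -v k8s.io/kubernetes/test/integration/master -run TestEmptyList$

The verbatim output of the failed run follows.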
=== RUN   TestEmptyList
I0814 13:24:32.152219  108836 services.go:33] Network range for service cluster IPs is unspecified. Defaulting to {10.0.0.0 ffffff00}.
I0814 13:24:32.152259  108836 services.go:45] Setting service IP to "10.0.0.1" (read-write).
I0814 13:24:32.152270  108836 master.go:278] Node port range unspecified. Defaulting to 30000-32767.
I0814 13:24:32.152280  108836 master.go:234] Using reconciler: 
I0814 13:24:32.153798  108836 storage_factory.go:285] storing podtemplates in v1, reading as __internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.153891  108836 client.go:354] parsed scheme: ""
I0814 13:24:32.153904  108836 client.go:354] scheme "" not registered, fallback to default scheme
I0814 13:24:32.153931  108836 asm_amd64.s:1337] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I0814 13:24:32.154030  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.154446  108836 balancer_conn_wrappers.go:131] clientv3/balancer: pin "127.0.0.1:2379"
I0814 13:24:32.154542  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.154591  108836 store.go:1342] Monitoring podtemplates count at <storage-prefix>//podtemplates
I0814 13:24:32.154626  108836 storage_factory.go:285] storing events in v1, reading as __internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.154829  108836 client.go:354] parsed scheme: ""
I0814 13:24:32.154881  108836 client.go:354] scheme "" not registered, fallback to default scheme
I0814 13:24:32.154921  108836 asm_amd64.s:1337] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I0814 13:24:32.154975  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.155151  108836 reflector.go:160] Listing and watching *core.PodTemplate from storage/cacher.go:/podtemplates
I0814 13:24:32.155249  108836 balancer_conn_wrappers.go:131] clientv3/balancer: pin "127.0.0.1:2379"
I0814 13:24:32.155326  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.155430  108836 store.go:1342] Monitoring events count at <storage-prefix>//events
I0814 13:24:32.155468  108836 storage_factory.go:285] storing limitranges in v1, reading as __internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.155515  108836 reflector.go:160] Listing and watching *core.Event from storage/cacher.go:/events
I0814 13:24:32.155538  108836 client.go:354] parsed scheme: ""
I0814 13:24:32.155551  108836 client.go:354] scheme "" not registered, fallback to default scheme
I0814 13:24:32.155580  108836 asm_amd64.s:1337] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I0814 13:24:32.155705  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.156023  108836 balancer_conn_wrappers.go:131] clientv3/balancer: pin "127.0.0.1:2379"
I0814 13:24:32.156113  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.156139  108836 store.go:1342] Monitoring limitranges count at <storage-prefix>//limitranges
I0814 13:24:32.156166  108836 storage_factory.go:285] storing resourcequotas in v1, reading as __internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.156206  108836 reflector.go:160] Listing and watching *core.LimitRange from storage/cacher.go:/limitranges
I0814 13:24:32.156251  108836 client.go:354] parsed scheme: ""
I0814 13:24:32.156267  108836 client.go:354] scheme "" not registered, fallback to default scheme
I0814 13:24:32.156295  108836 asm_amd64.s:1337] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I0814 13:24:32.156340  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.156681  108836 watch_cache.go:405] Replace watchCache (rev: 36375) 
I0814 13:24:32.156809  108836 watch_cache.go:405] Replace watchCache (rev: 36375) 
I0814 13:24:32.157241  108836 watch_cache.go:405] Replace watchCache (rev: 36375) 
I0814 13:24:32.157505  108836 balancer_conn_wrappers.go:131] clientv3/balancer: pin "127.0.0.1:2379"
I0814 13:24:32.157608  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.157903  108836 store.go:1342] Monitoring resourcequotas count at <storage-prefix>//resourcequotas
I0814 13:24:32.157993  108836 reflector.go:160] Listing and watching *core.ResourceQuota from storage/cacher.go:/resourcequotas
I0814 13:24:32.158306  108836 storage_factory.go:285] storing secrets in v1, reading as __internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.158415  108836 client.go:354] parsed scheme: ""
I0814 13:24:32.158428  108836 client.go:354] scheme "" not registered, fallback to default scheme
I0814 13:24:32.158451  108836 asm_amd64.s:1337] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I0814 13:24:32.158518  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.158763  108836 balancer_conn_wrappers.go:131] clientv3/balancer: pin "127.0.0.1:2379"
I0814 13:24:32.158933  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.158829  108836 store.go:1342] Monitoring secrets count at <storage-prefix>//secrets
I0814 13:24:32.159111  108836 reflector.go:160] Listing and watching *core.Secret from storage/cacher.go:/secrets
I0814 13:24:32.160059  108836 watch_cache.go:405] Replace watchCache (rev: 36375) 
I0814 13:24:32.160136  108836 storage_factory.go:285] storing persistentvolumes in v1, reading as __internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.160217  108836 client.go:354] parsed scheme: ""
I0814 13:24:32.160232  108836 client.go:354] scheme "" not registered, fallback to default scheme
I0814 13:24:32.160260  108836 asm_amd64.s:1337] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I0814 13:24:32.160315  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.160584  108836 balancer_conn_wrappers.go:131] clientv3/balancer: pin "127.0.0.1:2379"
I0814 13:24:32.160609  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.160724  108836 store.go:1342] Monitoring persistentvolumes count at <storage-prefix>//persistentvolumes
I0814 13:24:32.160823  108836 reflector.go:160] Listing and watching *core.PersistentVolume from storage/cacher.go:/persistentvolumes
I0814 13:24:32.160902  108836 storage_factory.go:285] storing persistentvolumeclaims in v1, reading as __internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.160958  108836 client.go:354] parsed scheme: ""
I0814 13:24:32.160974  108836 client.go:354] scheme "" not registered, fallback to default scheme
I0814 13:24:32.161001  108836 asm_amd64.s:1337] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I0814 13:24:32.161032  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.161581  108836 watch_cache.go:405] Replace watchCache (rev: 36375) 
I0814 13:24:32.162509  108836 balancer_conn_wrappers.go:131] clientv3/balancer: pin "127.0.0.1:2379"
I0814 13:24:32.162664  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.162962  108836 store.go:1342] Monitoring persistentvolumeclaims count at <storage-prefix>//persistentvolumeclaims
I0814 13:24:32.163139  108836 watch_cache.go:405] Replace watchCache (rev: 36375) 
I0814 13:24:32.162985  108836 reflector.go:160] Listing and watching *core.PersistentVolumeClaim from storage/cacher.go:/persistentvolumeclaims
I0814 13:24:32.163983  108836 storage_factory.go:285] storing configmaps in v1, reading as __internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.164027  108836 watch_cache.go:405] Replace watchCache (rev: 36375) 
I0814 13:24:32.164212  108836 client.go:354] parsed scheme: ""
I0814 13:24:32.164260  108836 client.go:354] scheme "" not registered, fallback to default scheme
I0814 13:24:32.164287  108836 asm_amd64.s:1337] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I0814 13:24:32.164319  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.164494  108836 balancer_conn_wrappers.go:131] clientv3/balancer: pin "127.0.0.1:2379"
I0814 13:24:32.164570  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.164577  108836 store.go:1342] Monitoring configmaps count at <storage-prefix>//configmaps
I0814 13:24:32.164661  108836 storage_factory.go:285] storing namespaces in v1, reading as __internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.164697  108836 client.go:354] parsed scheme: ""
I0814 13:24:32.164703  108836 client.go:354] scheme "" not registered, fallback to default scheme
I0814 13:24:32.164718  108836 asm_amd64.s:1337] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I0814 13:24:32.164765  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.164806  108836 reflector.go:160] Listing and watching *core.ConfigMap from storage/cacher.go:/configmaps
I0814 13:24:32.164936  108836 balancer_conn_wrappers.go:131] clientv3/balancer: pin "127.0.0.1:2379"
I0814 13:24:32.164992  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.165052  108836 store.go:1342] Monitoring namespaces count at <storage-prefix>//namespaces
I0814 13:24:32.165165  108836 storage_factory.go:285] storing endpoints in v1, reading as __internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.165212  108836 client.go:354] parsed scheme: ""
I0814 13:24:32.165218  108836 client.go:354] scheme "" not registered, fallback to default scheme
I0814 13:24:32.165241  108836 asm_amd64.s:1337] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I0814 13:24:32.165261  108836 reflector.go:160] Listing and watching *core.Namespace from storage/cacher.go:/namespaces
I0814 13:24:32.165384  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.165531  108836 balancer_conn_wrappers.go:131] clientv3/balancer: pin "127.0.0.1:2379"
I0814 13:24:32.165608  108836 watch_cache.go:405] Replace watchCache (rev: 36375) 
I0814 13:24:32.165663  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.165788  108836 store.go:1342] Monitoring endpoints count at <storage-prefix>//services/endpoints
I0814 13:24:32.165885  108836 storage_factory.go:285] storing nodes in v1, reading as __internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.165929  108836 client.go:354] parsed scheme: ""
I0814 13:24:32.165935  108836 client.go:354] scheme "" not registered, fallback to default scheme
I0814 13:24:32.165953  108836 asm_amd64.s:1337] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I0814 13:24:32.165984  108836 reflector.go:160] Listing and watching *core.Endpoints from storage/cacher.go:/services/endpoints
I0814 13:24:32.166163  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.166395  108836 balancer_conn_wrappers.go:131] clientv3/balancer: pin "127.0.0.1:2379"
I0814 13:24:32.166513  108836 store.go:1342] Monitoring nodes count at <storage-prefix>//minions
I0814 13:24:32.166627  108836 storage_factory.go:285] storing pods in v1, reading as __internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.166672  108836 client.go:354] parsed scheme: ""
I0814 13:24:32.166678  108836 client.go:354] scheme "" not registered, fallback to default scheme
I0814 13:24:32.166679  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.166698  108836 asm_amd64.s:1337] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I0814 13:24:32.166724  108836 reflector.go:160] Listing and watching *core.Node from storage/cacher.go:/minions
I0814 13:24:32.166801  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.167022  108836 balancer_conn_wrappers.go:131] clientv3/balancer: pin "127.0.0.1:2379"
I0814 13:24:32.167179  108836 store.go:1342] Monitoring pods count at <storage-prefix>//pods
I0814 13:24:32.167318  108836 storage_factory.go:285] storing serviceaccounts in v1, reading as __internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.167385  108836 client.go:354] parsed scheme: ""
I0814 13:24:32.167395  108836 client.go:354] scheme "" not registered, fallback to default scheme
I0814 13:24:32.167422  108836 asm_amd64.s:1337] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I0814 13:24:32.167478  108836 reflector.go:160] Listing and watching *core.Pod from storage/cacher.go:/pods
I0814 13:24:32.167551  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.167575  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.167824  108836 balancer_conn_wrappers.go:131] clientv3/balancer: pin "127.0.0.1:2379"
I0814 13:24:32.167938  108836 store.go:1342] Monitoring serviceaccounts count at <storage-prefix>//serviceaccounts
I0814 13:24:32.168040  108836 storage_factory.go:285] storing services in v1, reading as __internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.168091  108836 client.go:354] parsed scheme: ""
I0814 13:24:32.168100  108836 client.go:354] scheme "" not registered, fallback to default scheme
I0814 13:24:32.168126  108836 asm_amd64.s:1337] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I0814 13:24:32.168165  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.168197  108836 reflector.go:160] Listing and watching *core.ServiceAccount from storage/cacher.go:/serviceaccounts
I0814 13:24:32.168374  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.168551  108836 watch_cache.go:405] Replace watchCache (rev: 36375) 
I0814 13:24:32.168579  108836 watch_cache.go:405] Replace watchCache (rev: 36375) 
I0814 13:24:32.168587  108836 balancer_conn_wrappers.go:131] clientv3/balancer: pin "127.0.0.1:2379"
I0814 13:24:32.168651  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.168758  108836 store.go:1342] Monitoring services count at <storage-prefix>//services/specs
I0814 13:24:32.168780  108836 storage_factory.go:285] storing services in v1, reading as __internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.168886  108836 client.go:354] parsed scheme: ""
I0814 13:24:32.168895  108836 client.go:354] scheme "" not registered, fallback to default scheme
I0814 13:24:32.168919  108836 asm_amd64.s:1337] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I0814 13:24:32.168970  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.169209  108836 balancer_conn_wrappers.go:131] clientv3/balancer: pin "127.0.0.1:2379"
I0814 13:24:32.169289  108836 client.go:354] parsed scheme: ""
I0814 13:24:32.169298  108836 client.go:354] scheme "" not registered, fallback to default scheme
I0814 13:24:32.169322  108836 asm_amd64.s:1337] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I0814 13:24:32.169381  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.169416  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.169438  108836 watch_cache.go:405] Replace watchCache (rev: 36375) 
I0814 13:24:32.169806  108836 balancer_conn_wrappers.go:131] clientv3/balancer: pin "127.0.0.1:2379"
I0814 13:24:32.169924  108836 reflector.go:160] Listing and watching *core.Service from storage/cacher.go:/services/specs
I0814 13:24:32.169988  108836 watch_cache.go:405] Replace watchCache (rev: 36375) 
I0814 13:24:32.170092  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.171119  108836 watch_cache.go:405] Replace watchCache (rev: 36376) 
I0814 13:24:32.172091  108836 storage_factory.go:285] storing replicationcontrollers in v1, reading as __internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.172257  108836 client.go:354] parsed scheme: ""
I0814 13:24:32.172266  108836 client.go:354] scheme "" not registered, fallback to default scheme
I0814 13:24:32.172295  108836 asm_amd64.s:1337] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I0814 13:24:32.172333  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.174384  108836 watch_cache.go:405] Replace watchCache (rev: 36377) 
I0814 13:24:32.177196  108836 balancer_conn_wrappers.go:131] clientv3/balancer: pin "127.0.0.1:2379"
I0814 13:24:32.177252  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.177359  108836 store.go:1342] Monitoring replicationcontrollers count at <storage-prefix>//controllers
I0814 13:24:32.177387  108836 reflector.go:160] Listing and watching *core.ReplicationController from storage/cacher.go:/controllers
I0814 13:24:32.178279  108836 watch_cache.go:405] Replace watchCache (rev: 36380) 
I0814 13:24:32.178709  108836 storage_factory.go:285] storing bindings in v1, reading as __internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.178925  108836 storage_factory.go:285] storing componentstatuses in v1, reading as __internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.179444  108836 storage_factory.go:285] storing configmaps in v1, reading as __internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.180423  108836 storage_factory.go:285] storing endpoints in v1, reading as __internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.181090  108836 storage_factory.go:285] storing events in v1, reading as __internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.181683  108836 storage_factory.go:285] storing limitranges in v1, reading as __internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.182070  108836 storage_factory.go:285] storing namespaces in v1, reading as __internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.182206  108836 storage_factory.go:285] storing namespaces in v1, reading as __internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.182402  108836 storage_factory.go:285] storing namespaces in v1, reading as __internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.182891  108836 storage_factory.go:285] storing nodes in v1, reading as __internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.183430  108836 storage_factory.go:285] storing nodes in v1, reading as __internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.183636  108836 storage_factory.go:285] storing nodes in v1, reading as __internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.184323  108836 storage_factory.go:285] storing persistentvolumeclaims in v1, reading as __internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.184600  108836 storage_factory.go:285] storing persistentvolumeclaims in v1, reading as __internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.185254  108836 storage_factory.go:285] storing persistentvolumes in v1, reading as __internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.185449  108836 storage_factory.go:285] storing persistentvolumes in v1, reading as __internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.186950  108836 storage_factory.go:285] storing pods in v1, reading as __internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.187253  108836 storage_factory.go:285] storing pods in v1, reading as __internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.187503  108836 storage_factory.go:285] storing pods in v1, reading as __internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.188006  108836 storage_factory.go:285] storing pods in v1, reading as __internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.188440  108836 storage_factory.go:285] storing pods in v1, reading as __internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.188574  108836 storage_factory.go:285] storing pods in v1, reading as __internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.188740  108836 storage_factory.go:285] storing pods in v1, reading as __internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.189630  108836 storage_factory.go:285] storing pods in v1, reading as __internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.189857  108836 storage_factory.go:285] storing pods in v1, reading as __internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.191080  108836 storage_factory.go:285] storing podtemplates in v1, reading as __internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.191793  108836 storage_factory.go:285] storing replicationcontrollers in v1, reading as __internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.192371  108836 storage_factory.go:285] storing replicationcontrollers in v1, reading as __internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.193192  108836 storage_factory.go:285] storing replicationcontrollers in v1, reading as __internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.193977  108836 storage_factory.go:285] storing resourcequotas in v1, reading as __internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.194568  108836 storage_factory.go:285] storing resourcequotas in v1, reading as __internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.195316  108836 storage_factory.go:285] storing secrets in v1, reading as __internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.196097  108836 storage_factory.go:285] storing serviceaccounts in v1, reading as __internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.196710  108836 storage_factory.go:285] storing services in v1, reading as __internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.197329  108836 storage_factory.go:285] storing services in v1, reading as __internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.197516  108836 storage_factory.go:285] storing services in v1, reading as __internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.198408  108836 master.go:423] Skipping disabled API group "auditregistration.k8s.io".
I0814 13:24:32.199096  108836 master.go:434] Enabling API group "authentication.k8s.io".
I0814 13:24:32.199256  108836 master.go:434] Enabling API group "authorization.k8s.io".
I0814 13:24:32.199535  108836 storage_factory.go:285] storing horizontalpodautoscalers.autoscaling in autoscaling/v1, reading as autoscaling/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.199747  108836 client.go:354] parsed scheme: ""
I0814 13:24:32.199880  108836 client.go:354] scheme "" not registered, fallback to default scheme
I0814 13:24:32.199990  108836 asm_amd64.s:1337] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I0814 13:24:32.200126  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.200895  108836 balancer_conn_wrappers.go:131] clientv3/balancer: pin "127.0.0.1:2379"
I0814 13:24:32.201118  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.201426  108836 store.go:1342] Monitoring horizontalpodautoscalers.autoscaling count at <storage-prefix>//horizontalpodautoscalers
I0814 13:24:32.201541  108836 reflector.go:160] Listing and watching *autoscaling.HorizontalPodAutoscaler from storage/cacher.go:/horizontalpodautoscalers
I0814 13:24:32.201762  108836 storage_factory.go:285] storing horizontalpodautoscalers.autoscaling in autoscaling/v1, reading as autoscaling/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.202452  108836 client.go:354] parsed scheme: ""
I0814 13:24:32.202561  108836 client.go:354] scheme "" not registered, fallback to default scheme
I0814 13:24:32.202700  108836 asm_amd64.s:1337] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I0814 13:24:32.202822  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.203077  108836 watch_cache.go:405] Replace watchCache (rev: 36391) 
I0814 13:24:32.203772  108836 balancer_conn_wrappers.go:131] clientv3/balancer: pin "127.0.0.1:2379"
I0814 13:24:32.203965  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.204580  108836 store.go:1342] Monitoring horizontalpodautoscalers.autoscaling count at <storage-prefix>//horizontalpodautoscalers
I0814 13:24:32.204672  108836 reflector.go:160] Listing and watching *autoscaling.HorizontalPodAutoscaler from storage/cacher.go:/horizontalpodautoscalers
I0814 13:24:32.204925  108836 storage_factory.go:285] storing horizontalpodautoscalers.autoscaling in autoscaling/v1, reading as autoscaling/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.205236  108836 client.go:354] parsed scheme: ""
I0814 13:24:32.205441  108836 client.go:354] scheme "" not registered, fallback to default scheme
I0814 13:24:32.205631  108836 watch_cache.go:405] Replace watchCache (rev: 36393) 
I0814 13:24:32.205707  108836 asm_amd64.s:1337] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I0814 13:24:32.205846  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.206213  108836 balancer_conn_wrappers.go:131] clientv3/balancer: pin "127.0.0.1:2379"
I0814 13:24:32.206367  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.206581  108836 store.go:1342] Monitoring horizontalpodautoscalers.autoscaling count at <storage-prefix>//horizontalpodautoscalers
I0814 13:24:32.206797  108836 master.go:434] Enabling API group "autoscaling".
I0814 13:24:32.206754  108836 reflector.go:160] Listing and watching *autoscaling.HorizontalPodAutoscaler from storage/cacher.go:/horizontalpodautoscalers
I0814 13:24:32.207478  108836 storage_factory.go:285] storing jobs.batch in batch/v1, reading as batch/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.207890  108836 client.go:354] parsed scheme: ""
I0814 13:24:32.208081  108836 client.go:354] scheme "" not registered, fallback to default scheme
I0814 13:24:32.208052  108836 watch_cache.go:405] Replace watchCache (rev: 36394) 
I0814 13:24:32.208226  108836 asm_amd64.s:1337] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I0814 13:24:32.208367  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.212293  108836 balancer_conn_wrappers.go:131] clientv3/balancer: pin "127.0.0.1:2379"
I0814 13:24:32.212346  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.212571  108836 store.go:1342] Monitoring jobs.batch count at <storage-prefix>//jobs
I0814 13:24:32.212680  108836 reflector.go:160] Listing and watching *batch.Job from storage/cacher.go:/jobs
I0814 13:24:32.212855  108836 storage_factory.go:285] storing cronjobs.batch in batch/v1beta1, reading as batch/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.213045  108836 client.go:354] parsed scheme: ""
I0814 13:24:32.213105  108836 client.go:354] scheme "" not registered, fallback to default scheme
I0814 13:24:32.213141  108836 asm_amd64.s:1337] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I0814 13:24:32.213244  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.213906  108836 balancer_conn_wrappers.go:131] clientv3/balancer: pin "127.0.0.1:2379"
I0814 13:24:32.213997  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.214074  108836 watch_cache.go:405] Replace watchCache (rev: 36397) 
I0814 13:24:32.214088  108836 store.go:1342] Monitoring cronjobs.batch count at <storage-prefix>//cronjobs
I0814 13:24:32.214105  108836 master.go:434] Enabling API group "batch".
I0814 13:24:32.214131  108836 reflector.go:160] Listing and watching *batch.CronJob from storage/cacher.go:/cronjobs
I0814 13:24:32.214273  108836 storage_factory.go:285] storing certificatesigningrequests.certificates.k8s.io in certificates.k8s.io/v1beta1, reading as certificates.k8s.io/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.215237  108836 client.go:354] parsed scheme: ""
I0814 13:24:32.216444  108836 client.go:354] scheme "" not registered, fallback to default scheme
I0814 13:24:32.215367  108836 watch_cache.go:405] Replace watchCache (rev: 36397) 
I0814 13:24:32.216517  108836 asm_amd64.s:1337] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I0814 13:24:32.216687  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.217006  108836 balancer_conn_wrappers.go:131] clientv3/balancer: pin "127.0.0.1:2379"
I0814 13:24:32.217112  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.217116  108836 store.go:1342] Monitoring certificatesigningrequests.certificates.k8s.io count at <storage-prefix>//certificatesigningrequests
I0814 13:24:32.217160  108836 master.go:434] Enabling API group "certificates.k8s.io".
I0814 13:24:32.217206  108836 reflector.go:160] Listing and watching *certificates.CertificateSigningRequest from storage/cacher.go:/certificatesigningrequests
I0814 13:24:32.217311  108836 storage_factory.go:285] storing leases.coordination.k8s.io in coordination.k8s.io/v1beta1, reading as coordination.k8s.io/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.217391  108836 client.go:354] parsed scheme: ""
I0814 13:24:32.217400  108836 client.go:354] scheme "" not registered, fallback to default scheme
I0814 13:24:32.217449  108836 asm_amd64.s:1337] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I0814 13:24:32.217504  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.217746  108836 balancer_conn_wrappers.go:131] clientv3/balancer: pin "127.0.0.1:2379"
I0814 13:24:32.217866  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.217889  108836 store.go:1342] Monitoring leases.coordination.k8s.io count at <storage-prefix>//leases
I0814 13:24:32.217919  108836 reflector.go:160] Listing and watching *coordination.Lease from storage/cacher.go:/leases
I0814 13:24:32.218082  108836 storage_factory.go:285] storing leases.coordination.k8s.io in coordination.k8s.io/v1beta1, reading as coordination.k8s.io/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.218167  108836 client.go:354] parsed scheme: ""
I0814 13:24:32.218177  108836 client.go:354] scheme "" not registered, fallback to default scheme
I0814 13:24:32.218245  108836 asm_amd64.s:1337] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I0814 13:24:32.218299  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.219371  108836 watch_cache.go:405] Replace watchCache (rev: 36400) 
I0814 13:24:32.219409  108836 watch_cache.go:405] Replace watchCache (rev: 36399) 
I0814 13:24:32.219963  108836 balancer_conn_wrappers.go:131] clientv3/balancer: pin "127.0.0.1:2379"
I0814 13:24:32.219993  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.220134  108836 store.go:1342] Monitoring leases.coordination.k8s.io count at <storage-prefix>//leases
I0814 13:24:32.220167  108836 master.go:434] Enabling API group "coordination.k8s.io".
I0814 13:24:32.220171  108836 reflector.go:160] Listing and watching *coordination.Lease from storage/cacher.go:/leases
I0814 13:24:32.220473  108836 storage_factory.go:285] storing ingresses.networking.k8s.io in networking.k8s.io/v1beta1, reading as networking.k8s.io/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.220669  108836 client.go:354] parsed scheme: ""
I0814 13:24:32.220761  108836 client.go:354] scheme "" not registered, fallback to default scheme
I0814 13:24:32.220878  108836 asm_amd64.s:1337] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I0814 13:24:32.220986  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.221293  108836 balancer_conn_wrappers.go:131] clientv3/balancer: pin "127.0.0.1:2379"
I0814 13:24:32.221361  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.221392  108836 store.go:1342] Monitoring ingresses.networking.k8s.io count at <storage-prefix>//ingress
I0814 13:24:32.221415  108836 master.go:434] Enabling API group "extensions".
I0814 13:24:32.221457  108836 reflector.go:160] Listing and watching *networking.Ingress from storage/cacher.go:/ingress
I0814 13:24:32.221578  108836 storage_factory.go:285] storing networkpolicies.networking.k8s.io in networking.k8s.io/v1, reading as networking.k8s.io/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.221637  108836 client.go:354] parsed scheme: ""
I0814 13:24:32.221646  108836 client.go:354] scheme "" not registered, fallback to default scheme
I0814 13:24:32.221675  108836 asm_amd64.s:1337] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I0814 13:24:32.221714  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.222139  108836 balancer_conn_wrappers.go:131] clientv3/balancer: pin "127.0.0.1:2379"
I0814 13:24:32.222335  108836 store.go:1342] Monitoring networkpolicies.networking.k8s.io count at <storage-prefix>//networkpolicies
I0814 13:24:32.222353  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.222375  108836 reflector.go:160] Listing and watching *networking.NetworkPolicy from storage/cacher.go:/networkpolicies
I0814 13:24:32.222516  108836 watch_cache.go:405] Replace watchCache (rev: 36401) 
I0814 13:24:32.222518  108836 storage_factory.go:285] storing ingresses.networking.k8s.io in networking.k8s.io/v1beta1, reading as networking.k8s.io/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.222583  108836 client.go:354] parsed scheme: ""
I0814 13:24:32.222593  108836 client.go:354] scheme "" not registered, fallback to default scheme
I0814 13:24:32.222621  108836 asm_amd64.s:1337] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I0814 13:24:32.222669  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.223115  108836 balancer_conn_wrappers.go:131] clientv3/balancer: pin "127.0.0.1:2379"
I0814 13:24:32.223182  108836 store.go:1342] Monitoring ingresses.networking.k8s.io count at <storage-prefix>//ingress
I0814 13:24:32.223191  108836 master.go:434] Enabling API group "networking.k8s.io".
I0814 13:24:32.223217  108836 storage_factory.go:285] storing runtimeclasses.node.k8s.io in node.k8s.io/v1beta1, reading as node.k8s.io/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.223276  108836 client.go:354] parsed scheme: ""
I0814 13:24:32.223283  108836 client.go:354] scheme "" not registered, fallback to default scheme
I0814 13:24:32.223319  108836 asm_amd64.s:1337] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I0814 13:24:32.223320  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.223363  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.223423  108836 reflector.go:160] Listing and watching *networking.Ingress from storage/cacher.go:/ingress
I0814 13:24:32.223572  108836 balancer_conn_wrappers.go:131] clientv3/balancer: pin "127.0.0.1:2379"
I0814 13:24:32.223661  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.223671  108836 store.go:1342] Monitoring runtimeclasses.node.k8s.io count at <storage-prefix>//runtimeclasses
I0814 13:24:32.223685  108836 master.go:434] Enabling API group "node.k8s.io".
I0814 13:24:32.223694  108836 reflector.go:160] Listing and watching *node.RuntimeClass from storage/cacher.go:/runtimeclasses
I0814 13:24:32.223797  108836 storage_factory.go:285] storing poddisruptionbudgets.policy in policy/v1beta1, reading as policy/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.223908  108836 client.go:354] parsed scheme: ""
I0814 13:24:32.223917  108836 client.go:354] scheme "" not registered, fallback to default scheme
I0814 13:24:32.223945  108836 asm_amd64.s:1337] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I0814 13:24:32.224059  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.224314  108836 watch_cache.go:405] Replace watchCache (rev: 36401) 
I0814 13:24:32.224319  108836 watch_cache.go:405] Replace watchCache (rev: 36401) 
I0814 13:24:32.224362  108836 watch_cache.go:405] Replace watchCache (rev: 36401) 
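Each "Replace watchCache (rev: N)" marks the reflector's initial LIST landing in the cacher: the cache contents are swapped wholesale for a snapshot taken at a single etcd revision, and later watch events apply on top of it. A sketch under those assumptions, with stand-in types rather than the real watchCache:

```go
// Sketch of what "Replace watchCache (rev: N)" denotes, with stand-in
// types: the initial LIST swaps in a complete snapshot at one etcd revision.
package main

import "fmt"

type watchCache struct {
	rev   uint64
	store map[string]string // key -> serialized object (stand-in)
}

func (w *watchCache) Replace(snapshot map[string]string, rev uint64) {
	w.store = snapshot // drop the previous contents wholesale
	w.rev = rev
	fmt.Printf("Replace watchCache (rev: %d)\n", rev)
}

func main() {
	wc := &watchCache{}
	wc.Replace(map[string]string{"/roles/kube-system/example": "Role"}, 36401)
}
```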
I0814 13:24:32.224492  108836 balancer_conn_wrappers.go:131] clientv3/balancer: pin "127.0.0.1:2379"
I0814 13:24:32.224578  108836 store.go:1342] Monitoring poddisruptionbudgets.policy count at <storage-prefix>//poddisruptionbudgets
I0814 13:24:32.224620  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.224629  108836 reflector.go:160] Listing and watching *policy.PodDisruptionBudget from storage/cacher.go:/poddisruptionbudgets
I0814 13:24:32.224704  108836 storage_factory.go:285] storing podsecuritypolicies.policy in policy/v1beta1, reading as policy/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.224766  108836 client.go:354] parsed scheme: ""
I0814 13:24:32.224774  108836 client.go:354] scheme "" not registered, fallback to default scheme
I0814 13:24:32.224798  108836 asm_amd64.s:1337] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I0814 13:24:32.224859  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.225112  108836 balancer_conn_wrappers.go:131] clientv3/balancer: pin "127.0.0.1:2379"
I0814 13:24:32.225232  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.225240  108836 store.go:1342] Monitoring podsecuritypolicies.policy count at <storage-prefix>//podsecuritypolicy
I0814 13:24:32.225255  108836 master.go:434] Enabling API group "policy".
I0814 13:24:32.225283  108836 storage_factory.go:285] storing roles.rbac.authorization.k8s.io in rbac.authorization.k8s.io/v1, reading as rbac.authorization.k8s.io/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.225362  108836 client.go:354] parsed scheme: ""
I0814 13:24:32.225401  108836 client.go:354] scheme "" not registered, fallback to default scheme
I0814 13:24:32.225425  108836 watch_cache.go:405] Replace watchCache (rev: 36401) 
I0814 13:24:32.225430  108836 asm_amd64.s:1337] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I0814 13:24:32.225462  108836 reflector.go:160] Listing and watching *policy.PodSecurityPolicy from storage/cacher.go:/podsecuritypolicy
I0814 13:24:32.225563  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.225948  108836 balancer_conn_wrappers.go:131] clientv3/balancer: pin "127.0.0.1:2379"
I0814 13:24:32.225983  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.226048  108836 store.go:1342] Monitoring roles.rbac.authorization.k8s.io count at <storage-prefix>//roles
I0814 13:24:32.226112  108836 reflector.go:160] Listing and watching *rbac.Role from storage/cacher.go:/roles
I0814 13:24:32.226171  108836 watch_cache.go:405] Replace watchCache (rev: 36401) 
I0814 13:24:32.226175  108836 storage_factory.go:285] storing rolebindings.rbac.authorization.k8s.io in rbac.authorization.k8s.io/v1, reading as rbac.authorization.k8s.io/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.226232  108836 client.go:354] parsed scheme: ""
I0814 13:24:32.226245  108836 client.go:354] scheme "" not registered, fallback to default scheme
I0814 13:24:32.226271  108836 asm_amd64.s:1337] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I0814 13:24:32.226303  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.226539  108836 balancer_conn_wrappers.go:131] clientv3/balancer: pin "127.0.0.1:2379"
I0814 13:24:32.226640  108836 store.go:1342] Monitoring rolebindings.rbac.authorization.k8s.io count at <storage-prefix>//rolebindings
I0814 13:24:32.226670  108836 storage_factory.go:285] storing clusterroles.rbac.authorization.k8s.io in rbac.authorization.k8s.io/v1, reading as rbac.authorization.k8s.io/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.226730  108836 client.go:354] parsed scheme: ""
I0814 13:24:32.226736  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.226739  108836 client.go:354] scheme "" not registered, fallback to default scheme
I0814 13:24:32.226782  108836 asm_amd64.s:1337] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I0814 13:24:32.226781  108836 reflector.go:160] Listing and watching *rbac.RoleBinding from storage/cacher.go:/rolebindings
I0814 13:24:32.226808  108836 watch_cache.go:405] Replace watchCache (rev: 36401) 
I0814 13:24:32.226882  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.227223  108836 balancer_conn_wrappers.go:131] clientv3/balancer: pin "127.0.0.1:2379"
I0814 13:24:32.227331  108836 store.go:1342] Monitoring clusterroles.rbac.authorization.k8s.io count at <storage-prefix>//clusterroles
I0814 13:24:32.227457  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.227487  108836 storage_factory.go:285] storing clusterrolebindings.rbac.authorization.k8s.io in rbac.authorization.k8s.io/v1, reading as rbac.authorization.k8s.io/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.227508  108836 reflector.go:160] Listing and watching *rbac.ClusterRole from storage/cacher.go:/clusterroles
I0814 13:24:32.227584  108836 client.go:354] parsed scheme: ""
I0814 13:24:32.227595  108836 client.go:354] scheme "" not registered, fallback to default scheme
I0814 13:24:32.227624  108836 asm_amd64.s:1337] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I0814 13:24:32.227662  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.228120  108836 balancer_conn_wrappers.go:131] clientv3/balancer: pin "127.0.0.1:2379"
I0814 13:24:32.228197  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.228206  108836 store.go:1342] Monitoring clusterrolebindings.rbac.authorization.k8s.io count at <storage-prefix>//clusterrolebindings
I0814 13:24:32.228222  108836 reflector.go:160] Listing and watching *rbac.ClusterRoleBinding from storage/cacher.go:/clusterrolebindings
I0814 13:24:32.228245  108836 storage_factory.go:285] storing roles.rbac.authorization.k8s.io in rbac.authorization.k8s.io/v1, reading as rbac.authorization.k8s.io/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.228276  108836 watch_cache.go:405] Replace watchCache (rev: 36401) 
I0814 13:24:32.228319  108836 client.go:354] parsed scheme: ""
I0814 13:24:32.228327  108836 client.go:354] scheme "" not registered, fallback to default scheme
I0814 13:24:32.228352  108836 asm_amd64.s:1337] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I0814 13:24:32.228443  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.228766  108836 balancer_conn_wrappers.go:131] clientv3/balancer: pin "127.0.0.1:2379"
I0814 13:24:32.228821  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.228916  108836 store.go:1342] Monitoring roles.rbac.authorization.k8s.io count at <storage-prefix>//roles
I0814 13:24:32.229000  108836 reflector.go:160] Listing and watching *rbac.Role from storage/cacher.go:/roles
I0814 13:24:32.229034  108836 watch_cache.go:405] Replace watchCache (rev: 36401) 
I0814 13:24:32.229123  108836 storage_factory.go:285] storing rolebindings.rbac.authorization.k8s.io in rbac.authorization.k8s.io/v1, reading as rbac.authorization.k8s.io/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.229236  108836 client.go:354] parsed scheme: ""
I0814 13:24:32.229248  108836 client.go:354] scheme "" not registered, fallback to default scheme
I0814 13:24:32.229303  108836 asm_amd64.s:1337] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I0814 13:24:32.229344  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.229520  108836 watch_cache.go:405] Replace watchCache (rev: 36401) 
I0814 13:24:32.229673  108836 balancer_conn_wrappers.go:131] clientv3/balancer: pin "127.0.0.1:2379"
I0814 13:24:32.229732  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.229780  108836 store.go:1342] Monitoring rolebindings.rbac.authorization.k8s.io count at <storage-prefix>//rolebindings
I0814 13:24:32.229858  108836 reflector.go:160] Listing and watching *rbac.RoleBinding from storage/cacher.go:/rolebindings
I0814 13:24:32.229829  108836 storage_factory.go:285] storing clusterroles.rbac.authorization.k8s.io in rbac.authorization.k8s.io/v1, reading as rbac.authorization.k8s.io/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.229960  108836 client.go:354] parsed scheme: ""
I0814 13:24:32.229972  108836 client.go:354] scheme "" not registered, fallback to default scheme
I0814 13:24:32.230029  108836 asm_amd64.s:1337] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I0814 13:24:32.230072  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.230597  108836 watch_cache.go:405] Replace watchCache (rev: 36401) 
I0814 13:24:32.230623  108836 balancer_conn_wrappers.go:131] clientv3/balancer: pin "127.0.0.1:2379"
I0814 13:24:32.230735  108836 store.go:1342] Monitoring clusterroles.rbac.authorization.k8s.io count at <storage-prefix>//clusterroles
I0814 13:24:32.230869  108836 watch_cache.go:405] Replace watchCache (rev: 36401) 
I0814 13:24:32.230871  108836 storage_factory.go:285] storing clusterrolebindings.rbac.authorization.k8s.io in rbac.authorization.k8s.io/v1, reading as rbac.authorization.k8s.io/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.230947  108836 client.go:354] parsed scheme: ""
I0814 13:24:32.230956  108836 client.go:354] scheme "" not registered, fallback to default scheme
I0814 13:24:32.230958  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.230986  108836 reflector.go:160] Listing and watching *rbac.ClusterRole from storage/cacher.go:/clusterroles
I0814 13:24:32.230990  108836 asm_amd64.s:1337] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I0814 13:24:32.231143  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.231359  108836 balancer_conn_wrappers.go:131] clientv3/balancer: pin "127.0.0.1:2379"
I0814 13:24:32.231451  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.231487  108836 store.go:1342] Monitoring clusterrolebindings.rbac.authorization.k8s.io count at <storage-prefix>//clusterrolebindings
I0814 13:24:32.231508  108836 master.go:434] Enabling API group "rbac.authorization.k8s.io".
I0814 13:24:32.231509  108836 reflector.go:160] Listing and watching *rbac.ClusterRoleBinding from storage/cacher.go:/clusterrolebindings
I0814 13:24:32.231782  108836 watch_cache.go:405] Replace watchCache (rev: 36401) 
I0814 13:24:32.232566  108836 watch_cache.go:405] Replace watchCache (rev: 36401) 
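The rbac resources each appear twice above (roles, rolebindings, clusterroles, clusterrolebindings), consistent with one store being instantiated per served API version over the same prefix. Every "Monitoring <resource> count" line also starts the poll governed by CountMetricPollPeriod: a loop that counts the keys under the resource's prefix and exports the number as a metric. A sketch of that loop, assuming a fake counter in place of the etcd COUNT query:

```go
// Sketch of the count poll behind "Monitoring <resource> count", with a
// fake etcd COUNT query; the real store exports this as an object-count metric.
package main

import (
	"fmt"
	"time"
)

func monitorCount(prefix string, period time.Duration, count func(string) int) {
	ticker := time.NewTicker(period)
	defer ticker.Stop()
	for range ticker.C {
		fmt.Printf("count(%s) = %d\n", prefix, count(prefix))
		return // one tick is enough for the sketch
	}
}

func main() {
	fake := func(string) int { return 2 } // stand-in for counting keys in etcd
	monitorCount("<storage-prefix>//clusterroles", 10*time.Millisecond, fake)
}
```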
I0814 13:24:32.233494  108836 storage_factory.go:285] storing priorityclasses.scheduling.k8s.io in scheduling.k8s.io/v1, reading as scheduling.k8s.io/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.233572  108836 client.go:354] parsed scheme: ""
I0814 13:24:32.233582  108836 client.go:354] scheme "" not registered, fallback to default scheme
I0814 13:24:32.233615  108836 asm_amd64.s:1337] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I0814 13:24:32.233661  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.233891  108836 balancer_conn_wrappers.go:131] clientv3/balancer: pin "127.0.0.1:2379"
I0814 13:24:32.233994  108836 store.go:1342] Monitoring priorityclasses.scheduling.k8s.io count at <storage-prefix>//priorityclasses
I0814 13:24:32.234076  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.234122  108836 reflector.go:160] Listing and watching *scheduling.PriorityClass from storage/cacher.go:/priorityclasses
I0814 13:24:32.234138  108836 storage_factory.go:285] storing priorityclasses.scheduling.k8s.io in scheduling.k8s.io/v1, reading as scheduling.k8s.io/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.234202  108836 client.go:354] parsed scheme: ""
I0814 13:24:32.234212  108836 client.go:354] scheme "" not registered, fallback to default scheme
I0814 13:24:32.234248  108836 asm_amd64.s:1337] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I0814 13:24:32.234286  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.234596  108836 balancer_conn_wrappers.go:131] clientv3/balancer: pin "127.0.0.1:2379"
I0814 13:24:32.234693  108836 store.go:1342] Monitoring priorityclasses.scheduling.k8s.io count at <storage-prefix>//priorityclasses
I0814 13:24:32.234706  108836 master.go:434] Enabling API group "scheduling.k8s.io".
I0814 13:24:32.234715  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.234764  108836 reflector.go:160] Listing and watching *scheduling.PriorityClass from storage/cacher.go:/priorityclasses
I0814 13:24:32.234790  108836 master.go:423] Skipping disabled API group "settings.k8s.io".
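settings.k8s.io is skipped rather than enabled because its only resources are alpha and off by default; which groups and versions get served is gated by the apiserver's resource configuration (the kube-apiserver --runtime-config flag flips these). A toy sketch of the gate the two adjacent lines reflect:

```go
// Toy sketch of the enable/skip gate reflected in the lines above; the
// real decision comes from the apiserver's resource config.
package main

import "fmt"

func main() {
	enabled := map[string]bool{
		"scheduling.k8s.io": true,
		"settings.k8s.io":   false, // alpha only, disabled by default
	}
	for group, on := range enabled {
		if on {
			fmt.Printf("Enabling API group %q.\n", group)
		} else {
			fmt.Printf("Skipping disabled API group %q.\n", group)
		}
	}
}
```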
I0814 13:24:32.235004  108836 storage_factory.go:285] storing storageclasses.storage.k8s.io in storage.k8s.io/v1, reading as storage.k8s.io/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.235093  108836 client.go:354] parsed scheme: ""
I0814 13:24:32.235103  108836 client.go:354] scheme "" not registered, fallback to default scheme
I0814 13:24:32.235155  108836 asm_amd64.s:1337] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I0814 13:24:32.235210  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.235434  108836 balancer_conn_wrappers.go:131] clientv3/balancer: pin "127.0.0.1:2379"
I0814 13:24:32.235531  108836 store.go:1342] Monitoring storageclasses.storage.k8s.io count at <storage-prefix>//storageclasses
I0814 13:24:32.235658  108836 storage_factory.go:285] storing volumeattachments.storage.k8s.io in storage.k8s.io/v1, reading as storage.k8s.io/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.235711  108836 client.go:354] parsed scheme: ""
I0814 13:24:32.235718  108836 client.go:354] scheme "" not registered, fallback to default scheme
I0814 13:24:32.235747  108836 asm_amd64.s:1337] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I0814 13:24:32.235754  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.235772  108836 reflector.go:160] Listing and watching *storage.StorageClass from storage/cacher.go:/storageclasses
I0814 13:24:32.235898  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.236862  108836 watch_cache.go:405] Replace watchCache (rev: 36404) 
I0814 13:24:32.236865  108836 balancer_conn_wrappers.go:131] clientv3/balancer: pin "127.0.0.1:2379"
I0814 13:24:32.236919  108836 watch_cache.go:405] Replace watchCache (rev: 36404) 
I0814 13:24:32.236949  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.236956  108836 watch_cache.go:405] Replace watchCache (rev: 36403) 
I0814 13:24:32.237020  108836 store.go:1342] Monitoring volumeattachments.storage.k8s.io count at <storage-prefix>//volumeattachments
I0814 13:24:32.237039  108836 storage_factory.go:285] storing csinodes.storage.k8s.io in storage.k8s.io/v1beta1, reading as storage.k8s.io/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.237077  108836 client.go:354] parsed scheme: ""
I0814 13:24:32.237082  108836 client.go:354] scheme "" not registered, fallback to default scheme
I0814 13:24:32.237084  108836 reflector.go:160] Listing and watching *storage.VolumeAttachment from storage/cacher.go:/volumeattachments
I0814 13:24:32.237107  108836 asm_amd64.s:1337] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I0814 13:24:32.237151  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.237579  108836 balancer_conn_wrappers.go:131] clientv3/balancer: pin "127.0.0.1:2379"
I0814 13:24:32.237667  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.237826  108836 reflector.go:160] Listing and watching *storage.CSINode from storage/cacher.go:/csinodes
I0814 13:24:32.237951  108836 store.go:1342] Monitoring csinodes.storage.k8s.io count at <storage-prefix>//csinodes
I0814 13:24:32.238043  108836 storage_factory.go:285] storing csidrivers.storage.k8s.io in storage.k8s.io/v1beta1, reading as storage.k8s.io/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.238118  108836 client.go:354] parsed scheme: ""
I0814 13:24:32.238129  108836 client.go:354] scheme "" not registered, fallback to default scheme
I0814 13:24:32.238262  108836 asm_amd64.s:1337] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I0814 13:24:32.238325  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.238724  108836 balancer_conn_wrappers.go:131] clientv3/balancer: pin "127.0.0.1:2379"
I0814 13:24:32.238916  108836 store.go:1342] Monitoring csidrivers.storage.k8s.io count at <storage-prefix>//csidrivers
I0814 13:24:32.238943  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.238961  108836 reflector.go:160] Listing and watching *storage.CSIDriver from storage/cacher.go:/csidrivers
I0814 13:24:32.239064  108836 storage_factory.go:285] storing storageclasses.storage.k8s.io in storage.k8s.io/v1, reading as storage.k8s.io/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.239131  108836 client.go:354] parsed scheme: ""
I0814 13:24:32.239141  108836 client.go:354] scheme "" not registered, fallback to default scheme
I0814 13:24:32.239262  108836 asm_amd64.s:1337] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I0814 13:24:32.239331  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.240469  108836 watch_cache.go:405] Replace watchCache (rev: 36404) 
I0814 13:24:32.240470  108836 balancer_conn_wrappers.go:131] clientv3/balancer: pin "127.0.0.1:2379"
I0814 13:24:32.240520  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.240518  108836 watch_cache.go:405] Replace watchCache (rev: 36404) 
I0814 13:24:32.240614  108836 store.go:1342] Monitoring storageclasses.storage.k8s.io count at <storage-prefix>//storageclasses
I0814 13:24:32.240655  108836 reflector.go:160] Listing and watching *storage.StorageClass from storage/cacher.go:/storageclasses
I0814 13:24:32.240721  108836 storage_factory.go:285] storing volumeattachments.storage.k8s.io in storage.k8s.io/v1, reading as storage.k8s.io/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.240765  108836 client.go:354] parsed scheme: ""
I0814 13:24:32.240771  108836 client.go:354] scheme "" not registered, fallback to default scheme
I0814 13:24:32.240789  108836 asm_amd64.s:1337] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I0814 13:24:32.240821  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.241660  108836 balancer_conn_wrappers.go:131] clientv3/balancer: pin "127.0.0.1:2379"
I0814 13:24:32.241746  108836 store.go:1342] Monitoring volumeattachments.storage.k8s.io count at <storage-prefix>//volumeattachments
I0814 13:24:32.241763  108836 master.go:434] Enabling API group "storage.k8s.io".
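Each storage_factory line names two versions: objects are encoded to a preferred external version for etcd ("storing ... in storage.k8s.io/v1") and decoded through the internal hub type in memory ("reading as storage.k8s.io/__internal"). A stand-in for that hub-and-spoke round trip; the real conversions go through generated functions, not these toy types:

```go
// Stand-in types for the "storing in v1, reading as __internal" round
// trip; the real conversion uses generated k8s conversion functions.
package main

import "fmt"

type internalStorageClass struct{ Name string } // in-memory hub type

type v1StorageClass struct{ Name string } // external version written to etcd

func toV1(in internalStorageClass) v1StorageClass { return v1StorageClass{Name: in.Name} }

func toInternal(in v1StorageClass) internalStorageClass {
	return internalStorageClass{Name: in.Name}
}

func main() {
	stored := toV1(internalStorageClass{Name: "standard"}) // write path
	fmt.Println(toInternal(stored).Name)                   // read path
}
```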
I0814 13:24:32.241879  108836 storage_factory.go:285] storing deployments.apps in apps/v1, reading as apps/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.241941  108836 client.go:354] parsed scheme: ""
I0814 13:24:32.241965  108836 client.go:354] scheme "" not registered, fallback to default scheme
I0814 13:24:32.241995  108836 asm_amd64.s:1337] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I0814 13:24:32.242044  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.242075  108836 reflector.go:160] Listing and watching *storage.VolumeAttachment from storage/cacher.go:/volumeattachments
I0814 13:24:32.242105  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.242960  108836 watch_cache.go:405] Replace watchCache (rev: 36405) 
I0814 13:24:32.243056  108836 watch_cache.go:405] Replace watchCache (rev: 36405) 
I0814 13:24:32.243125  108836 watch_cache.go:405] Replace watchCache (rev: 36405) 
I0814 13:24:32.243163  108836 balancer_conn_wrappers.go:131] clientv3/balancer: pin "127.0.0.1:2379"
I0814 13:24:32.243318  108836 store.go:1342] Monitoring deployments.apps count at <storage-prefix>//deployments
I0814 13:24:32.243332  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.243350  108836 reflector.go:160] Listing and watching *apps.Deployment from storage/cacher.go:/deployments
I0814 13:24:32.243450  108836 storage_factory.go:285] storing statefulsets.apps in apps/v1, reading as apps/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.243535  108836 client.go:354] parsed scheme: ""
I0814 13:24:32.243549  108836 client.go:354] scheme "" not registered, fallback to default scheme
I0814 13:24:32.243584  108836 asm_amd64.s:1337] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I0814 13:24:32.243645  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.243758  108836 watch_cache.go:405] Replace watchCache (rev: 36405) 
I0814 13:24:32.243974  108836 balancer_conn_wrappers.go:131] clientv3/balancer: pin "127.0.0.1:2379"
I0814 13:24:32.244074  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.244100  108836 store.go:1342] Monitoring statefulsets.apps count at <storage-prefix>//statefulsets
I0814 13:24:32.244176  108836 reflector.go:160] Listing and watching *apps.StatefulSet from storage/cacher.go:/statefulsets
I0814 13:24:32.244279  108836 storage_factory.go:285] storing daemonsets.apps in apps/v1, reading as apps/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.244366  108836 client.go:354] parsed scheme: ""
I0814 13:24:32.244382  108836 client.go:354] scheme "" not registered, fallback to default scheme
I0814 13:24:32.244415  108836 asm_amd64.s:1337] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I0814 13:24:32.244553  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.245972  108836 balancer_conn_wrappers.go:131] clientv3/balancer: pin "127.0.0.1:2379"
I0814 13:24:32.246013  108836 watch_cache.go:405] Replace watchCache (rev: 36405) 
I0814 13:24:32.246032  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.246013  108836 watch_cache.go:405] Replace watchCache (rev: 36405) 
I0814 13:24:32.246132  108836 store.go:1342] Monitoring daemonsets.apps count at <storage-prefix>//daemonsets
I0814 13:24:32.246297  108836 reflector.go:160] Listing and watching *apps.DaemonSet from storage/cacher.go:/daemonsets
I0814 13:24:32.246302  108836 storage_factory.go:285] storing replicasets.apps in apps/v1, reading as apps/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.246381  108836 client.go:354] parsed scheme: ""
I0814 13:24:32.246390  108836 client.go:354] scheme "" not registered, fallback to default scheme
I0814 13:24:32.246414  108836 asm_amd64.s:1337] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I0814 13:24:32.246451  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.246712  108836 balancer_conn_wrappers.go:131] clientv3/balancer: pin "127.0.0.1:2379"
I0814 13:24:32.246753  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.246826  108836 store.go:1342] Monitoring replicasets.apps count at <storage-prefix>//replicasets
I0814 13:24:32.246874  108836 reflector.go:160] Listing and watching *apps.ReplicaSet from storage/cacher.go:/replicasets
I0814 13:24:32.246984  108836 storage_factory.go:285] storing controllerrevisions.apps in apps/v1, reading as apps/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.247040  108836 client.go:354] parsed scheme: ""
I0814 13:24:32.247047  108836 client.go:354] scheme "" not registered, fallback to default scheme
I0814 13:24:32.247069  108836 asm_amd64.s:1337] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I0814 13:24:32.247163  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.247501  108836 balancer_conn_wrappers.go:131] clientv3/balancer: pin "127.0.0.1:2379"
I0814 13:24:32.247534  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.247598  108836 store.go:1342] Monitoring controllerrevisions.apps count at <storage-prefix>//controllerrevisions
I0814 13:24:32.247607  108836 master.go:434] Enabling API group "apps".
I0814 13:24:32.247645  108836 storage_factory.go:285] storing validatingwebhookconfigurations.admissionregistration.k8s.io in admissionregistration.k8s.io/v1beta1, reading as admissionregistration.k8s.io/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.247660  108836 reflector.go:160] Listing and watching *apps.ControllerRevision from storage/cacher.go:/controllerrevisions
I0814 13:24:32.247689  108836 client.go:354] parsed scheme: ""
I0814 13:24:32.247696  108836 client.go:354] scheme "" not registered, fallback to default scheme
I0814 13:24:32.247728  108836 asm_amd64.s:1337] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I0814 13:24:32.247797  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.247969  108836 watch_cache.go:405] Replace watchCache (rev: 36407) 
I0814 13:24:32.247989  108836 balancer_conn_wrappers.go:131] clientv3/balancer: pin "127.0.0.1:2379"
I0814 13:24:32.248080  108836 store.go:1342] Monitoring validatingwebhookconfigurations.admissionregistration.k8s.io count at <storage-prefix>//validatingwebhookconfigurations
I0814 13:24:32.248159  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.248234  108836 reflector.go:160] Listing and watching *admissionregistration.ValidatingWebhookConfiguration from storage/cacher.go:/validatingwebhookconfigurations
I0814 13:24:32.248102  108836 storage_factory.go:285] storing mutatingwebhookconfigurations.admissionregistration.k8s.io in admissionregistration.k8s.io/v1beta1, reading as admissionregistration.k8s.io/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.249229  108836 client.go:354] parsed scheme: ""
I0814 13:24:32.249238  108836 client.go:354] scheme "" not registered, fallback to default scheme
I0814 13:24:32.249271  108836 asm_amd64.s:1337] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I0814 13:24:32.249316  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.249512  108836 balancer_conn_wrappers.go:131] clientv3/balancer: pin "127.0.0.1:2379"
I0814 13:24:32.249598  108836 store.go:1342] Monitoring mutatingwebhookconfigurations.admissionregistration.k8s.io count at <storage-prefix>//mutatingwebhookconfigurations
I0814 13:24:32.249627  108836 storage_factory.go:285] storing validatingwebhookconfigurations.admissionregistration.k8s.io in admissionregistration.k8s.io/v1beta1, reading as admissionregistration.k8s.io/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.249673  108836 client.go:354] parsed scheme: ""
I0814 13:24:32.249679  108836 client.go:354] scheme "" not registered, fallback to default scheme
I0814 13:24:32.249699  108836 asm_amd64.s:1337] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I0814 13:24:32.249738  108836 reflector.go:160] Listing and watching *admissionregistration.MutatingWebhookConfiguration from storage/cacher.go:/mutatingwebhookconfigurations
I0814 13:24:32.249760  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.249894  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.251049  108836 balancer_conn_wrappers.go:131] clientv3/balancer: pin "127.0.0.1:2379"
I0814 13:24:32.251159  108836 store.go:1342] Monitoring validatingwebhookconfigurations.admissionregistration.k8s.io count at <storage-prefix>//validatingwebhookconfigurations
I0814 13:24:32.251177  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.251192  108836 storage_factory.go:285] storing mutatingwebhookconfigurations.admissionregistration.k8s.io in admissionregistration.k8s.io/v1beta1, reading as admissionregistration.k8s.io/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.251218  108836 reflector.go:160] Listing and watching *admissionregistration.ValidatingWebhookConfiguration from storage/cacher.go:/validatingwebhookconfigurations
I0814 13:24:32.251273  108836 client.go:354] parsed scheme: ""
I0814 13:24:32.251290  108836 client.go:354] scheme "" not registered, fallback to default scheme
I0814 13:24:32.251321  108836 asm_amd64.s:1337] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I0814 13:24:32.251454  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.251760  108836 watch_cache.go:405] Replace watchCache (rev: 36408) 
I0814 13:24:32.251997  108836 watch_cache.go:405] Replace watchCache (rev: 36408) 
I0814 13:24:32.252033  108836 watch_cache.go:405] Replace watchCache (rev: 36408) 
I0814 13:24:32.252047  108836 watch_cache.go:405] Replace watchCache (rev: 36408) 
I0814 13:24:32.252269  108836 balancer_conn_wrappers.go:131] clientv3/balancer: pin "127.0.0.1:2379"
I0814 13:24:32.252294  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.252374  108836 store.go:1342] Monitoring mutatingwebhookconfigurations.admissionregistration.k8s.io count at <storage-prefix>//mutatingwebhookconfigurations
I0814 13:24:32.252391  108836 master.go:434] Enabling API group "admissionregistration.k8s.io".
I0814 13:24:32.252416  108836 storage_factory.go:285] storing events in v1, reading as __internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.252547  108836 watch_cache.go:405] Replace watchCache (rev: 36408) 
I0814 13:24:32.252629  108836 client.go:354] parsed scheme: ""
I0814 13:24:32.252641  108836 client.go:354] scheme "" not registered, fallback to default scheme
I0814 13:24:32.252677  108836 asm_amd64.s:1337] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I0814 13:24:32.252724  108836 reflector.go:160] Listing and watching *admissionregistration.MutatingWebhookConfiguration from storage/cacher.go:/mutatingwebhookconfigurations
I0814 13:24:32.252909  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.253192  108836 balancer_conn_wrappers.go:131] clientv3/balancer: pin "127.0.0.1:2379"
I0814 13:24:32.253364  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:32.253416  108836 watch_cache.go:405] Replace watchCache (rev: 36408) 
I0814 13:24:32.253449  108836 store.go:1342] Monitoring events count at <storage-prefix>//events
I0814 13:24:32.253750  108836 master.go:434] Enabling API group "events.k8s.io".
I0814 13:24:32.253472  108836 reflector.go:160] Listing and watching *core.Event from storage/cacher.go:/events
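events.k8s.io is enabled off the core v1 events storage created just above, and the reflector line shows the standard list-then-watch handoff: LIST yields a snapshot plus its revision, then WATCH resumes from just after that revision. A plain-Go sketch with fake list/watch functions; the revision values mirror the "(rev: N)" numbers in the log:

```go
// Plain-Go sketch of the reflector's list-then-watch handoff, with fake
// list/watch functions; revisions mirror the "(rev: N)" values in the log.
package main

import "fmt"

type event struct {
	rev uint64
	key string
}

func list() (objs []string, rev uint64) { return nil, 36408 } // snapshot + revision

func watch(fromRev uint64) <-chan event { // resume just after the snapshot
	ch := make(chan event, 1)
	ch <- event{rev: fromRev + 1, key: "/events/default/example"}
	close(ch)
	return ch
}

func main() {
	objs, rev := list()
	fmt.Printf("Replace watchCache (rev: %d), %d objects\n", rev, len(objs))
	for ev := range watch(rev) {
		fmt.Println("apply event at rev", ev.rev, "for", ev.key)
	}
}
```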
I0814 13:24:32.254215  108836 storage_factory.go:285] storing tokenreviews.authentication.k8s.io in authentication.k8s.io/v1, reading as authentication.k8s.io/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.254400  108836 storage_factory.go:285] storing tokenreviews.authentication.k8s.io in authentication.k8s.io/v1, reading as authentication.k8s.io/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.254711  108836 storage_factory.go:285] storing localsubjectaccessreviews.authorization.k8s.io in authorization.k8s.io/v1, reading as authorization.k8s.io/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.254878  108836 storage_factory.go:285] storing selfsubjectaccessreviews.authorization.k8s.io in authorization.k8s.io/v1, reading as authorization.k8s.io/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.255003  108836 storage_factory.go:285] storing selfsubjectrulesreviews.authorization.k8s.io in authorization.k8s.io/v1, reading as authorization.k8s.io/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.255103  108836 watch_cache.go:405] Replace watchCache (rev: 36408) 
I0814 13:24:32.255105  108836 storage_factory.go:285] storing subjectaccessreviews.authorization.k8s.io in authorization.k8s.io/v1, reading as authorization.k8s.io/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.255470  108836 storage_factory.go:285] storing localsubjectaccessreviews.authorization.k8s.io in authorization.k8s.io/v1, reading as authorization.k8s.io/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.255669  108836 storage_factory.go:285] storing selfsubjectaccessreviews.authorization.k8s.io in authorization.k8s.io/v1, reading as authorization.k8s.io/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.255828  108836 storage_factory.go:285] storing selfsubjectrulesreviews.authorization.k8s.io in authorization.k8s.io/v1, reading as authorization.k8s.io/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.255980  108836 storage_factory.go:285] storing subjectaccessreviews.authorization.k8s.io in authorization.k8s.io/v1, reading as authorization.k8s.io/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
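Unlike the etcd-backed resources earlier, the authentication and authorization review resources above produce no "Monitoring ... count" or reflector lines: they are virtual, create-only resources, evaluated per request with nothing persisted. A hedged sketch of that shape, with a toy allow rule standing in for the real handler:

```go
// Hedged sketch of a virtual, create-only review resource: the request is
// evaluated in memory and answered; nothing is persisted, so no store,
// reflector, or count monitor appears for it in the log.
package main

import "fmt"

type subjectAccessReview struct {
	User, Verb, Resource string
	Allowed              bool
}

// create stands in for the real handler, with a toy allow rule.
func create(sar subjectAccessReview) subjectAccessReview {
	sar.Allowed = sar.User == "admin"
	return sar
}

func main() {
	out := create(subjectAccessReview{User: "admin", Verb: "list", Resource: "pods"})
	fmt.Printf("allowed=%v\n", out.Allowed)
}
```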
I0814 13:24:32.257017  108836 storage_factory.go:285] storing horizontalpodautoscalers.autoscaling in autoscaling/v1, reading as autoscaling/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.257280  108836 storage_factory.go:285] storing horizontalpodautoscalers.autoscaling in autoscaling/v1, reading as autoscaling/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.258106  108836 storage_factory.go:285] storing horizontalpodautoscalers.autoscaling in autoscaling/v1, reading as autoscaling/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.258378  108836 storage_factory.go:285] storing horizontalpodautoscalers.autoscaling in autoscaling/v1, reading as autoscaling/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.259405  108836 storage_factory.go:285] storing horizontalpodautoscalers.autoscaling in autoscaling/v1, reading as autoscaling/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.259823  108836 storage_factory.go:285] storing horizontalpodautoscalers.autoscaling in autoscaling/v1, reading as autoscaling/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.260777  108836 storage_factory.go:285] storing jobs.batch in batch/v1, reading as batch/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.261054  108836 storage_factory.go:285] storing jobs.batch in batch/v1, reading as batch/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.262016  108836 storage_factory.go:285] storing cronjobs.batch in batch/v1beta1, reading as batch/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.262300  108836 storage_factory.go:285] storing cronjobs.batch in batch/v1beta1, reading as batch/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
W0814 13:24:32.262353  108836 genericapiserver.go:390] Skipping API batch/v2alpha1 because it has no resources.
I0814 13:24:32.263164  108836 storage_factory.go:285] storing certificatesigningrequests.certificates.k8s.io in certificates.k8s.io/v1beta1, reading as certificates.k8s.io/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.263314  108836 storage_factory.go:285] storing certificatesigningrequests.certificates.k8s.io in certificates.k8s.io/v1beta1, reading as certificates.k8s.io/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.263566  108836 storage_factory.go:285] storing certificatesigningrequests.certificates.k8s.io in certificates.k8s.io/v1beta1, reading as certificates.k8s.io/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.264514  108836 storage_factory.go:285] storing leases.coordination.k8s.io in coordination.k8s.io/v1beta1, reading as coordination.k8s.io/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.265197  108836 storage_factory.go:285] storing leases.coordination.k8s.io in coordination.k8s.io/v1beta1, reading as coordination.k8s.io/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.266305  108836 storage_factory.go:285] storing ingresses.extensions in extensions/v1beta1, reading as extensions/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.266715  108836 storage_factory.go:285] storing ingresses.extensions in extensions/v1beta1, reading as extensions/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.267705  108836 storage_factory.go:285] storing networkpolicies.networking.k8s.io in networking.k8s.io/v1, reading as networking.k8s.io/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.268443  108836 storage_factory.go:285] storing ingresses.networking.k8s.io in networking.k8s.io/v1beta1, reading as networking.k8s.io/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.268725  108836 storage_factory.go:285] storing ingresses.networking.k8s.io in networking.k8s.io/v1beta1, reading as networking.k8s.io/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.269352  108836 storage_factory.go:285] storing runtimeclasses.node.k8s.io in node.k8s.io/v1beta1, reading as node.k8s.io/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
W0814 13:24:32.269412  108836 genericapiserver.go:390] Skipping API node.k8s.io/v1alpha1 because it has no resources.
I0814 13:24:32.270187  108836 storage_factory.go:285] storing poddisruptionbudgets.policy in policy/v1beta1, reading as policy/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.270613  108836 storage_factory.go:285] storing poddisruptionbudgets.policy in policy/v1beta1, reading as policy/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.271128  108836 storage_factory.go:285] storing podsecuritypolicies.policy in policy/v1beta1, reading as policy/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.271943  108836 storage_factory.go:285] storing clusterrolebindings.rbac.authorization.k8s.io in rbac.authorization.k8s.io/v1, reading as rbac.authorization.k8s.io/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.272341  108836 storage_factory.go:285] storing clusterroles.rbac.authorization.k8s.io in rbac.authorization.k8s.io/v1, reading as rbac.authorization.k8s.io/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.273171  108836 storage_factory.go:285] storing rolebindings.rbac.authorization.k8s.io in rbac.authorization.k8s.io/v1, reading as rbac.authorization.k8s.io/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.273758  108836 storage_factory.go:285] storing roles.rbac.authorization.k8s.io in rbac.authorization.k8s.io/v1, reading as rbac.authorization.k8s.io/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.274400  108836 storage_factory.go:285] storing clusterrolebindings.rbac.authorization.k8s.io in rbac.authorization.k8s.io/v1, reading as rbac.authorization.k8s.io/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.274982  108836 storage_factory.go:285] storing clusterroles.rbac.authorization.k8s.io in rbac.authorization.k8s.io/v1, reading as rbac.authorization.k8s.io/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.275468  108836 storage_factory.go:285] storing rolebindings.rbac.authorization.k8s.io in rbac.authorization.k8s.io/v1, reading as rbac.authorization.k8s.io/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.276251  108836 storage_factory.go:285] storing roles.rbac.authorization.k8s.io in rbac.authorization.k8s.io/v1, reading as rbac.authorization.k8s.io/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
W0814 13:24:32.276327  108836 genericapiserver.go:390] Skipping API rbac.authorization.k8s.io/v1alpha1 because it has no resources.
I0814 13:24:32.277262  108836 storage_factory.go:285] storing priorityclasses.scheduling.k8s.io in scheduling.k8s.io/v1, reading as scheduling.k8s.io/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.278186  108836 storage_factory.go:285] storing priorityclasses.scheduling.k8s.io in scheduling.k8s.io/v1, reading as scheduling.k8s.io/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
W0814 13:24:32.278305  108836 genericapiserver.go:390] Skipping API scheduling.k8s.io/v1alpha1 because it has no resources.
I0814 13:24:32.279115  108836 storage_factory.go:285] storing storageclasses.storage.k8s.io in storage.k8s.io/v1, reading as storage.k8s.io/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.279680  108836 storage_factory.go:285] storing volumeattachments.storage.k8s.io in storage.k8s.io/v1, reading as storage.k8s.io/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.279904  108836 storage_factory.go:285] storing volumeattachments.storage.k8s.io in storage.k8s.io/v1, reading as storage.k8s.io/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.280338  108836 storage_factory.go:285] storing csidrivers.storage.k8s.io in storage.k8s.io/v1beta1, reading as storage.k8s.io/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.280646  108836 storage_factory.go:285] storing csinodes.storage.k8s.io in storage.k8s.io/v1beta1, reading as storage.k8s.io/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.281257  108836 storage_factory.go:285] storing storageclasses.storage.k8s.io in storage.k8s.io/v1, reading as storage.k8s.io/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.281966  108836 storage_factory.go:285] storing volumeattachments.storage.k8s.io in storage.k8s.io/v1, reading as storage.k8s.io/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
W0814 13:24:32.282073  108836 genericapiserver.go:390] Skipping API storage.k8s.io/v1alpha1 because it has no resources.
I0814 13:24:32.282702  108836 storage_factory.go:285] storing controllerrevisions.apps in apps/v1, reading as apps/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.283340  108836 storage_factory.go:285] storing daemonsets.apps in apps/v1, reading as apps/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.283744  108836 storage_factory.go:285] storing daemonsets.apps in apps/v1, reading as apps/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.284465  108836 storage_factory.go:285] storing deployments.apps in apps/v1, reading as apps/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.284644  108836 storage_factory.go:285] storing deployments.apps in apps/v1, reading as apps/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.284810  108836 storage_factory.go:285] storing deployments.apps in apps/v1, reading as apps/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.285407  108836 storage_factory.go:285] storing replicasets.apps in apps/v1, reading as apps/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.285635  108836 storage_factory.go:285] storing replicasets.apps in apps/v1, reading as apps/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.286014  108836 storage_factory.go:285] storing replicasets.apps in apps/v1, reading as apps/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.286724  108836 storage_factory.go:285] storing statefulsets.apps in apps/v1, reading as apps/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.287073  108836 storage_factory.go:285] storing statefulsets.apps in apps/v1, reading as apps/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.287308  108836 storage_factory.go:285] storing statefulsets.apps in apps/v1, reading as apps/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
W0814 13:24:32.287420  108836 genericapiserver.go:390] Skipping API apps/v1beta2 because it has no resources.
W0814 13:24:32.287429  108836 genericapiserver.go:390] Skipping API apps/v1beta1 because it has no resources.
I0814 13:24:32.288089  108836 storage_factory.go:285] storing mutatingwebhookconfigurations.admissionregistration.k8s.io in admissionregistration.k8s.io/v1beta1, reading as admissionregistration.k8s.io/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.288578  108836 storage_factory.go:285] storing validatingwebhookconfigurations.admissionregistration.k8s.io in admissionregistration.k8s.io/v1beta1, reading as admissionregistration.k8s.io/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.289117  108836 storage_factory.go:285] storing mutatingwebhookconfigurations.admissionregistration.k8s.io in admissionregistration.k8s.io/v1beta1, reading as admissionregistration.k8s.io/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.289759  108836 storage_factory.go:285] storing validatingwebhookconfigurations.admissionregistration.k8s.io in admissionregistration.k8s.io/v1beta1, reading as admissionregistration.k8s.io/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
I0814 13:24:32.290353  108836 storage_factory.go:285] storing events.events.k8s.io in events.k8s.io/v1beta1, reading as events.k8s.io/__internal from storagebackend.Config{Type:"", Prefix:"34b65b7e-1b5e-4a40-a595-4576bacb63f4", Transport:storagebackend.TransportConfig{ServerList:[]string{"http://127.0.0.1:2379"}, KeyFile:"", CertFile:"", CAFile:""}, Paging:true, Codec:runtime.Codec(nil), EncodeVersioner:runtime.GroupVersioner(nil), Transformer:value.Transformer(nil), CompactionInterval:300000000000, CountMetricPollPeriod:60000000000}
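
Each "storing X in G/V, reading as G/__internal" line above records the storage codec chosen for one resource: objects are written to etcd in the named external version and decoded through the group's internal version. The Config dump prints its duration fields as raw nanoseconds; a minimal standard-library check (nothing here beyond the values printed above) confirms what they correspond to:

package main

import (
	"fmt"
	"time"
)

func main() {
	fmt.Println(time.Duration(300000000000)) // CompactionInterval: 5m0s
	fmt.Println(time.Duration(60000000000))  // CountMetricPollPeriod: 1m0s
}
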
I0814 13:24:32.292174  108836 healthz.go:169] healthz check etcd failed: etcd client connection not yet established
I0814 13:24:32.292200  108836 healthz.go:169] healthz check poststarthook/bootstrap-controller failed: not finished
I0814 13:24:32.292210  108836 healthz.go:169] healthz check poststarthook/rbac/bootstrap-roles failed: not finished
I0814 13:24:32.292223  108836 healthz.go:169] healthz check poststarthook/scheduling/bootstrap-system-priority-classes failed: not finished
I0814 13:24:32.292232  108836 healthz.go:169] healthz check poststarthook/ca-registration failed: not finished
I0814 13:24:32.292238  108836 healthz.go:183] [+]ping ok
[+]log ok
[-]etcd failed: reason withheld
[+]poststarthook/generic-apiserver-start-informers ok
[-]poststarthook/bootstrap-controller failed: reason withheld
[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
[-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld
[-]poststarthook/ca-registration failed: reason withheld
healthz check failed
I0814 13:24:32.292278  108836 httplog.go:90] GET /healthz: (250.593µs) 0 [Go-http-client/1.1 127.0.0.1:48838]
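
The bracketed [+]/[-] lines are the verbose /healthz payload: one line per registered check, with failure reasons withheld, followed by an overall verdict. The repeated GET /healthz entries that follow are the harness polling until every check passes. A minimal sketch of such a poll loop, assuming a placeholder server address (the test server's actual address is not shown in the log):

package main

import (
	"fmt"
	"io/ioutil"
	"net/http"
	"time"
)

func main() {
	const url = "http://127.0.0.1:8080/healthz?verbose" // placeholder address
	for {
		resp, err := http.Get(url)
		if err == nil {
			body, _ := ioutil.ReadAll(resp.Body)
			resp.Body.Close()
			fmt.Printf("%d\n%s\n", resp.StatusCode, body)
			if resp.StatusCode == http.StatusOK {
				return // every check reported [+]
			}
		}
		time.Sleep(100 * time.Millisecond) // roughly the cadence seen above
	}
}
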
I0814 13:24:32.292969  108836 httplog.go:90] GET /api/v1/namespaces/default/endpoints/kubernetes: (890.012µs) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:48840]
I0814 13:24:32.294929  108836 httplog.go:90] GET /api/v1/services: (976.046µs) 200 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:48840]
I0814 13:24:32.297903  108836 httplog.go:90] GET /api/v1/services: (774.238µs) 200 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:48840]
I0814 13:24:32.299753  108836 healthz.go:169] healthz check etcd failed: etcd client connection not yet established
I0814 13:24:32.299774  108836 healthz.go:169] healthz check poststarthook/rbac/bootstrap-roles failed: not finished
I0814 13:24:32.299786  108836 healthz.go:169] healthz check poststarthook/scheduling/bootstrap-system-priority-classes failed: not finished
I0814 13:24:32.299795  108836 healthz.go:169] healthz check poststarthook/ca-registration failed: not finished
I0814 13:24:32.299803  108836 healthz.go:183] [+]ping ok
[+]log ok
[-]etcd failed: reason withheld
[+]poststarthook/generic-apiserver-start-informers ok
[+]poststarthook/bootstrap-controller ok
[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
[-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld
[-]poststarthook/ca-registration failed: reason withheld
healthz check failed
I0814 13:24:32.299828  108836 httplog.go:90] GET /healthz: (176.99µs) 0 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:48840]
I0814 13:24:32.300706  108836 httplog.go:90] GET /api/v1/namespaces/kube-system: (983.927µs) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:48838]
I0814 13:24:32.301233  108836 httplog.go:90] GET /api/v1/services: (797.878µs) 200 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:48840]
I0814 13:24:32.302401  108836 httplog.go:90] GET /api/v1/services: (938.797µs) 200 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:48842]
I0814 13:24:32.304190  108836 httplog.go:90] POST /api/v1/namespaces: (3.082668ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:48838]
I0814 13:24:32.305243  108836 httplog.go:90] GET /api/v1/namespaces/kube-public: (732.944µs) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:48842]
I0814 13:24:32.306545  108836 httplog.go:90] POST /api/v1/namespaces: (980.657µs) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:48842]
I0814 13:24:32.307870  108836 httplog.go:90] GET /api/v1/namespaces/kube-node-lease: (887.339µs) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:48842]
I0814 13:24:32.309528  108836 httplog.go:90] POST /api/v1/namespaces: (1.284453ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:48842]
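
The three GET-404/POST-201 pairs above are the bootstrap controller ensuring the kube-system, kube-public, and kube-node-lease namespaces exist. A sketch of the same get-then-create pattern using client-go with the signatures of that era (pre-context, roughly v1.16); this is not the apiserver's internal bootstrap code, and the kubeconfig path is a placeholder:

package main

import (
	corev1 "k8s.io/api/core/v1"
	apierrors "k8s.io/apimachinery/pkg/api/errors"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	config, err := clientcmd.BuildConfigFromFlags("", "/path/to/kubeconfig") // placeholder
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(config)
	if err != nil {
		panic(err)
	}
	for _, name := range []string{"kube-system", "kube-public", "kube-node-lease"} {
		// GET first (the 404s above), create only if missing (the 201s above).
		_, err := cs.CoreV1().Namespaces().Get(name, metav1.GetOptions{})
		if apierrors.IsNotFound(err) {
			_, err = cs.CoreV1().Namespaces().Create(&corev1.Namespace{
				ObjectMeta: metav1.ObjectMeta{Name: name},
			})
		}
		if err != nil {
			panic(err)
		}
	}
}
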
I0814 13:24:32.392973  108836 healthz.go:169] healthz check etcd failed: etcd client connection not yet established
I0814 13:24:32.393006  108836 healthz.go:169] healthz check poststarthook/rbac/bootstrap-roles failed: not finished
I0814 13:24:32.393019  108836 healthz.go:169] healthz check poststarthook/scheduling/bootstrap-system-priority-classes failed: not finished
I0814 13:24:32.393027  108836 healthz.go:169] healthz check poststarthook/ca-registration failed: not finished
I0814 13:24:32.393034  108836 healthz.go:183] [+]ping ok
[+]log ok
[-]etcd failed: reason withheld
[+]poststarthook/generic-apiserver-start-informers ok
[+]poststarthook/bootstrap-controller ok
[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
[-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld
[-]poststarthook/ca-registration failed: reason withheld
healthz check failed
I0814 13:24:32.393062  108836 httplog.go:90] GET /healthz: (202.185µs) 0 [Go-http-client/1.1 127.0.0.1:48842]
I0814 13:24:32.400441  108836 healthz.go:169] healthz check etcd failed: etcd client connection not yet established
I0814 13:24:32.400480  108836 healthz.go:169] healthz check poststarthook/rbac/bootstrap-roles failed: not finished
I0814 13:24:32.400487  108836 healthz.go:169] healthz check poststarthook/scheduling/bootstrap-system-priority-classes failed: not finished
I0814 13:24:32.400493  108836 healthz.go:169] healthz check poststarthook/ca-registration failed: not finished
I0814 13:24:32.400497  108836 healthz.go:183] [+]ping ok
[+]log ok
[-]etcd failed: reason withheld
[+]poststarthook/generic-apiserver-start-informers ok
[+]poststarthook/bootstrap-controller ok
[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
[-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld
[-]poststarthook/ca-registration failed: reason withheld
healthz check failed
I0814 13:24:32.400517  108836 httplog.go:90] GET /healthz: (175.895µs) 0 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:48842]
I0814 13:24:32.492973  108836 healthz.go:169] healthz check etcd failed: etcd client connection not yet established
I0814 13:24:32.493005  108836 healthz.go:169] healthz check poststarthook/rbac/bootstrap-roles failed: not finished
I0814 13:24:32.493013  108836 healthz.go:169] healthz check poststarthook/scheduling/bootstrap-system-priority-classes failed: not finished
I0814 13:24:32.493019  108836 healthz.go:169] healthz check poststarthook/ca-registration failed: not finished
I0814 13:24:32.493024  108836 healthz.go:183] [+]ping ok
[+]log ok
[-]etcd failed: reason withheld
[+]poststarthook/generic-apiserver-start-informers ok
[+]poststarthook/bootstrap-controller ok
[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
[-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld
[-]poststarthook/ca-registration failed: reason withheld
healthz check failed
I0814 13:24:32.493052  108836 httplog.go:90] GET /healthz: (189.672µs) 0 [Go-http-client/1.1 127.0.0.1:48842]
I0814 13:24:32.500462  108836 healthz.go:169] healthz check etcd failed: etcd client connection not yet established
I0814 13:24:32.500495  108836 healthz.go:169] healthz check poststarthook/rbac/bootstrap-roles failed: not finished
I0814 13:24:32.500507  108836 healthz.go:169] healthz check poststarthook/scheduling/bootstrap-system-priority-classes failed: not finished
I0814 13:24:32.500527  108836 healthz.go:169] healthz check poststarthook/ca-registration failed: not finished
I0814 13:24:32.500534  108836 healthz.go:183] [+]ping ok
[+]log ok
[-]etcd failed: reason withheld
[+]poststarthook/generic-apiserver-start-informers ok
[+]poststarthook/bootstrap-controller ok
[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
[-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld
[-]poststarthook/ca-registration failed: reason withheld
healthz check failed
I0814 13:24:32.500569  108836 httplog.go:90] GET /healthz: (234.228µs) 0 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:48842]
I0814 13:24:32.593165  108836 healthz.go:169] healthz check etcd failed: etcd client connection not yet established
I0814 13:24:32.593252  108836 healthz.go:169] healthz check poststarthook/rbac/bootstrap-roles failed: not finished
I0814 13:24:32.593266  108836 healthz.go:169] healthz check poststarthook/scheduling/bootstrap-system-priority-classes failed: not finished
I0814 13:24:32.593276  108836 healthz.go:169] healthz check poststarthook/ca-registration failed: not finished
I0814 13:24:32.593293  108836 healthz.go:183] [+]ping ok
[+]log ok
[-]etcd failed: reason withheld
[+]poststarthook/generic-apiserver-start-informers ok
[+]poststarthook/bootstrap-controller ok
[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
[-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld
[-]poststarthook/ca-registration failed: reason withheld
healthz check failed
I0814 13:24:32.593391  108836 httplog.go:90] GET /healthz: (356.452µs) 0 [Go-http-client/1.1 127.0.0.1:48842]
I0814 13:24:32.600468  108836 healthz.go:169] healthz check etcd failed: etcd client connection not yet established
I0814 13:24:32.600495  108836 healthz.go:169] healthz check poststarthook/rbac/bootstrap-roles failed: not finished
I0814 13:24:32.600508  108836 healthz.go:169] healthz check poststarthook/scheduling/bootstrap-system-priority-classes failed: not finished
I0814 13:24:32.600522  108836 healthz.go:169] healthz check poststarthook/ca-registration failed: not finished
I0814 13:24:32.600529  108836 healthz.go:183] [+]ping ok
[+]log ok
[-]etcd failed: reason withheld
[+]poststarthook/generic-apiserver-start-informers ok
[+]poststarthook/bootstrap-controller ok
[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
[-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld
[-]poststarthook/ca-registration failed: reason withheld
healthz check failed
I0814 13:24:32.600583  108836 httplog.go:90] GET /healthz: (204.785µs) 0 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:48842]
I0814 13:24:32.693107  108836 healthz.go:169] healthz check etcd failed: etcd client connection not yet established
I0814 13:24:32.693139  108836 healthz.go:169] healthz check poststarthook/rbac/bootstrap-roles failed: not finished
I0814 13:24:32.693150  108836 healthz.go:169] healthz check poststarthook/scheduling/bootstrap-system-priority-classes failed: not finished
I0814 13:24:32.693160  108836 healthz.go:169] healthz check poststarthook/ca-registration failed: not finished
I0814 13:24:32.693168  108836 healthz.go:183] [+]ping ok
[+]log ok
[-]etcd failed: reason withheld
[+]poststarthook/generic-apiserver-start-informers ok
[+]poststarthook/bootstrap-controller ok
[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
[-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld
[-]poststarthook/ca-registration failed: reason withheld
healthz check failed
I0814 13:24:32.693198  108836 httplog.go:90] GET /healthz: (245.508µs) 0 [Go-http-client/1.1 127.0.0.1:48842]
I0814 13:24:32.700429  108836 healthz.go:169] healthz check etcd failed: etcd client connection not yet established
I0814 13:24:32.700462  108836 healthz.go:169] healthz check poststarthook/rbac/bootstrap-roles failed: not finished
I0814 13:24:32.700473  108836 healthz.go:169] healthz check poststarthook/scheduling/bootstrap-system-priority-classes failed: not finished
I0814 13:24:32.700499  108836 healthz.go:169] healthz check poststarthook/ca-registration failed: not finished
I0814 13:24:32.700506  108836 healthz.go:183] [+]ping ok
[+]log ok
[-]etcd failed: reason withheld
[+]poststarthook/generic-apiserver-start-informers ok
[+]poststarthook/bootstrap-controller ok
[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
[-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld
[-]poststarthook/ca-registration failed: reason withheld
healthz check failed
I0814 13:24:32.700543  108836 httplog.go:90] GET /healthz: (202.507µs) 0 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:48842]
I0814 13:24:32.793562  108836 healthz.go:169] healthz check etcd failed: etcd client connection not yet established
I0814 13:24:32.793594  108836 healthz.go:169] healthz check poststarthook/rbac/bootstrap-roles failed: not finished
I0814 13:24:32.793603  108836 healthz.go:169] healthz check poststarthook/scheduling/bootstrap-system-priority-classes failed: not finished
I0814 13:24:32.793609  108836 healthz.go:169] healthz check poststarthook/ca-registration failed: not finished
I0814 13:24:32.793615  108836 healthz.go:183] [+]ping ok
[+]log ok
[-]etcd failed: reason withheld
[+]poststarthook/generic-apiserver-start-informers ok
[+]poststarthook/bootstrap-controller ok
[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
[-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld
[-]poststarthook/ca-registration failed: reason withheld
healthz check failed
I0814 13:24:32.793645  108836 httplog.go:90] GET /healthz: (258.965µs) 0 [Go-http-client/1.1 127.0.0.1:48842]
I0814 13:24:32.800382  108836 healthz.go:169] healthz check etcd failed: etcd client connection not yet established
I0814 13:24:32.800403  108836 healthz.go:169] healthz check poststarthook/rbac/bootstrap-roles failed: not finished
I0814 13:24:32.800411  108836 healthz.go:169] healthz check poststarthook/scheduling/bootstrap-system-priority-classes failed: not finished
I0814 13:24:32.800418  108836 healthz.go:169] healthz check poststarthook/ca-registration failed: not finished
I0814 13:24:32.800423  108836 healthz.go:183] [+]ping ok
[+]log ok
[-]etcd failed: reason withheld
[+]poststarthook/generic-apiserver-start-informers ok
[+]poststarthook/bootstrap-controller ok
[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
[-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld
[-]poststarthook/ca-registration failed: reason withheld
healthz check failed
I0814 13:24:32.800461  108836 httplog.go:90] GET /healthz: (176.057µs) 0 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:48842]
I0814 13:24:32.893040  108836 healthz.go:169] healthz check etcd failed: etcd client connection not yet established
I0814 13:24:32.893072  108836 healthz.go:169] healthz check poststarthook/rbac/bootstrap-roles failed: not finished
I0814 13:24:32.893081  108836 healthz.go:169] healthz check poststarthook/scheduling/bootstrap-system-priority-classes failed: not finished
I0814 13:24:32.893101  108836 healthz.go:169] healthz check poststarthook/ca-registration failed: not finished
I0814 13:24:32.893107  108836 healthz.go:183] [+]ping ok
[+]log ok
[-]etcd failed: reason withheld
[+]poststarthook/generic-apiserver-start-informers ok
[+]poststarthook/bootstrap-controller ok
[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
[-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld
[-]poststarthook/ca-registration failed: reason withheld
healthz check failed
I0814 13:24:32.893129  108836 httplog.go:90] GET /healthz: (204.991µs) 0 [Go-http-client/1.1 127.0.0.1:48842]
I0814 13:24:32.900389  108836 healthz.go:169] healthz check etcd failed: etcd client connection not yet established
I0814 13:24:32.900414  108836 healthz.go:169] healthz check poststarthook/rbac/bootstrap-roles failed: not finished
I0814 13:24:32.900422  108836 healthz.go:169] healthz check poststarthook/scheduling/bootstrap-system-priority-classes failed: not finished
I0814 13:24:32.900428  108836 healthz.go:169] healthz check poststarthook/ca-registration failed: not finished
I0814 13:24:32.900433  108836 healthz.go:183] [+]ping ok
[+]log ok
[-]etcd failed: reason withheld
[+]poststarthook/generic-apiserver-start-informers ok
[+]poststarthook/bootstrap-controller ok
[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
[-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld
[-]poststarthook/ca-registration failed: reason withheld
healthz check failed
I0814 13:24:32.900459  108836 httplog.go:90] GET /healthz: (163.158µs) 0 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:48842]
I0814 13:24:32.993033  108836 healthz.go:169] healthz check etcd failed: etcd client connection not yet established
I0814 13:24:32.993070  108836 healthz.go:169] healthz check poststarthook/rbac/bootstrap-roles failed: not finished
I0814 13:24:32.993082  108836 healthz.go:169] healthz check poststarthook/scheduling/bootstrap-system-priority-classes failed: not finished
I0814 13:24:32.993092  108836 healthz.go:169] healthz check poststarthook/ca-registration failed: not finished
I0814 13:24:32.993100  108836 healthz.go:183] [+]ping ok
[+]log ok
[-]etcd failed: reason withheld
[+]poststarthook/generic-apiserver-start-informers ok
[+]poststarthook/bootstrap-controller ok
[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
[-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld
[-]poststarthook/ca-registration failed: reason withheld
healthz check failed
I0814 13:24:32.993130  108836 httplog.go:90] GET /healthz: (225.503µs) 0 [Go-http-client/1.1 127.0.0.1:48842]
I0814 13:24:33.000439  108836 healthz.go:169] healthz check etcd failed: etcd client connection not yet established
I0814 13:24:33.000466  108836 healthz.go:169] healthz check poststarthook/rbac/bootstrap-roles failed: not finished
I0814 13:24:33.000478  108836 healthz.go:169] healthz check poststarthook/scheduling/bootstrap-system-priority-classes failed: not finished
I0814 13:24:33.000487  108836 healthz.go:169] healthz check poststarthook/ca-registration failed: not finished
I0814 13:24:33.000495  108836 healthz.go:183] [+]ping ok
[+]log ok
[-]etcd failed: reason withheld
[+]poststarthook/generic-apiserver-start-informers ok
[+]poststarthook/bootstrap-controller ok
[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
[-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld
[-]poststarthook/ca-registration failed: reason withheld
healthz check failed
I0814 13:24:33.000519  108836 httplog.go:90] GET /healthz: (197.795µs) 0 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:48842]
I0814 13:24:33.093019  108836 healthz.go:169] healthz check etcd failed: etcd client connection not yet established
I0814 13:24:33.093048  108836 healthz.go:169] healthz check poststarthook/rbac/bootstrap-roles failed: not finished
I0814 13:24:33.093056  108836 healthz.go:169] healthz check poststarthook/scheduling/bootstrap-system-priority-classes failed: not finished
I0814 13:24:33.093063  108836 healthz.go:169] healthz check poststarthook/ca-registration failed: not finished
I0814 13:24:33.093069  108836 healthz.go:183] [+]ping ok
[+]log ok
[-]etcd failed: reason withheld
[+]poststarthook/generic-apiserver-start-informers ok
[+]poststarthook/bootstrap-controller ok
[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
[-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld
[-]poststarthook/ca-registration failed: reason withheld
healthz check failed
I0814 13:24:33.093092  108836 httplog.go:90] GET /healthz: (198.284µs) 0 [Go-http-client/1.1 127.0.0.1:48842]
I0814 13:24:33.100341  108836 healthz.go:169] healthz check etcd failed: etcd client connection not yet established
I0814 13:24:33.100377  108836 healthz.go:169] healthz check poststarthook/rbac/bootstrap-roles failed: not finished
I0814 13:24:33.100388  108836 healthz.go:169] healthz check poststarthook/scheduling/bootstrap-system-priority-classes failed: not finished
I0814 13:24:33.100394  108836 healthz.go:169] healthz check poststarthook/ca-registration failed: not finished
I0814 13:24:33.100400  108836 healthz.go:183] [+]ping ok
[+]log ok
[-]etcd failed: reason withheld
[+]poststarthook/generic-apiserver-start-informers ok
[+]poststarthook/bootstrap-controller ok
[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
[-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld
[-]poststarthook/ca-registration failed: reason withheld
healthz check failed
I0814 13:24:33.100418  108836 httplog.go:90] GET /healthz: (168.859µs) 0 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:48842]
I0814 13:24:33.151989  108836 client.go:354] parsed scheme: ""
I0814 13:24:33.152017  108836 client.go:354] scheme "" not registered, fallback to default scheme
I0814 13:24:33.152068  108836 asm_amd64.s:1337] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I0814 13:24:33.152147  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:33.152567  108836 balancer_conn_wrappers.go:131] clientv3/balancer: pin "127.0.0.1:2379"
I0814 13:24:33.152629  108836 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0814 13:24:33.193597  108836 healthz.go:169] healthz check poststarthook/rbac/bootstrap-roles failed: not finished
I0814 13:24:33.193619  108836 healthz.go:169] healthz check poststarthook/scheduling/bootstrap-system-priority-classes failed: not finished
I0814 13:24:33.193625  108836 healthz.go:169] healthz check poststarthook/ca-registration failed: not finished
I0814 13:24:33.193631  108836 healthz.go:183] [+]ping ok
[+]log ok
[+]etcd ok
[+]poststarthook/generic-apiserver-start-informers ok
[+]poststarthook/bootstrap-controller ok
[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
[-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld
[-]poststarthook/ca-registration failed: reason withheld
healthz check failed
I0814 13:24:33.193703  108836 httplog.go:90] GET /healthz: (807.404µs) 0 [Go-http-client/1.1 127.0.0.1:48842]
I0814 13:24:33.200965  108836 healthz.go:169] healthz check poststarthook/rbac/bootstrap-roles failed: not finished
I0814 13:24:33.200992  108836 healthz.go:169] healthz check poststarthook/scheduling/bootstrap-system-priority-classes failed: not finished
I0814 13:24:33.201019  108836 healthz.go:169] healthz check poststarthook/ca-registration failed: not finished
I0814 13:24:33.201027  108836 healthz.go:183] [+]ping ok
[+]log ok
[+]etcd ok
[+]poststarthook/generic-apiserver-start-informers ok
[+]poststarthook/bootstrap-controller ok
[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
[-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld
[-]poststarthook/ca-registration failed: reason withheld
healthz check failed
I0814 13:24:33.201080  108836 httplog.go:90] GET /healthz: (751.785µs) 0 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:48842]
I0814 13:24:33.293219  108836 httplog.go:90] GET /apis/scheduling.k8s.io/v1beta1/priorityclasses/system-node-critical: (1.154407ms) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:48840]
I0814 13:24:33.293234  108836 httplog.go:90] GET /api/v1/namespaces/kube-system: (1.174534ms) 200 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:48842]
I0814 13:24:33.293451  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterroles: (1.014396ms) 200 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49100]
I0814 13:24:33.294400  108836 healthz.go:169] healthz check poststarthook/rbac/bootstrap-roles failed: not finished
I0814 13:24:33.294424  108836 healthz.go:169] healthz check poststarthook/scheduling/bootstrap-system-priority-classes failed: not finished
I0814 13:24:33.294452  108836 healthz.go:169] healthz check poststarthook/ca-registration failed: not finished
I0814 13:24:33.294461  108836 healthz.go:183] [+]ping ok
[+]log ok
[+]etcd ok
[+]poststarthook/generic-apiserver-start-informers ok
[+]poststarthook/bootstrap-controller ok
[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
[-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld
[-]poststarthook/ca-registration failed: reason withheld
healthz check failed
I0814 13:24:33.294507  108836 httplog.go:90] GET /healthz: (1.546374ms) 0 [Go-http-client/1.1 127.0.0.1:49102]
I0814 13:24:33.294779  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterrolebindings: (1.045255ms) 200 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:48840]
I0814 13:24:33.294861  108836 httplog.go:90] GET /api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication: (1.119488ms) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49100]
I0814 13:24:33.294917  108836 httplog.go:90] POST /apis/scheduling.k8s.io/v1beta1/priorityclasses: (1.304985ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:48842]
I0814 13:24:33.295078  108836 storage_scheduling.go:139] created PriorityClass system-node-critical with value 2000001000
I0814 13:24:33.295903  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterroles/system:aggregate-to-edit: (819.138µs) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:48840]
I0814 13:24:33.296258  108836 httplog.go:90] GET /apis/scheduling.k8s.io/v1beta1/priorityclasses/system-cluster-critical: (911.665µs) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.296331  108836 httplog.go:90] POST /api/v1/namespaces/kube-system/configmaps: (1.166394ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:33.297364  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterroles/edit: (842.453µs) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:48840]
I0814 13:24:33.297923  108836 httplog.go:90] POST /apis/scheduling.k8s.io/v1beta1/priorityclasses: (1.367404ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.298077  108836 storage_scheduling.go:139] created PriorityClass system-cluster-critical with value 2000000000
I0814 13:24:33.298091  108836 storage_scheduling.go:148] all system priority classes are created successfully or already exist.
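
The two POSTs to /apis/scheduling.k8s.io/v1beta1/priorityclasses above create the built-in system PriorityClasses with the values the log reports: system-node-critical at 2000001000 and system-cluster-critical at 2000000000. A sketch recreating them via client-go (again with era-appropriate signatures and a placeholder kubeconfig; the apiserver does this internally through its post-start hook, not through a client):

package main

import (
	schedulingv1beta1 "k8s.io/api/scheduling/v1beta1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	config, err := clientcmd.BuildConfigFromFlags("", "/path/to/kubeconfig") // placeholder
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(config)
	if err != nil {
		panic(err)
	}
	for _, pc := range []schedulingv1beta1.PriorityClass{
		{ObjectMeta: metav1.ObjectMeta{Name: "system-node-critical"}, Value: 2000001000},
		{ObjectMeta: metav1.ObjectMeta{Name: "system-cluster-critical"}, Value: 2000000000},
	} {
		// Mirrors the POST /apis/scheduling.k8s.io/v1beta1/priorityclasses calls above.
		if _, err := cs.SchedulingV1beta1().PriorityClasses().Create(&pc); err != nil {
			panic(err)
		}
	}
}
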
I0814 13:24:33.298707  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterroles/system:aggregate-to-view: (694.482µs) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:48840]
I0814 13:24:33.299707  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterroles/view: (699.65µs) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.301003  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterroles/system:aggregate-to-admin: (874.991µs) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.301724  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterroles/admin: (489.694µs) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.301992  108836 healthz.go:169] healthz check poststarthook/rbac/bootstrap-roles failed: not finished
I0814 13:24:33.302020  108836 healthz.go:183] [+]ping ok
[+]log ok
[+]etcd ok
[+]poststarthook/generic-apiserver-start-informers ok
[+]poststarthook/bootstrap-controller ok
[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
[+]poststarthook/scheduling/bootstrap-system-priority-classes ok
[+]poststarthook/ca-registration ok
healthz check failed
I0814 13:24:33.302046  108836 httplog.go:90] GET /healthz: (1.2427ms) 0 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:33.302651  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterrolebindings/system:discovery: (678.569µs) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.303527  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterroles/cluster-admin: (606.478µs) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.305228  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterroles: (1.176378ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.305456  108836 storage_rbac.go:219] created clusterrole.rbac.authorization.k8s.io/cluster-admin
I0814 13:24:33.306501  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterroles/system:discovery: (852.899µs) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.308298  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterroles: (1.339133ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.308548  108836 storage_rbac.go:219] created clusterrole.rbac.authorization.k8s.io/system:discovery
I0814 13:24:33.309540  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterroles/system:basic-user: (656.177µs) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.310951  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterroles: (910.983µs) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.311201  108836 storage_rbac.go:219] created clusterrole.rbac.authorization.k8s.io/system:basic-user
I0814 13:24:33.312470  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterroles/system:public-info-viewer: (692.758µs) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.313976  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterroles: (1.076195ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.314142  108836 storage_rbac.go:219] created clusterrole.rbac.authorization.k8s.io/system:public-info-viewer
I0814 13:24:33.314938  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterroles/admin: (656.87µs) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.316758  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterroles: (1.437534ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.317563  108836 storage_rbac.go:219] created clusterrole.rbac.authorization.k8s.io/admin
I0814 13:24:33.318443  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterroles/edit: (736.694µs) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.321115  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterroles: (2.036155ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.322257  108836 storage_rbac.go:219] created clusterrole.rbac.authorization.k8s.io/edit
I0814 13:24:33.323249  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterroles/view: (792.557µs) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.324952  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterroles: (1.180054ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.325282  108836 storage_rbac.go:219] created clusterrole.rbac.authorization.k8s.io/view
I0814 13:24:33.326092  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterroles/system:aggregate-to-admin: (648.28µs) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.327896  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterroles: (1.403424ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.328148  108836 storage_rbac.go:219] created clusterrole.rbac.authorization.k8s.io/system:aggregate-to-admin
I0814 13:24:33.329531  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterroles/system:aggregate-to-edit: (1.230609ms) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.331905  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterroles: (1.702143ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.332325  108836 storage_rbac.go:219] created clusterrole.rbac.authorization.k8s.io/system:aggregate-to-edit
I0814 13:24:33.333478  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterroles/system:aggregate-to-view: (887.542µs) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.335330  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterroles: (1.373494ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.335611  108836 storage_rbac.go:219] created clusterrole.rbac.authorization.k8s.io/system:aggregate-to-view
I0814 13:24:33.336605  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterroles/system:heapster: (675.779µs) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.339308  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterroles: (1.978355ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.339578  108836 storage_rbac.go:219] created clusterrole.rbac.authorization.k8s.io/system:heapster
I0814 13:24:33.340585  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterroles/system:node: (681.698µs) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.342442  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterroles: (1.402384ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.342887  108836 storage_rbac.go:219] created clusterrole.rbac.authorization.k8s.io/system:node
I0814 13:24:33.344210  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterroles/system:node-problem-detector: (981.411µs) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.345502  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterroles: (978.173µs) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.345807  108836 storage_rbac.go:219] created clusterrole.rbac.authorization.k8s.io/system:node-problem-detector
I0814 13:24:33.346607  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterroles/system:node-proxier: (619.423µs) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.348179  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterroles: (1.103243ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.348460  108836 storage_rbac.go:219] created clusterrole.rbac.authorization.k8s.io/system:node-proxier
I0814 13:24:33.349396  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterroles/system:kubelet-api-admin: (692.053µs) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.351173  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterroles: (1.399771ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.351340  108836 storage_rbac.go:219] created clusterrole.rbac.authorization.k8s.io/system:kubelet-api-admin
I0814 13:24:33.352297  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterroles/system:node-bootstrapper: (759.014µs) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.353903  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterroles: (1.163047ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.354280  108836 storage_rbac.go:219] created clusterrole.rbac.authorization.k8s.io/system:node-bootstrapper
I0814 13:24:33.355255  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterroles/system:auth-delegator: (754.333µs) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.356935  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterroles: (1.237474ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.357302  108836 storage_rbac.go:219] created clusterrole.rbac.authorization.k8s.io/system:auth-delegator
I0814 13:24:33.358683  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterroles/system:kube-aggregator: (1.214583ms) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.361040  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterroles: (1.776696ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.361199  108836 storage_rbac.go:219] created clusterrole.rbac.authorization.k8s.io/system:kube-aggregator
I0814 13:24:33.362031  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterroles/system:kube-controller-manager: (661.848µs) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.363538  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterroles: (1.029872ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.363796  108836 storage_rbac.go:219] created clusterrole.rbac.authorization.k8s.io/system:kube-controller-manager
I0814 13:24:33.364681  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterroles/system:kube-dns: (675.841µs) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.366244  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterroles: (1.084024ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.366455  108836 storage_rbac.go:219] created clusterrole.rbac.authorization.k8s.io/system:kube-dns
I0814 13:24:33.367415  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterroles/system:persistent-volume-provisioner: (730.571µs) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.369036  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterroles: (1.148073ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.369226  108836 storage_rbac.go:219] created clusterrole.rbac.authorization.k8s.io/system:persistent-volume-provisioner
I0814 13:24:33.370231  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterroles/system:csi-external-attacher: (831.078µs) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.371771  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterroles: (1.138884ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.371975  108836 storage_rbac.go:219] created clusterrole.rbac.authorization.k8s.io/system:csi-external-attacher
I0814 13:24:33.372998  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterroles/system:certificates.k8s.io:certificatesigningrequests:nodeclient: (807.135µs) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.374681  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterroles: (1.361661ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.374930  108836 storage_rbac.go:219] created clusterrole.rbac.authorization.k8s.io/system:certificates.k8s.io:certificatesigningrequests:nodeclient
I0814 13:24:33.376292  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterroles/system:certificates.k8s.io:certificatesigningrequests:selfnodeclient: (829.752µs) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.377955  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterroles: (1.36805ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.378205  108836 storage_rbac.go:219] created clusterrole.rbac.authorization.k8s.io/system:certificates.k8s.io:certificatesigningrequests:selfnodeclient
I0814 13:24:33.379462  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterroles/system:volume-scheduler: (940.641µs) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.381238  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterroles: (1.203903ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.381435  108836 storage_rbac.go:219] created clusterrole.rbac.authorization.k8s.io/system:volume-scheduler
I0814 13:24:33.382475  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterroles/system:kube-scheduler: (802.093µs) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.384291  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterroles: (1.244013ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.384509  108836 storage_rbac.go:219] created clusterrole.rbac.authorization.k8s.io/system:kube-scheduler
I0814 13:24:33.385486  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterroles/system:csi-external-provisioner: (802.175µs) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.387879  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterroles: (1.968125ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.388134  108836 storage_rbac.go:219] created clusterrole.rbac.authorization.k8s.io/system:csi-external-provisioner
I0814 13:24:33.389464  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterroles/system:controller:attachdetach-controller: (1.133185ms) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.391451  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterroles: (1.481914ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.391737  108836 storage_rbac.go:219] created clusterrole.rbac.authorization.k8s.io/system:controller:attachdetach-controller
I0814 13:24:33.392871  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterroles/system:controller:clusterrole-aggregation-controller: (771.397µs) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.393338  108836 healthz.go:169] healthz check poststarthook/rbac/bootstrap-roles failed: not finished
I0814 13:24:33.393361  108836 healthz.go:183] [+]ping ok
[+]log ok
[+]etcd ok
[+]poststarthook/generic-apiserver-start-informers ok
[+]poststarthook/bootstrap-controller ok
[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
[+]poststarthook/scheduling/bootstrap-system-priority-classes ok
[+]poststarthook/ca-registration ok
healthz check failed
I0814 13:24:33.393382  108836 httplog.go:90] GET /healthz: (649.196µs) 0 [Go-http-client/1.1 127.0.0.1:49102]
I0814 13:24:33.394391  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterroles: (1.193645ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.394557  108836 storage_rbac.go:219] created clusterrole.rbac.authorization.k8s.io/system:controller:clusterrole-aggregation-controller
I0814 13:24:33.395430  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterroles/system:controller:cronjob-controller: (637.932µs) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.397096  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterroles: (1.180839ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.397294  108836 storage_rbac.go:219] created clusterrole.rbac.authorization.k8s.io/system:controller:cronjob-controller
I0814 13:24:33.398690  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterroles/system:controller:daemon-set-controller: (1.183741ms) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.401064  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterroles: (1.526878ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.401289  108836 healthz.go:169] healthz check poststarthook/rbac/bootstrap-roles failed: not finished
I0814 13:24:33.401309  108836 storage_rbac.go:219] created clusterrole.rbac.authorization.k8s.io/system:controller:daemon-set-controller
I0814 13:24:33.401310  108836 healthz.go:183] [+]ping ok
[+]log ok
[+]etcd ok
[+]poststarthook/generic-apiserver-start-informers ok
[+]poststarthook/bootstrap-controller ok
[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
[+]poststarthook/scheduling/bootstrap-system-priority-classes ok
[+]poststarthook/ca-registration ok
healthz check failed
I0814 13:24:33.401346  108836 httplog.go:90] GET /healthz: (1.086196ms) 0 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:33.402354  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterroles/system:controller:deployment-controller: (766.751µs) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.404144  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterroles: (1.312327ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.404423  108836 storage_rbac.go:219] created clusterrole.rbac.authorization.k8s.io/system:controller:deployment-controller
I0814 13:24:33.405371  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterroles/system:controller:disruption-controller: (823.571µs) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.409399  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterroles: (3.789692ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.409592  108836 storage_rbac.go:219] created clusterrole.rbac.authorization.k8s.io/system:controller:disruption-controller
I0814 13:24:33.410666  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterroles/system:controller:endpoint-controller: (856.939µs) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.412824  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterroles: (1.330491ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.413029  108836 storage_rbac.go:219] created clusterrole.rbac.authorization.k8s.io/system:controller:endpoint-controller
I0814 13:24:33.414096  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterroles/system:controller:expand-controller: (910.939µs) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.415924  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterroles: (1.28687ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.416234  108836 storage_rbac.go:219] created clusterrole.rbac.authorization.k8s.io/system:controller:expand-controller
I0814 13:24:33.417586  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterroles/system:controller:generic-garbage-collector: (1.197318ms) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.421391  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterroles: (1.749821ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.421665  108836 storage_rbac.go:219] created clusterrole.rbac.authorization.k8s.io/system:controller:generic-garbage-collector
I0814 13:24:33.427041  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterroles/system:controller:horizontal-pod-autoscaler: (5.089805ms) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.429698  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterroles: (2.131893ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.429951  108836 storage_rbac.go:219] created clusterrole.rbac.authorization.k8s.io/system:controller:horizontal-pod-autoscaler
I0814 13:24:33.430919  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterroles/system:controller:job-controller: (764.139µs) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.433492  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterroles: (2.049043ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.433783  108836 storage_rbac.go:219] created clusterrole.rbac.authorization.k8s.io/system:controller:job-controller
I0814 13:24:33.434684  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterroles/system:controller:namespace-controller: (711.293µs) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.436257  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterroles: (1.089154ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.436502  108836 storage_rbac.go:219] created clusterrole.rbac.authorization.k8s.io/system:controller:namespace-controller
I0814 13:24:33.437586  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterroles/system:controller:node-controller: (906.835µs) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.439087  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterroles: (1.121736ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.439306  108836 storage_rbac.go:219] created clusterrole.rbac.authorization.k8s.io/system:controller:node-controller
I0814 13:24:33.441513  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterroles/system:controller:persistent-volume-binder: (1.949241ms) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.443748  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterroles: (1.858906ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.444032  108836 storage_rbac.go:219] created clusterrole.rbac.authorization.k8s.io/system:controller:persistent-volume-binder
I0814 13:24:33.445747  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterroles/system:controller:pod-garbage-collector: (790.511µs) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.446994  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterroles: (995.827µs) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.447286  108836 storage_rbac.go:219] created clusterrole.rbac.authorization.k8s.io/system:controller:pod-garbage-collector
I0814 13:24:33.448188  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterroles/system:controller:replicaset-controller: (766.493µs) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.450076  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterroles: (1.462149ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.450419  108836 storage_rbac.go:219] created clusterrole.rbac.authorization.k8s.io/system:controller:replicaset-controller
I0814 13:24:33.451395  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterroles/system:controller:replication-controller: (750.28µs) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.456912  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterroles: (5.264642ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.457487  108836 storage_rbac.go:219] created clusterrole.rbac.authorization.k8s.io/system:controller:replication-controller
I0814 13:24:33.458549  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterroles/system:controller:resourcequota-controller: (832.105µs) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.460517  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterroles: (1.542172ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.460998  108836 storage_rbac.go:219] created clusterrole.rbac.authorization.k8s.io/system:controller:resourcequota-controller
I0814 13:24:33.461933  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterroles/system:controller:route-controller: (725.327µs) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.463625  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterroles: (1.060654ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.463789  108836 storage_rbac.go:219] created clusterrole.rbac.authorization.k8s.io/system:controller:route-controller
I0814 13:24:33.464525  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterroles/system:controller:service-account-controller: (548.171µs) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.465915  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterroles: (1.083696ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.466112  108836 storage_rbac.go:219] created clusterrole.rbac.authorization.k8s.io/system:controller:service-account-controller
I0814 13:24:33.466996  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterroles/system:controller:service-controller: (650.856µs) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.468241  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterroles: (925.756µs) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.468612  108836 storage_rbac.go:219] created clusterrole.rbac.authorization.k8s.io/system:controller:service-controller
I0814 13:24:33.469369  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterroles/system:controller:statefulset-controller: (638.392µs) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.470812  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterroles: (1.164076ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.471110  108836 storage_rbac.go:219] created clusterrole.rbac.authorization.k8s.io/system:controller:statefulset-controller
I0814 13:24:33.471932  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterroles/system:controller:ttl-controller: (659.788µs) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.473714  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterroles: (1.526533ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.474009  108836 storage_rbac.go:219] created clusterrole.rbac.authorization.k8s.io/system:controller:ttl-controller
I0814 13:24:33.493198  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterroles/system:controller:certificate-controller: (901.52µs) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.494016  108836 healthz.go:169] healthz check poststarthook/rbac/bootstrap-roles failed: not finished
I0814 13:24:33.494154  108836 healthz.go:183] [+]ping ok
[+]log ok
[+]etcd ok
[+]poststarthook/generic-apiserver-start-informers ok
[+]poststarthook/bootstrap-controller ok
[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
[+]poststarthook/scheduling/bootstrap-system-priority-classes ok
[+]poststarthook/ca-registration ok
healthz check failed
I0814 13:24:33.494398  108836 httplog.go:90] GET /healthz: (1.293372ms) 0 [Go-http-client/1.1 127.0.0.1:49102]
I0814 13:24:33.501295  108836 healthz.go:169] healthz check poststarthook/rbac/bootstrap-roles failed: not finished
I0814 13:24:33.501324  108836 healthz.go:183] [+]ping ok
[+]log ok
[+]etcd ok
[+]poststarthook/generic-apiserver-start-informers ok
[+]poststarthook/bootstrap-controller ok
[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
[+]poststarthook/scheduling/bootstrap-system-priority-classes ok
[+]poststarthook/ca-registration ok
healthz check failed
I0814 13:24:33.501358  108836 httplog.go:90] GET /healthz: (1.023179ms) 0 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
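These repeated rbac/bootstrap-roles failures are expected during startup: the default roles are reconciled from a post-start hook, and the matching healthz check keeps answering "not finished" until the hook signals completion. A hedged sketch of that gating with a done channel — the real hook registration lives in k8s.io/apiserver and looks different:

package main

import (
	"errors"
	"net/http"
)

// hookCheck fails with "not finished" until done is closed, which is
// what produces the repeated "[-]poststarthook/rbac/bootstrap-roles
// failed: reason withheld" lines above.
type hookCheck struct {
	name string
	done <-chan struct{}
}

func (h hookCheck) Name() string { return "poststarthook/" + h.name }

func (h hookCheck) Check(*http.Request) error {
	select {
	case <-h.done:
		return nil
	default:
		return errors.New("not finished")
	}
}

func main() {
	done := make(chan struct{})
	go func() {
		// ... reconcile every default clusterrole/clusterrolebinding,
		// one create-if-absent round trip per object as in the log ...
		close(done) // from here on the report shows "[+]poststarthook/rbac/bootstrap-roles ok"
	}()
	_ = hookCheck{name: "rbac/bootstrap-roles", done: done}
}

Once the last default binding is created, the hook completes and subsequent /healthz probes can return 200.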
I0814 13:24:33.513506  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterroles: (1.237703ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:33.513906  108836 storage_rbac.go:219] created clusterrole.rbac.authorization.k8s.io/system:controller:certificate-controller
I0814 13:24:33.533140  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterroles/system:controller:pvc-protection-controller: (866.893µs) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:33.554047  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterroles: (1.685156ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:33.554262  108836 storage_rbac.go:219] created clusterrole.rbac.authorization.k8s.io/system:controller:pvc-protection-controller
I0814 13:24:33.573282  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterroles/system:controller:pv-protection-controller: (971.612µs) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:33.593490  108836 healthz.go:169] healthz check poststarthook/rbac/bootstrap-roles failed: not finished
I0814 13:24:33.593521  108836 healthz.go:183] [+]ping ok
[+]log ok
[+]etcd ok
[+]poststarthook/generic-apiserver-start-informers ok
[+]poststarthook/bootstrap-controller ok
[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
[+]poststarthook/scheduling/bootstrap-system-priority-classes ok
[+]poststarthook/ca-registration ok
healthz check failed
I0814 13:24:33.593553  108836 httplog.go:90] GET /healthz: (784.067µs) 0 [Go-http-client/1.1 127.0.0.1:49104]
I0814 13:24:33.594499  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterroles: (2.231622ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:33.594754  108836 storage_rbac.go:219] created clusterrole.rbac.authorization.k8s.io/system:controller:pv-protection-controller
I0814 13:24:33.600925  108836 healthz.go:169] healthz check poststarthook/rbac/bootstrap-roles failed: not finished
I0814 13:24:33.600950  108836 healthz.go:183] [+]ping ok
[+]log ok
[+]etcd ok
[+]poststarthook/generic-apiserver-start-informers ok
[+]poststarthook/bootstrap-controller ok
[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
[+]poststarthook/scheduling/bootstrap-system-priority-classes ok
[+]poststarthook/ca-registration ok
healthz check failed
I0814 13:24:33.600989  108836 httplog.go:90] GET /healthz: (713.288µs) 0 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:33.613123  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterrolebindings/cluster-admin: (846.154µs) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:33.633800  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterrolebindings: (1.490587ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:33.634076  108836 storage_rbac.go:247] created clusterrolebinding.rbac.authorization.k8s.io/cluster-admin
I0814 13:24:33.653156  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterrolebindings/system:discovery: (876.565µs) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:33.673570  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterrolebindings: (1.250424ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:33.673878  108836 storage_rbac.go:247] created clusterrolebinding.rbac.authorization.k8s.io/system:discovery
I0814 13:24:33.693123  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterrolebindings/system:basic-user: (856.368µs) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:33.693712  108836 healthz.go:169] healthz check poststarthook/rbac/bootstrap-roles failed: not finished
I0814 13:24:33.693733  108836 healthz.go:183] [+]ping ok
[+]log ok
[+]etcd ok
[+]poststarthook/generic-apiserver-start-informers ok
[+]poststarthook/bootstrap-controller ok
[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
[+]poststarthook/scheduling/bootstrap-system-priority-classes ok
[+]poststarthook/ca-registration ok
healthz check failed
I0814 13:24:33.693767  108836 httplog.go:90] GET /healthz: (932.11µs) 0 [Go-http-client/1.1 127.0.0.1:49104]
I0814 13:24:33.701010  108836 healthz.go:169] healthz check poststarthook/rbac/bootstrap-roles failed: not finished
I0814 13:24:33.701045  108836 healthz.go:183] [+]ping ok
[+]log ok
[+]etcd ok
[+]poststarthook/generic-apiserver-start-informers ok
[+]poststarthook/bootstrap-controller ok
[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
[+]poststarthook/scheduling/bootstrap-system-priority-classes ok
[+]poststarthook/ca-registration ok
healthz check failed
I0814 13:24:33.701073  108836 httplog.go:90] GET /healthz: (736.779µs) 0 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.713779  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterrolebindings: (1.580561ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.713999  108836 storage_rbac.go:247] created clusterrolebinding.rbac.authorization.k8s.io/system:basic-user
I0814 13:24:33.733187  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterrolebindings/system:public-info-viewer: (917.79µs) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.753544  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterrolebindings: (1.266514ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.753699  108836 storage_rbac.go:247] created clusterrolebinding.rbac.authorization.k8s.io/system:public-info-viewer
I0814 13:24:33.772989  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterrolebindings/system:node-proxier: (705.636µs) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.793489  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterrolebindings: (1.176686ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.793683  108836 storage_rbac.go:247] created clusterrolebinding.rbac.authorization.k8s.io/system:node-proxier
I0814 13:24:33.793702  108836 healthz.go:169] healthz check poststarthook/rbac/bootstrap-roles failed: not finished
I0814 13:24:33.793764  108836 healthz.go:183] [+]ping ok
[+]log ok
[+]etcd ok
[+]poststarthook/generic-apiserver-start-informers ok
[+]poststarthook/bootstrap-controller ok
[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
[+]poststarthook/scheduling/bootstrap-system-priority-classes ok
[+]poststarthook/ca-registration ok
healthz check failed
I0814 13:24:33.793822  108836 httplog.go:90] GET /healthz: (1.082263ms) 0 [Go-http-client/1.1 127.0.0.1:49102]
I0814 13:24:33.800939  108836 healthz.go:169] healthz check poststarthook/rbac/bootstrap-roles failed: not finished
I0814 13:24:33.800963  108836 healthz.go:183] [+]ping ok
[+]log ok
[+]etcd ok
[+]poststarthook/generic-apiserver-start-informers ok
[+]poststarthook/bootstrap-controller ok
[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
[+]poststarthook/scheduling/bootstrap-system-priority-classes ok
[+]poststarthook/ca-registration ok
healthz check failed
I0814 13:24:33.801003  108836 httplog.go:90] GET /healthz: (723.944µs) 0 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:33.813234  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterrolebindings/system:kube-controller-manager: (974.096µs) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:33.833614  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterrolebindings: (1.314833ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:33.833896  108836 storage_rbac.go:247] created clusterrolebinding.rbac.authorization.k8s.io/system:kube-controller-manager
I0814 13:24:33.853855  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterrolebindings/system:kube-dns: (1.110875ms) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:33.873672  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterrolebindings: (1.370453ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:33.874179  108836 storage_rbac.go:247] created clusterrolebinding.rbac.authorization.k8s.io/system:kube-dns
I0814 13:24:33.893333  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterrolebindings/system:kube-scheduler: (1.044489ms) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:33.893731  108836 healthz.go:169] healthz check poststarthook/rbac/bootstrap-roles failed: not finished
I0814 13:24:33.893842  108836 healthz.go:183] [+]ping ok
[+]log ok
[+]etcd ok
[+]poststarthook/generic-apiserver-start-informers ok
[+]poststarthook/bootstrap-controller ok
[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
[+]poststarthook/scheduling/bootstrap-system-priority-classes ok
[+]poststarthook/ca-registration ok
healthz check failed
I0814 13:24:33.893929  108836 httplog.go:90] GET /healthz: (1.183441ms) 0 [Go-http-client/1.1 127.0.0.1:49104]
I0814 13:24:33.901039  108836 healthz.go:169] healthz check poststarthook/rbac/bootstrap-roles failed: not finished
I0814 13:24:33.901072  108836 healthz.go:183] [+]ping ok
[+]log ok
[+]etcd ok
[+]poststarthook/generic-apiserver-start-informers ok
[+]poststarthook/bootstrap-controller ok
[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
[+]poststarthook/scheduling/bootstrap-system-priority-classes ok
[+]poststarthook/ca-registration ok
healthz check failed
I0814 13:24:33.901213  108836 httplog.go:90] GET /healthz: (779.811µs) 0 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.914329  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterrolebindings: (2.024647ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.914661  108836 storage_rbac.go:247] created clusterrolebinding.rbac.authorization.k8s.io/system:kube-scheduler
I0814 13:24:33.933527  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterrolebindings/system:volume-scheduler: (1.053983ms) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.954121  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterrolebindings: (1.76494ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.955962  108836 storage_rbac.go:247] created clusterrolebinding.rbac.authorization.k8s.io/system:volume-scheduler
I0814 13:24:33.973506  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterrolebindings/system:node: (1.08434ms) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.993549  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterrolebindings: (1.282219ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:33.993736  108836 healthz.go:169] healthz check poststarthook/rbac/bootstrap-roles failed: not finished
I0814 13:24:33.993769  108836 healthz.go:183] [+]ping ok
[+]log ok
[+]etcd ok
[+]poststarthook/generic-apiserver-start-informers ok
[+]poststarthook/bootstrap-controller ok
[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
[+]poststarthook/scheduling/bootstrap-system-priority-classes ok
[+]poststarthook/ca-registration ok
healthz check failed
I0814 13:24:33.993801  108836 httplog.go:90] GET /healthz: (1.054382ms) 0 [Go-http-client/1.1 127.0.0.1:49102]
I0814 13:24:33.994326  108836 storage_rbac.go:247] created clusterrolebinding.rbac.authorization.k8s.io/system:node
I0814 13:24:34.001041  108836 healthz.go:169] healthz check poststarthook/rbac/bootstrap-roles failed: not finished
I0814 13:24:34.001065  108836 healthz.go:183] [+]ping ok
[+]log ok
[+]etcd ok
[+]poststarthook/generic-apiserver-start-informers ok
[+]poststarthook/bootstrap-controller ok
[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
[+]poststarthook/scheduling/bootstrap-system-priority-classes ok
[+]poststarthook/ca-registration ok
healthz check failed
I0814 13:24:34.001102  108836 httplog.go:90] GET /healthz: (813.031µs) 0 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:34.013578  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterrolebindings/system:controller:attachdetach-controller: (1.274693ms) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:34.033870  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterrolebindings: (1.505265ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:34.034219  108836 storage_rbac.go:247] created clusterrolebinding.rbac.authorization.k8s.io/system:controller:attachdetach-controller
I0814 13:24:34.053484  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterrolebindings/system:controller:clusterrole-aggregation-controller: (1.166864ms) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:34.073809  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterrolebindings: (1.435851ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:34.074097  108836 storage_rbac.go:247] created clusterrolebinding.rbac.authorization.k8s.io/system:controller:clusterrole-aggregation-controller
I0814 13:24:34.093310  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterrolebindings/system:controller:cronjob-controller: (983.34µs) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:34.093510  108836 healthz.go:169] healthz check poststarthook/rbac/bootstrap-roles failed: not finished
I0814 13:24:34.093531  108836 healthz.go:183] [+]ping ok
[+]log ok
[+]etcd ok
[+]poststarthook/generic-apiserver-start-informers ok
[+]poststarthook/bootstrap-controller ok
[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
[+]poststarthook/scheduling/bootstrap-system-priority-classes ok
[+]poststarthook/ca-registration ok
healthz check failed
I0814 13:24:34.093559  108836 httplog.go:90] GET /healthz: (754.162µs) 0 [Go-http-client/1.1 127.0.0.1:49102]
I0814 13:24:34.100886  108836 healthz.go:169] healthz check poststarthook/rbac/bootstrap-roles failed: not finished
I0814 13:24:34.100912  108836 healthz.go:183] [+]ping ok
[+]log ok
[+]etcd ok
[+]poststarthook/generic-apiserver-start-informers ok
[+]poststarthook/bootstrap-controller ok
[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
[+]poststarthook/scheduling/bootstrap-system-priority-classes ok
[+]poststarthook/ca-registration ok
healthz check failed
I0814 13:24:34.100949  108836 httplog.go:90] GET /healthz: (685.935µs) 0 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:34.113757  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterrolebindings: (1.48581ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:34.114011  108836 storage_rbac.go:247] created clusterrolebinding.rbac.authorization.k8s.io/system:controller:cronjob-controller
I0814 13:24:34.133255  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterrolebindings/system:controller:daemon-set-controller: (1.032405ms) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:34.153748  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterrolebindings: (1.435655ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:34.153965  108836 storage_rbac.go:247] created clusterrolebinding.rbac.authorization.k8s.io/system:controller:daemon-set-controller
I0814 13:24:34.173297  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterrolebindings/system:controller:deployment-controller: (966.87µs) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:34.193640  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterrolebindings: (1.348688ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:34.193968  108836 storage_rbac.go:247] created clusterrolebinding.rbac.authorization.k8s.io/system:controller:deployment-controller
I0814 13:24:34.193985  108836 healthz.go:169] healthz check poststarthook/rbac/bootstrap-roles failed: not finished
I0814 13:24:34.194025  108836 healthz.go:183] [+]ping ok
[+]log ok
[+]etcd ok
[+]poststarthook/generic-apiserver-start-informers ok
[+]poststarthook/bootstrap-controller ok
[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
[+]poststarthook/scheduling/bootstrap-system-priority-classes ok
[+]poststarthook/ca-registration ok
healthz check failed
I0814 13:24:34.194083  108836 httplog.go:90] GET /healthz: (1.286901ms) 0 [Go-http-client/1.1 127.0.0.1:49104]
I0814 13:24:34.201050  108836 healthz.go:169] healthz check poststarthook/rbac/bootstrap-roles failed: not finished
I0814 13:24:34.201076  108836 healthz.go:183] [+]ping ok
[+]log ok
[+]etcd ok
[+]poststarthook/generic-apiserver-start-informers ok
[+]poststarthook/bootstrap-controller ok
[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
[+]poststarthook/scheduling/bootstrap-system-priority-classes ok
[+]poststarthook/ca-registration ok
healthz check failed
I0814 13:24:34.201109  108836 httplog.go:90] GET /healthz: (772.45µs) 0 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:34.213097  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterrolebindings/system:controller:disruption-controller: (807.464µs) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:34.233655  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterrolebindings: (1.35709ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:34.233887  108836 storage_rbac.go:247] created clusterrolebinding.rbac.authorization.k8s.io/system:controller:disruption-controller
I0814 13:24:34.253079  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterrolebindings/system:controller:endpoint-controller: (828.249µs) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:34.273406  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterrolebindings: (1.131953ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:34.273573  108836 storage_rbac.go:247] created clusterrolebinding.rbac.authorization.k8s.io/system:controller:endpoint-controller
I0814 13:24:34.293113  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterrolebindings/system:controller:expand-controller: (844.082µs) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:34.293473  108836 healthz.go:169] healthz check poststarthook/rbac/bootstrap-roles failed: not finished
I0814 13:24:34.293495  108836 healthz.go:183] [+]ping ok
[+]log ok
[+]etcd ok
[+]poststarthook/generic-apiserver-start-informers ok
[+]poststarthook/bootstrap-controller ok
[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
[+]poststarthook/scheduling/bootstrap-system-priority-classes ok
[+]poststarthook/ca-registration ok
healthz check failed
I0814 13:24:34.293548  108836 httplog.go:90] GET /healthz: (780.645µs) 0 [Go-http-client/1.1 127.0.0.1:49102]
I0814 13:24:34.302752  108836 healthz.go:169] healthz check poststarthook/rbac/bootstrap-roles failed: not finished
I0814 13:24:34.302775  108836 healthz.go:183] [+]ping ok
[+]log ok
[+]etcd ok
[+]poststarthook/generic-apiserver-start-informers ok
[+]poststarthook/bootstrap-controller ok
[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
[+]poststarthook/scheduling/bootstrap-system-priority-classes ok
[+]poststarthook/ca-registration ok
healthz check failed
I0814 13:24:34.302807  108836 httplog.go:90] GET /healthz: (624.906µs) 0 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:34.313363  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterrolebindings: (1.089411ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:34.313542  108836 storage_rbac.go:247] created clusterrolebinding.rbac.authorization.k8s.io/system:controller:expand-controller
I0814 13:24:34.333072  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterrolebindings/system:controller:generic-garbage-collector: (802.08µs) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:34.353638  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterrolebindings: (1.370293ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:34.353887  108836 storage_rbac.go:247] created clusterrolebinding.rbac.authorization.k8s.io/system:controller:generic-garbage-collector
I0814 13:24:34.373296  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterrolebindings/system:controller:horizontal-pod-autoscaler: (981.587µs) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:34.393523  108836 healthz.go:169] healthz check poststarthook/rbac/bootstrap-roles failed: not finished
I0814 13:24:34.393549  108836 healthz.go:183] [+]ping ok
[+]log ok
[+]etcd ok
[+]poststarthook/generic-apiserver-start-informers ok
[+]poststarthook/bootstrap-controller ok
[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
[+]poststarthook/scheduling/bootstrap-system-priority-classes ok
[+]poststarthook/ca-registration ok
healthz check failed
I0814 13:24:34.393591  108836 httplog.go:90] GET /healthz: (822.794µs) 0 [Go-http-client/1.1 127.0.0.1:49104]
I0814 13:24:34.394688  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterrolebindings: (2.396659ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:34.394958  108836 storage_rbac.go:247] created clusterrolebinding.rbac.authorization.k8s.io/system:controller:horizontal-pod-autoscaler
I0814 13:24:34.400885  108836 healthz.go:169] healthz check poststarthook/rbac/bootstrap-roles failed: not finished
I0814 13:24:34.400908  108836 healthz.go:183] [+]ping ok
[+]log ok
[+]etcd ok
[+]poststarthook/generic-apiserver-start-informers ok
[+]poststarthook/bootstrap-controller ok
[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
[+]poststarthook/scheduling/bootstrap-system-priority-classes ok
[+]poststarthook/ca-registration ok
healthz check failed
I0814 13:24:34.400934  108836 httplog.go:90] GET /healthz: (691.281µs) 0 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:34.413084  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterrolebindings/system:controller:job-controller: (790.344µs) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:34.433434  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterrolebindings: (1.169635ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:34.433722  108836 storage_rbac.go:247] created clusterrolebinding.rbac.authorization.k8s.io/system:controller:job-controller
I0814 13:24:34.453100  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterrolebindings/system:controller:namespace-controller: (808.781µs) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:34.473638  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterrolebindings: (1.340693ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:34.474019  108836 storage_rbac.go:247] created clusterrolebinding.rbac.authorization.k8s.io/system:controller:namespace-controller
I0814 13:24:34.493850  108836 healthz.go:169] healthz check poststarthook/rbac/bootstrap-roles failed: not finished
I0814 13:24:34.493881  108836 healthz.go:183] [+]ping ok
[+]log ok
[+]etcd ok
[+]poststarthook/generic-apiserver-start-informers ok
[+]poststarthook/bootstrap-controller ok
[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
[+]poststarthook/scheduling/bootstrap-system-priority-classes ok
[+]poststarthook/ca-registration ok
healthz check failed
I0814 13:24:34.493898  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterrolebindings/system:controller:node-controller: (1.607163ms) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:34.493912  108836 httplog.go:90] GET /healthz: (891.498µs) 0 [Go-http-client/1.1 127.0.0.1:49104]
I0814 13:24:34.500927  108836 healthz.go:169] healthz check poststarthook/rbac/bootstrap-roles failed: not finished
I0814 13:24:34.500948  108836 healthz.go:183] [+]ping ok
[+]log ok
[+]etcd ok
[+]poststarthook/generic-apiserver-start-informers ok
[+]poststarthook/bootstrap-controller ok
[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
[+]poststarthook/scheduling/bootstrap-system-priority-classes ok
[+]poststarthook/ca-registration ok
healthz check failed
I0814 13:24:34.500972  108836 httplog.go:90] GET /healthz: (705.452µs) 0 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:34.513595  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterrolebindings: (1.32589ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:34.513795  108836 storage_rbac.go:247] created clusterrolebinding.rbac.authorization.k8s.io/system:controller:node-controller
I0814 13:24:34.533281  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterrolebindings/system:controller:persistent-volume-binder: (851.161µs) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:34.553917  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterrolebindings: (1.602565ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:34.554139  108836 storage_rbac.go:247] created clusterrolebinding.rbac.authorization.k8s.io/system:controller:persistent-volume-binder
I0814 13:24:34.573403  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterrolebindings/system:controller:pod-garbage-collector: (1.083485ms) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:34.593991  108836 healthz.go:169] healthz check poststarthook/rbac/bootstrap-roles failed: not finished
I0814 13:24:34.594021  108836 healthz.go:183] [+]ping ok
[+]log ok
[+]etcd ok
[+]poststarthook/generic-apiserver-start-informers ok
[+]poststarthook/bootstrap-controller ok
[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
[+]poststarthook/scheduling/bootstrap-system-priority-classes ok
[+]poststarthook/ca-registration ok
healthz check failed
I0814 13:24:34.594040  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterrolebindings: (1.673998ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:34.594048  108836 httplog.go:90] GET /healthz: (1.266716ms) 0 [Go-http-client/1.1 127.0.0.1:49102]
I0814 13:24:34.594261  108836 storage_rbac.go:247] created clusterrolebinding.rbac.authorization.k8s.io/system:controller:pod-garbage-collector
I0814 13:24:34.601072  108836 healthz.go:169] healthz check poststarthook/rbac/bootstrap-roles failed: not finished
I0814 13:24:34.601101  108836 healthz.go:183] [+]ping ok
[+]log ok
[+]etcd ok
[+]poststarthook/generic-apiserver-start-informers ok
[+]poststarthook/bootstrap-controller ok
[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
[+]poststarthook/scheduling/bootstrap-system-priority-classes ok
[+]poststarthook/ca-registration ok
healthz check failed
I0814 13:24:34.601150  108836 httplog.go:90] GET /healthz: (837.954µs) 0 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:34.613077  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterrolebindings/system:controller:replicaset-controller: (791.79µs) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:34.633674  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterrolebindings: (1.266532ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:34.633900  108836 storage_rbac.go:247] created clusterrolebinding.rbac.authorization.k8s.io/system:controller:replicaset-controller
I0814 13:24:34.653308  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterrolebindings/system:controller:replication-controller: (977.72µs) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:34.673970  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterrolebindings: (1.644383ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:34.674181  108836 storage_rbac.go:247] created clusterrolebinding.rbac.authorization.k8s.io/system:controller:replication-controller
I0814 13:24:34.693208  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterrolebindings/system:controller:resourcequota-controller: (860.793µs) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49104]
I0814 13:24:34.694110  108836 healthz.go:169] healthz check poststarthook/rbac/bootstrap-roles failed: not finished
I0814 13:24:34.694136  108836 healthz.go:183] [+]ping ok
[+]log ok
[+]etcd ok
[+]poststarthook/generic-apiserver-start-informers ok
[+]poststarthook/bootstrap-controller ok
[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
[+]poststarthook/scheduling/bootstrap-system-priority-classes ok
[+]poststarthook/ca-registration ok
healthz check failed
I0814 13:24:34.694172  108836 httplog.go:90] GET /healthz: (1.091494ms) 0 [Go-http-client/1.1 127.0.0.1:49102]
I0814 13:24:34.700865  108836 healthz.go:169] healthz check poststarthook/rbac/bootstrap-roles failed: not finished
I0814 13:24:34.700888  108836 healthz.go:183] [+]ping ok
[+]log ok
[+]etcd ok
[+]poststarthook/generic-apiserver-start-informers ok
[+]poststarthook/bootstrap-controller ok
[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
[+]poststarthook/scheduling/bootstrap-system-priority-classes ok
[+]poststarthook/ca-registration ok
healthz check failed
I0814 13:24:34.700925  108836 httplog.go:90] GET /healthz: (605.393µs) 0 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:34.713820  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterrolebindings: (1.513287ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:34.714168  108836 storage_rbac.go:247] created clusterrolebinding.rbac.authorization.k8s.io/system:controller:resourcequota-controller
I0814 13:24:34.733443  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterrolebindings/system:controller:route-controller: (1.088361ms) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:34.753975  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterrolebindings: (1.57019ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:34.754199  108836 storage_rbac.go:247] created clusterrolebinding.rbac.authorization.k8s.io/system:controller:route-controller
I0814 13:24:34.773445  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterrolebindings/system:controller:service-account-controller: (1.120291ms) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:34.793877  108836 healthz.go:169] healthz check poststarthook/rbac/bootstrap-roles failed: not finished
I0814 13:24:34.793900  108836 healthz.go:183] [+]ping ok
[+]log ok
[+]etcd ok
[+]poststarthook/generic-apiserver-start-informers ok
[+]poststarthook/bootstrap-controller ok
[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
[+]poststarthook/scheduling/bootstrap-system-priority-classes ok
[+]poststarthook/ca-registration ok
healthz check failed
I0814 13:24:34.793924  108836 httplog.go:90] GET /healthz: (1.170783ms) 0 [Go-http-client/1.1 127.0.0.1:49104]
I0814 13:24:34.794342  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterrolebindings: (1.691506ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:34.794718  108836 storage_rbac.go:247] created clusterrolebinding.rbac.authorization.k8s.io/system:controller:service-account-controller
I0814 13:24:34.800921  108836 healthz.go:169] healthz check poststarthook/rbac/bootstrap-roles failed: not finished
I0814 13:24:34.801240  108836 healthz.go:183] [+]ping ok
[+]log ok
[+]etcd ok
[+]poststarthook/generic-apiserver-start-informers ok
[+]poststarthook/bootstrap-controller ok
[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
[+]poststarthook/scheduling/bootstrap-system-priority-classes ok
[+]poststarthook/ca-registration ok
healthz check failed
I0814 13:24:34.801532  108836 httplog.go:90] GET /healthz: (1.187142ms) 0 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:34.813076  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterrolebindings/system:controller:service-controller: (838.171µs) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:34.834133  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterrolebindings: (1.521635ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:34.834361  108836 storage_rbac.go:247] created clusterrolebinding.rbac.authorization.k8s.io/system:controller:service-controller
I0814 13:24:34.853139  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterrolebindings/system:controller:statefulset-controller: (788.17µs) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
E0814 13:24:34.860864  108836 grpc_service.go:71] failed to create connection to unix socket: @kms-provider-2.sock, error: dial unix @kms-provider-2.sock: connect: connection refused
I0814 13:24:34.860890  108836 balancer_conn_wrappers.go:131] pickfirstBalancer: HandleSubConnStateChange: 0xc0087d4bb0, CONNECTING
W0814 13:24:34.860913  108836 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {@kms-provider-2.sock 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial unix @kms-provider-2.sock: connect: connection refused". Reconnecting...
I0814 13:24:34.860949  108836 balancer_conn_wrappers.go:131] pickfirstBalancer: HandleSubConnStateChange: 0xc0087d4bb0, TRANSIENT_FAILURE
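
The @-prefixed name above denotes a Linux abstract unix socket; no KMS provider plugin is listening on it in this test, so every dial is refused and gRPC cycles CONNECTING to TRANSIENT_FAILURE and keeps reconnecting. A minimal reproduction of the underlying dial error (assuming Linux, where Go maps a leading "@" to the abstract socket namespace):

    package main

    import (
        "fmt"
        "net"
    )

    func main() {
        // With no listener on @kms-provider-2.sock the connect is refused;
        // this is the error gRPC wraps and retries in the log above.
        if _, err := net.Dial("unix", "@kms-provider-2.sock"); err != nil {
            fmt.Println(err) // dial unix @kms-provider-2.sock: connect: connection refused
        }
    }
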
I0814 13:24:34.873791  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterrolebindings: (1.455589ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:34.874063  108836 storage_rbac.go:247] created clusterrolebinding.rbac.authorization.k8s.io/system:controller:statefulset-controller
I0814 13:24:34.893263  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterrolebindings/system:controller:ttl-controller: (904.929µs) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:34.893398  108836 healthz.go:169] healthz check poststarthook/rbac/bootstrap-roles failed: not finished
I0814 13:24:34.893610  108836 healthz.go:183] [+]ping ok
[+]log ok
[+]etcd ok
[+]poststarthook/generic-apiserver-start-informers ok
[+]poststarthook/bootstrap-controller ok
[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
[+]poststarthook/scheduling/bootstrap-system-priority-classes ok
[+]poststarthook/ca-registration ok
healthz check failed
I0814 13:24:34.893644  108836 httplog.go:90] GET /healthz: (868.534µs) 0 [Go-http-client/1.1 127.0.0.1:49104]
I0814 13:24:34.900853  108836 healthz.go:169] healthz check poststarthook/rbac/bootstrap-roles failed: not finished
I0814 13:24:34.900880  108836 healthz.go:183] [+]ping ok
[+]log ok
[+]etcd ok
[+]poststarthook/generic-apiserver-start-informers ok
[+]poststarthook/bootstrap-controller ok
[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
[+]poststarthook/scheduling/bootstrap-system-priority-classes ok
[+]poststarthook/ca-registration ok
healthz check failed
I0814 13:24:34.900906  108836 httplog.go:90] GET /healthz: (645.574µs) 0 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:34.913586  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterrolebindings: (1.274802ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:34.913793  108836 storage_rbac.go:247] created clusterrolebinding.rbac.authorization.k8s.io/system:controller:ttl-controller
I0814 13:24:34.934252  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterrolebindings/system:controller:certificate-controller: (1.823552ms) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:34.953998  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterrolebindings: (1.617482ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:34.954284  108836 storage_rbac.go:247] created clusterrolebinding.rbac.authorization.k8s.io/system:controller:certificate-controller
I0814 13:24:34.973556  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterrolebindings/system:controller:pvc-protection-controller: (1.132376ms) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:34.993806  108836 healthz.go:169] healthz check poststarthook/rbac/bootstrap-roles failed: not finished
I0814 13:24:34.993896  108836 healthz.go:183] [+]ping ok
[+]log ok
[+]etcd ok
[+]poststarthook/generic-apiserver-start-informers ok
[+]poststarthook/bootstrap-controller ok
[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
[+]poststarthook/scheduling/bootstrap-system-priority-classes ok
[+]poststarthook/ca-registration ok
healthz check failed
I0814 13:24:34.994011  108836 httplog.go:90] GET /healthz: (1.181411ms) 0 [Go-http-client/1.1 127.0.0.1:49104]
I0814 13:24:34.997009  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterrolebindings: (4.559334ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:34.997275  108836 storage_rbac.go:247] created clusterrolebinding.rbac.authorization.k8s.io/system:controller:pvc-protection-controller
I0814 13:24:35.001368  108836 healthz.go:169] healthz check poststarthook/rbac/bootstrap-roles failed: not finished
I0814 13:24:35.001389  108836 healthz.go:183] [+]ping ok
[+]log ok
[+]etcd ok
[+]poststarthook/generic-apiserver-start-informers ok
[+]poststarthook/bootstrap-controller ok
[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
[+]poststarthook/scheduling/bootstrap-system-priority-classes ok
[+]poststarthook/ca-registration ok
healthz check failed
I0814 13:24:35.001475  108836 httplog.go:90] GET /healthz: (1.126674ms) 0 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:35.013777  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/clusterrolebindings/system:controller:pv-protection-controller: (1.447528ms) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:35.035243  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/clusterrolebindings: (2.804836ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:35.035448  108836 storage_rbac.go:247] created clusterrolebinding.rbac.authorization.k8s.io/system:controller:pv-protection-controller
I0814 13:24:35.053914  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/namespaces/kube-system/roles/extension-apiserver-authentication-reader: (1.477129ms) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:35.056356  108836 httplog.go:90] GET /api/v1/namespaces/kube-system: (1.740057ms) 200 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:35.074224  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/namespaces/kube-system/roles: (1.846655ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:35.074413  108836 storage_rbac.go:278] created role.rbac.authorization.k8s.io/extension-apiserver-authentication-reader in kube-system
I0814 13:24:35.094631  108836 healthz.go:169] healthz check poststarthook/rbac/bootstrap-roles failed: not finished
I0814 13:24:35.094781  108836 healthz.go:183] [+]ping ok
[+]log ok
[+]etcd ok
[+]poststarthook/generic-apiserver-start-informers ok
[+]poststarthook/bootstrap-controller ok
[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
[+]poststarthook/scheduling/bootstrap-system-priority-classes ok
[+]poststarthook/ca-registration ok
healthz check failed
I0814 13:24:35.094854  108836 httplog.go:90] GET /healthz: (2.043722ms) 0 [Go-http-client/1.1 127.0.0.1:49104]
I0814 13:24:35.095873  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/namespaces/kube-system/roles/system:controller:bootstrap-signer: (3.479969ms) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:35.098357  108836 httplog.go:90] GET /api/v1/namespaces/kube-system: (1.754475ms) 200 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:35.101346  108836 healthz.go:169] healthz check poststarthook/rbac/bootstrap-roles failed: not finished
I0814 13:24:35.101377  108836 healthz.go:183] [+]ping ok
[+]log ok
[+]etcd ok
[+]poststarthook/generic-apiserver-start-informers ok
[+]poststarthook/bootstrap-controller ok
[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
[+]poststarthook/scheduling/bootstrap-system-priority-classes ok
[+]poststarthook/ca-registration ok
healthz check failed
I0814 13:24:35.101411  108836 httplog.go:90] GET /healthz: (1.084388ms) 0 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:35.113772  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/namespaces/kube-system/roles: (1.480245ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:35.114009  108836 storage_rbac.go:278] created role.rbac.authorization.k8s.io/system:controller:bootstrap-signer in kube-system
I0814 13:24:35.134188  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/namespaces/kube-system/roles/system:controller:cloud-provider: (1.607475ms) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:35.135960  108836 httplog.go:90] GET /api/v1/namespaces/kube-system: (1.381007ms) 200 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:35.157963  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/namespaces/kube-system/roles: (5.63655ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:35.158198  108836 storage_rbac.go:278] created role.rbac.authorization.k8s.io/system:controller:cloud-provider in kube-system
I0814 13:24:35.179101  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/namespaces/kube-system/roles/system:controller:token-cleaner: (6.671254ms) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:35.181496  108836 httplog.go:90] GET /api/v1/namespaces/kube-system: (1.948715ms) 200 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:35.194492  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/namespaces/kube-system/roles: (2.164178ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:35.194744  108836 storage_rbac.go:278] created role.rbac.authorization.k8s.io/system:controller:token-cleaner in kube-system
I0814 13:24:35.200238  108836 healthz.go:169] healthz check poststarthook/rbac/bootstrap-roles failed: not finished
I0814 13:24:35.200266  108836 healthz.go:183] [+]ping ok
[+]log ok
[+]etcd ok
[+]poststarthook/generic-apiserver-start-informers ok
[+]poststarthook/bootstrap-controller ok
[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
[+]poststarthook/scheduling/bootstrap-system-priority-classes ok
[+]poststarthook/ca-registration ok
healthz check failed
I0814 13:24:35.200301  108836 httplog.go:90] GET /healthz: (7.307034ms) 0 [Go-http-client/1.1 127.0.0.1:49104]
I0814 13:24:35.201158  108836 healthz.go:169] healthz check poststarthook/rbac/bootstrap-roles failed: not finished
I0814 13:24:35.201261  108836 healthz.go:183] [+]ping ok
[+]log ok
[+]etcd ok
[+]poststarthook/generic-apiserver-start-informers ok
[+]poststarthook/bootstrap-controller ok
[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
[+]poststarthook/scheduling/bootstrap-system-priority-classes ok
[+]poststarthook/ca-registration ok
healthz check failed
I0814 13:24:35.201308  108836 httplog.go:90] GET /healthz: (1.003059ms) 0 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:35.215558  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/namespaces/kube-system/roles/system::leader-locking-kube-controller-manager: (3.212573ms) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:35.217278  108836 httplog.go:90] GET /api/v1/namespaces/kube-system: (1.136021ms) 200 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:35.234097  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/namespaces/kube-system/roles: (1.74439ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:35.234367  108836 storage_rbac.go:278] created role.rbac.authorization.k8s.io/system::leader-locking-kube-controller-manager in kube-system
I0814 13:24:35.253980  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/namespaces/kube-system/roles/system::leader-locking-kube-scheduler: (1.651092ms) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:35.256083  108836 httplog.go:90] GET /api/v1/namespaces/kube-system: (1.329075ms) 200 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:35.276586  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/namespaces/kube-system/roles: (4.084452ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:35.276957  108836 storage_rbac.go:278] created role.rbac.authorization.k8s.io/system::leader-locking-kube-scheduler in kube-system
I0814 13:24:35.293317  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/namespaces/kube-public/roles/system:controller:bootstrap-signer: (990.982µs) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:35.294437  108836 healthz.go:169] healthz check poststarthook/rbac/bootstrap-roles failed: not finished
I0814 13:24:35.294473  108836 healthz.go:183] [+]ping ok
[+]log ok
[+]etcd ok
[+]poststarthook/generic-apiserver-start-informers ok
[+]poststarthook/bootstrap-controller ok
[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
[+]poststarthook/scheduling/bootstrap-system-priority-classes ok
[+]poststarthook/ca-registration ok
healthz check failed
I0814 13:24:35.294506  108836 httplog.go:90] GET /healthz: (1.696396ms) 0 [Go-http-client/1.1 127.0.0.1:49104]
I0814 13:24:35.295278  108836 httplog.go:90] GET /api/v1/namespaces/kube-public: (1.05369ms) 200 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:35.301111  108836 healthz.go:169] healthz check poststarthook/rbac/bootstrap-roles failed: not finished
I0814 13:24:35.301143  108836 healthz.go:183] [+]ping ok
[+]log ok
[+]etcd ok
[+]poststarthook/generic-apiserver-start-informers ok
[+]poststarthook/bootstrap-controller ok
[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
[+]poststarthook/scheduling/bootstrap-system-priority-classes ok
[+]poststarthook/ca-registration ok
healthz check failed
I0814 13:24:35.301183  108836 httplog.go:90] GET /healthz: (867.026µs) 0 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:35.315763  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/namespaces/kube-public/roles: (3.419925ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:35.316981  108836 storage_rbac.go:278] created role.rbac.authorization.k8s.io/system:controller:bootstrap-signer in kube-public
I0814 13:24:35.334568  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/namespaces/kube-system/rolebindings/system::extension-apiserver-authentication-reader: (2.184172ms) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:35.336065  108836 httplog.go:90] GET /api/v1/namespaces/kube-system: (1.078839ms) 200 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:35.355073  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/namespaces/kube-system/rolebindings: (2.363427ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:35.355285  108836 storage_rbac.go:308] created rolebinding.rbac.authorization.k8s.io/system::extension-apiserver-authentication-reader in kube-system
I0814 13:24:35.373663  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/namespaces/kube-system/rolebindings/system::leader-locking-kube-controller-manager: (1.347524ms) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:35.393970  108836 httplog.go:90] GET /api/v1/namespaces/kube-system: (19.817308ms) 200 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:35.396652  108836 healthz.go:169] healthz check poststarthook/rbac/bootstrap-roles failed: not finished
I0814 13:24:35.396676  108836 healthz.go:183] [+]ping ok
[+]log ok
[+]etcd ok
[+]poststarthook/generic-apiserver-start-informers ok
[+]poststarthook/bootstrap-controller ok
[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
[+]poststarthook/scheduling/bootstrap-system-priority-classes ok
[+]poststarthook/ca-registration ok
healthz check failed
I0814 13:24:35.396707  108836 httplog.go:90] GET /healthz: (3.89979ms) 0 [Go-http-client/1.1 127.0.0.1:49104]
I0814 13:24:35.397067  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/namespaces/kube-system/rolebindings: (2.767242ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:35.398951  108836 storage_rbac.go:308] created rolebinding.rbac.authorization.k8s.io/system::leader-locking-kube-controller-manager in kube-system
I0814 13:24:35.402163  108836 healthz.go:169] healthz check poststarthook/rbac/bootstrap-roles failed: not finished
I0814 13:24:35.402181  108836 healthz.go:183] [+]ping ok
[+]log ok
[+]etcd ok
[+]poststarthook/generic-apiserver-start-informers ok
[+]poststarthook/bootstrap-controller ok
[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
[+]poststarthook/scheduling/bootstrap-system-priority-classes ok
[+]poststarthook/ca-registration ok
healthz check failed
I0814 13:24:35.402204  108836 httplog.go:90] GET /healthz: (1.766838ms) 0 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:35.413588  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/namespaces/kube-system/rolebindings/system::leader-locking-kube-scheduler: (1.152819ms) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:35.416410  108836 httplog.go:90] GET /api/v1/namespaces/kube-system: (2.446889ms) 200 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:35.434298  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/namespaces/kube-system/rolebindings: (1.921432ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:35.434534  108836 storage_rbac.go:308] created rolebinding.rbac.authorization.k8s.io/system::leader-locking-kube-scheduler in kube-system
I0814 13:24:35.454374  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/namespaces/kube-system/rolebindings/system:controller:bootstrap-signer: (2.004955ms) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:35.456578  108836 httplog.go:90] GET /api/v1/namespaces/kube-system: (1.791704ms) 200 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:35.475187  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/namespaces/kube-system/rolebindings: (1.795045ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:35.475410  108836 storage_rbac.go:308] created rolebinding.rbac.authorization.k8s.io/system:controller:bootstrap-signer in kube-system
I0814 13:24:35.494264  108836 healthz.go:169] healthz check poststarthook/rbac/bootstrap-roles failed: not finished
I0814 13:24:35.494293  108836 healthz.go:183] [+]ping ok
[+]log ok
[+]etcd ok
[+]poststarthook/generic-apiserver-start-informers ok
[+]poststarthook/bootstrap-controller ok
[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
[+]poststarthook/scheduling/bootstrap-system-priority-classes ok
[+]poststarthook/ca-registration ok
healthz check failed
I0814 13:24:35.494331  108836 httplog.go:90] GET /healthz: (1.209957ms) 0 [Go-http-client/1.1 127.0.0.1:49102]
I0814 13:24:35.496116  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/namespaces/kube-system/rolebindings/system:controller:cloud-provider: (987.776µs) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:35.498986  108836 httplog.go:90] GET /api/v1/namespaces/kube-system: (2.495118ms) 200 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:35.501580  108836 healthz.go:169] healthz check poststarthook/rbac/bootstrap-roles failed: not finished
I0814 13:24:35.501613  108836 healthz.go:183] [+]ping ok
[+]log ok
[+]etcd ok
[+]poststarthook/generic-apiserver-start-informers ok
[+]poststarthook/bootstrap-controller ok
[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
[+]poststarthook/scheduling/bootstrap-system-priority-classes ok
[+]poststarthook/ca-registration ok
healthz check failed
I0814 13:24:35.501641  108836 httplog.go:90] GET /healthz: (1.218079ms) 0 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:35.515275  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/namespaces/kube-system/rolebindings: (2.003541ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:35.515607  108836 storage_rbac.go:308] created rolebinding.rbac.authorization.k8s.io/system:controller:cloud-provider in kube-system
I0814 13:24:35.537295  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/namespaces/kube-system/rolebindings/system:controller:token-cleaner: (3.007021ms) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:35.539319  108836 httplog.go:90] GET /api/v1/namespaces/kube-system: (1.616233ms) 200 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:35.557885  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/namespaces/kube-system/rolebindings: (4.693224ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:35.558306  108836 storage_rbac.go:308] created rolebinding.rbac.authorization.k8s.io/system:controller:token-cleaner in kube-system
I0814 13:24:35.574312  108836 httplog.go:90] GET /apis/rbac.authorization.k8s.io/v1/namespaces/kube-public/rolebindings/system:controller:bootstrap-signer: (1.553044ms) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:35.576104  108836 httplog.go:90] GET /api/v1/namespaces/kube-public: (1.330582ms) 200 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:35.593954  108836 healthz.go:169] healthz check poststarthook/rbac/bootstrap-roles failed: not finished
I0814 13:24:35.593976  108836 healthz.go:183] [+]ping ok
[+]log ok
[+]etcd ok
[+]poststarthook/generic-apiserver-start-informers ok
[+]poststarthook/bootstrap-controller ok
[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
[+]poststarthook/scheduling/bootstrap-system-priority-classes ok
[+]poststarthook/ca-registration ok
healthz check failed
I0814 13:24:35.594007  108836 httplog.go:90] GET /healthz: (804.094µs) 0 [Go-http-client/1.1 127.0.0.1:49104]
I0814 13:24:35.597913  108836 httplog.go:90] POST /apis/rbac.authorization.k8s.io/v1/namespaces/kube-public/rolebindings: (5.543332ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:35.598120  108836 storage_rbac.go:308] created rolebinding.rbac.authorization.k8s.io/system:controller:bootstrap-signer in kube-public
I0814 13:24:35.603096  108836 httplog.go:90] GET /healthz: (2.754284ms) 200 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:35.606417  108836 httplog.go:90] GET /api/v1/namespaces/default: (3.025954ms) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:35.608950  108836 httplog.go:90] POST /api/v1/namespaces: (2.259228ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:35.610263  108836 httplog.go:90] GET /api/v1/namespaces/default/services/kubernetes: (1.03808ms) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:35.622464  108836 httplog.go:90] POST /api/v1/namespaces/default/services: (11.858974ms) 201 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:35.624941  108836 httplog.go:90] GET /api/v1/namespaces/default/endpoints/kubernetes: (2.128975ms) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
I0814 13:24:35.625804  108836 httplog.go:90] POST /api/v1/namespaces/default/endpoints: (476.583µs) 422 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
E0814 13:24:35.626101  108836 controller.go:218] unable to sync kubernetes service: Endpoints "kubernetes" is invalid: [subsets[0].addresses[0].ip: Invalid value: "<nil>": must be a valid IP address, (e.g. 10.9.8.7), subsets[0].addresses[0].ip: Invalid value: "<nil>": must be a valid IP address]
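
The 422 above is endpoint validation at work: the test harness leaves the advertise address unset, so the endpoint IP renders as the literal string "<nil>", which is not a parseable address. A small illustration of that check, using net.ParseIP as a stand-in for the apiserver's validation:

    package main

    import (
        "fmt"
        "net"
    )

    func main() {
        // "10.9.8.7" (the example from the error message) parses; the
        // literal "<nil>" does not, so the Endpoints object is rejected.
        for _, s := range []string{"10.9.8.7", "<nil>"} {
            fmt.Printf("%q valid IP: %v\n", s, net.ParseIP(s) != nil)
        }
    }
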
I0814 13:24:35.694759  108836 httplog.go:90] GET /healthz: (1.849754ms) 200 [Go-http-client/1.1 127.0.0.1:49102]
I0814 13:24:35.700426  108836 httplog.go:90] GET /api/v1/namespaces/default/pods: (5.085141ms) 200 [Go-http-client/1.1 127.0.0.1:49102]
I0814 13:24:35.700882  108836 controller.go:176] Shutting down kubernetes service endpoint reconciler
I0814 13:24:35.703905  108836 httplog.go:90] GET /api/v1/namespaces/default/endpoints/kubernetes: (2.770315ms) 404 [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:49102]
--- FAIL: TestEmptyList (3.55s)
    synthetic_master_test.go:139: body: {"kind":"PodList","apiVersion":"v1","metadata":{"resourceVersion":"37899"},"items":null}
    synthetic_master_test.go:140: nil items field from empty list (all lists should return non-nil empty items lists)

				from junit_eb089aee80105aff5db0557ae4449d31f19359f2_20190814-131703.xml
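
The failure itself is a serialization detail: the PodList body above carries "items":null, while the test requires an empty list to round-trip as "items":[]. In Go that is exactly the nil-slice versus empty-slice distinction, sketched here with a stand-in type rather than the real API machinery:

    package main

    import (
        "encoding/json"
        "fmt"
    )

    // podList is a stand-in for the real v1.PodList.
    type podList struct {
        Kind  string   `json:"kind"`
        Items []string `json:"items"`
    }

    func main() {
        nilItems, _ := json.Marshal(podList{Kind: "PodList"})                      // Items left nil
        emptyItems, _ := json.Marshal(podList{Kind: "PodList", Items: []string{}}) // non-nil, empty

        fmt.Println(string(nilItems))   // {"kind":"PodList","items":null}  <- what the test got
        fmt.Println(string(emptyItems)) // {"kind":"PodList","items":[]}    <- what it expects
    }
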



2448 Passed Tests

3 Skipped Tests

Error lines from build-log.txt

... skipping 719 lines ...
W0814 13:12:09.491] I0814 13:12:09.491091   53127 controllermanager.go:535] Started "namespace"
W0814 13:12:09.491] I0814 13:12:09.491189   53127 namespace_controller.go:186] Starting namespace controller
W0814 13:12:09.492] I0814 13:12:09.491297   53127 controller_utils.go:1029] Waiting for caches to sync for namespace controller
W0814 13:12:09.492] I0814 13:12:09.491497   53127 controllermanager.go:535] Started "serviceaccount"
W0814 13:12:09.492] I0814 13:12:09.491703   53127 controllermanager.go:535] Started "csrcleaner"
W0814 13:12:09.492] I0814 13:12:09.492071   53127 node_lifecycle_controller.go:77] Sending events to api server
W0814 13:12:09.493] E0814 13:12:09.492090   53127 core.go:175] failed to start cloud node lifecycle controller: no cloud provider provided
W0814 13:12:09.493] W0814 13:12:09.492108   53127 controllermanager.go:527] Skipping "cloud-node-lifecycle"
W0814 13:12:09.493] I0814 13:12:09.492248   53127 serviceaccounts_controller.go:117] Starting service account controller
W0814 13:12:09.493] I0814 13:12:09.492278   53127 controller_utils.go:1029] Waiting for caches to sync for service account controller
W0814 13:12:09.493] I0814 13:12:09.492312   53127 cleaner.go:81] Starting CSR cleaner controller
W0814 13:12:09.493] I0814 13:12:09.492387   53127 controllermanager.go:535] Started "pv-protection"
W0814 13:12:09.494] I0814 13:12:09.492662   53127 controllermanager.go:535] Started "endpoint"
... skipping 29 lines ...
W0814 13:12:09.887] I0814 13:12:09.802469   53127 replica_set.go:182] Starting replicaset controller
W0814 13:12:09.887] I0814 13:12:09.802883   53127 controller_utils.go:1029] Waiting for caches to sync for ReplicaSet controller
W0814 13:12:09.888] I0814 13:12:09.802933   53127 controllermanager.go:535] Started "statefulset"
W0814 13:12:09.888] W0814 13:12:09.803194   53127 controllermanager.go:514] "tokencleaner" is disabled
W0814 13:12:09.888] I0814 13:12:09.802939   53127 stateful_set.go:145] Starting stateful set controller
W0814 13:12:09.888] I0814 13:12:09.803550   53127 controller_utils.go:1029] Waiting for caches to sync for stateful set controller
W0814 13:12:09.888] E0814 13:12:09.803935   53127 core.go:78] Failed to start service controller: WARNING: no cloud provider provided, services of type LoadBalancer will fail
W0814 13:12:09.888] W0814 13:12:09.804140   53127 controllermanager.go:527] Skipping "service"
W0814 13:12:09.888] I0814 13:12:09.804274   53127 core.go:185] Will not configure cloud provider routes for allocate-node-cidrs: false, configure-cloud-routes: true.
W0814 13:12:09.889] W0814 13:12:09.804402   53127 controllermanager.go:527] Skipping "route"
W0814 13:12:09.889] W0814 13:12:09.804538   53127 controllermanager.go:527] Skipping "ttl-after-finished"
W0814 13:12:09.889] I0814 13:12:09.805043   53127 controllermanager.go:535] Started "deployment"
W0814 13:12:09.889] I0814 13:12:09.805153   53127 deployment_controller.go:152] Starting deployment controller
... skipping 44 lines ...
W0814 13:12:09.968] I0814 13:12:09.963909   53127 controllermanager.go:535] Started "disruption"
W0814 13:12:09.969] I0814 13:12:09.964205   53127 controllermanager.go:535] Started "csrapproving"
W0814 13:12:09.969] I0814 13:12:09.964721   53127 disruption.go:333] Starting disruption controller
W0814 13:12:09.969] I0814 13:12:09.964754   53127 controller_utils.go:1029] Waiting for caches to sync for disruption controller
W0814 13:12:09.969] I0814 13:12:09.964773   53127 certificate_controller.go:113] Starting certificate controller
W0814 13:12:09.969] I0814 13:12:09.964783   53127 controller_utils.go:1029] Waiting for caches to sync for certificate controller
W0814 13:12:09.991] W0814 13:12:09.990126   53127 actual_state_of_world.go:506] Failed to update statusUpdateNeeded field in actual state of world: Failed to set statusUpdateNeeded to needed true, because nodeName="127.0.0.1" does not exist
W0814 13:12:10.009] I0814 13:12:10.009119   53127 controller_utils.go:1036] Caches are synced for TTL controller
W0814 13:12:10.009] I0814 13:12:10.009422   53127 controller_utils.go:1036] Caches are synced for persistent volume controller
W0814 13:12:10.010] I0814 13:12:10.009945   53127 controller_utils.go:1036] Caches are synced for ClusterRoleAggregator controller
W0814 13:12:10.010] I0814 13:12:10.010601   53127 controller_utils.go:1036] Caches are synced for PVC protection controller
W0814 13:12:10.022] E0814 13:12:10.022232   53127 clusterroleaggregation_controller.go:180] view failed with : Operation cannot be fulfilled on clusterroles.rbac.authorization.k8s.io "view": the object has been modified; please apply your changes to the latest version and try again
W0814 13:12:10.032] E0814 13:12:10.031822   53127 clusterroleaggregation_controller.go:180] edit failed with : Operation cannot be fulfilled on clusterroles.rbac.authorization.k8s.io "edit": the object has been modified; please apply your changes to the latest version and try again
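
These Conflict errors are expected noise from optimistic concurrency: two writers raced on the same resourceVersion, and the loser must re-read the object and retry (client-go packages this pattern as retry.RetryOnConflict). A self-contained sketch of the idea:

    package main

    import (
        "errors"
        "fmt"
    )

    var errConflict = errors.New("the object has been modified; please apply your changes to the latest version and try again")

    // retryOnConflict re-runs update after a Conflict, on the assumption
    // that update refetches the latest object on each attempt.
    func retryOnConflict(attempts int, update func() error) error {
        var err error
        for i := 0; i < attempts; i++ {
            if err = update(); !errors.Is(err, errConflict) {
                return err
            }
        }
        return err
    }

    func main() {
        tries := 0
        err := retryOnConflict(3, func() error {
            tries++
            if tries == 1 {
                return errConflict // first write loses the race
            }
            return nil // retry with the refreshed resourceVersion succeeds
        })
        fmt.Println(tries, err) // 2 <nil>
    }
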
W0814 13:12:10.065] I0814 13:12:10.064943   53127 controller_utils.go:1036] Caches are synced for certificate controller
W0814 13:12:10.082] I0814 13:12:10.081526   53127 controller_utils.go:1036] Caches are synced for GC controller
W0814 13:12:10.083] I0814 13:12:10.083556   53127 controller_utils.go:1036] Caches are synced for taint controller
W0814 13:12:10.084] I0814 13:12:10.083663   53127 node_lifecycle_controller.go:1189] Initializing eviction metric for zone: 
W0814 13:12:10.084] I0814 13:12:10.083758   53127 taint_manager.go:186] Starting NoExecuteTaintManager
W0814 13:12:10.084] I0814 13:12:10.083915   53127 controller_utils.go:1036] Caches are synced for expand controller
... skipping 102 lines ...
I0814 13:12:13.675] +++ working dir: /go/src/k8s.io/kubernetes
I0814 13:12:13.678] +++ command: run_RESTMapper_evaluation_tests
I0814 13:12:13.689] +++ [0814 13:12:13] Creating namespace namespace-1565788333-573
I0814 13:12:13.757] namespace/namespace-1565788333-573 created
I0814 13:12:13.823] Context "test" modified.
I0814 13:12:13.829] +++ [0814 13:12:13] Testing RESTMapper
I0814 13:12:13.923] +++ [0814 13:12:13] "kubectl get unknownresourcetype" returns error as expected: error: the server doesn't have a resource type "unknownresourcetype"
I0814 13:12:13.936] +++ exit code: 0
I0814 13:12:14.042] NAME                              SHORTNAMES   APIGROUP                       NAMESPACED   KIND
I0814 13:12:14.043] bindings                                                                      true         Binding
I0814 13:12:14.043] componentstatuses                 cs                                          false        ComponentStatus
I0814 13:12:14.043] configmaps                        cm                                          true         ConfigMap
I0814 13:12:14.043] endpoints                         ep                                          true         Endpoints
... skipping 663 lines ...
I0814 13:12:31.942] poddisruptionbudget.policy/test-pdb-3 created
I0814 13:12:32.028] core.sh:251: Successful get pdb/test-pdb-3 --namespace=test-kubectl-describe-pod {{.spec.maxUnavailable}}: 2
I0814 13:12:32.094] poddisruptionbudget.policy/test-pdb-4 created
I0814 13:12:32.178] core.sh:255: Successful get pdb/test-pdb-4 --namespace=test-kubectl-describe-pod {{.spec.maxUnavailable}}: 50%
I0814 13:12:32.330] core.sh:261: Successful get pods --namespace=test-kubectl-describe-pod {{range.items}}{{.metadata.name}}:{{end}}: 
I0814 13:12:32.506] pod/env-test-pod created
W0814 13:12:32.606] error: resource(s) were provided, but no name, label selector, or --all flag specified
W0814 13:12:32.607] error: setting 'all' parameter but found a non empty selector. 
W0814 13:12:32.607] warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
W0814 13:12:32.607] I0814 13:12:31.623890   49644 controller.go:606] quota admission added evaluator for: poddisruptionbudgets.policy
W0814 13:12:32.608] error: min-available and max-unavailable cannot be both specified
I0814 13:12:32.708] core.sh:264: Successful describe pods --namespace=test-kubectl-describe-pod env-test-pod:
I0814 13:12:32.708] Name:         env-test-pod
I0814 13:12:32.709] Namespace:    test-kubectl-describe-pod
I0814 13:12:32.709] Priority:     0
I0814 13:12:32.709] Node:         <none>
I0814 13:12:32.709] Labels:       <none>
... skipping 173 lines ...
I0814 13:12:45.544] pod/valid-pod patched
I0814 13:12:45.634] core.sh:470: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: changed-with-yaml:
I0814 13:12:45.704] pod/valid-pod patched
I0814 13:12:45.791] core.sh:475: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:3.1:
I0814 13:12:45.934] pod/valid-pod patched
I0814 13:12:46.031] core.sh:491: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: nginx:
I0814 13:12:46.196] +++ [0814 13:12:46] "kubectl patch with resourceVersion 495" returns error as expected: Error from server (Conflict): Operation cannot be fulfilled on pods "valid-pod": the object has been modified; please apply your changes to the latest version and try again
I0814 13:12:46.429] pod "valid-pod" deleted
I0814 13:12:46.440] pod/valid-pod replaced
I0814 13:12:46.532] core.sh:515: Successful get pod valid-pod {{(index .spec.containers 0).name}}: replaced-k8s-serve-hostname
I0814 13:12:46.672] Successful
I0814 13:12:46.673] message:error: --grace-period must have --force specified
I0814 13:12:46.673] has:\-\-grace-period must have \-\-force specified
I0814 13:12:46.826] Successful
I0814 13:12:46.827] message:error: --timeout must have --force specified
I0814 13:12:46.827] has:\-\-timeout must have \-\-force specified
I0814 13:12:46.980] node/node-v1-test created
W0814 13:12:47.081] W0814 13:12:46.979726   53127 actual_state_of_world.go:506] Failed to update statusUpdateNeeded field in actual state of world: Failed to set statusUpdateNeeded to needed true, because nodeName="node-v1-test" does not exist
I0814 13:12:47.181] node/node-v1-test replaced
I0814 13:12:47.233] core.sh:552: Successful get node node-v1-test {{.metadata.annotations.a}}: b
I0814 13:12:47.306] node "node-v1-test" deleted
I0814 13:12:47.400] core.sh:559: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: nginx:
I0814 13:12:47.672] core.sh:562: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: k8s.gcr.io/serve_hostname:
I0814 13:12:48.596] core.sh:575: Successful get pod valid-pod {{.metadata.labels.name}}: valid-pod
... skipping 25 lines ...
I0814 13:12:49.211] core.sh:593: Successful get pod valid-pod {{.metadata.labels.name}}: valid-pod
I0814 13:12:49.330] pod/valid-pod labeled
W0814 13:12:49.431] Edit cancelled, no changes made.
W0814 13:12:49.432] Edit cancelled, no changes made.
W0814 13:12:49.432] Edit cancelled, no changes made.
W0814 13:12:49.432] Edit cancelled, no changes made.
W0814 13:12:49.433] error: 'name' already has a value (valid-pod), and --overwrite is false
I0814 13:12:49.533] core.sh:597: Successful get pod valid-pod {{.metadata.labels.name}}: valid-pod-super-sayan
I0814 13:12:49.587] core.sh:601: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
I0814 13:12:49.704] pod "valid-pod" force deleted
W0814 13:12:49.805] warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
I0814 13:12:49.905] core.sh:605: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
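
A note on the {{range.items}}{{.metadata.name}}:{{end}} assertions used throughout these test-cmd checks: they appear to be kubectl go-template expressions, i.e. plain Go text/template programs run over the object's structure, so an empty list renders as nothing after the colon. A small stand-alone illustration (stand-in data, not the real kubectl pipeline):

    package main

    import (
        "os"
        "text/template"
    )

    func main() {
        tmpl := template.Must(template.New("get").Parse("{{range .items}}{{.metadata.name}}:{{end}}\n"))

        // One pod named valid-pod prints "valid-pod:"; an empty items
        // list prints nothing, matching the blank expectations above.
        data := map[string]any{
            "items": []map[string]any{
                {"metadata": map[string]any{"name": "valid-pod"}},
            },
        }
        tmpl.Execute(os.Stdout, data)
        tmpl.Execute(os.Stdout, map[string]any{"items": []map[string]any{}})
    }
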
I0814 13:12:49.906] +++ [0814 13:12:49] Creating namespace namespace-1565788369-21586
... skipping 82 lines ...
I0814 13:12:57.766] +++ Running case: test-cmd.run_kubectl_create_error_tests 
I0814 13:12:57.768] +++ working dir: /go/src/k8s.io/kubernetes
I0814 13:12:57.770] +++ command: run_kubectl_create_error_tests
I0814 13:12:57.781] +++ [0814 13:12:57] Creating namespace namespace-1565788377-5220
I0814 13:12:57.846] namespace/namespace-1565788377-5220 created
I0814 13:12:57.907] Context "test" modified.
I0814 13:12:57.913] +++ [0814 13:12:57] Testing kubectl create with error
W0814 13:12:58.013] Error: must specify one of -f and -k
W0814 13:12:58.013] 
W0814 13:12:58.014] Create a resource from a file or from stdin.
W0814 13:12:58.014] 
W0814 13:12:58.014]  JSON and YAML formats are accepted.
W0814 13:12:58.014] 
W0814 13:12:58.014] Examples:
... skipping 41 lines ...
W0814 13:12:58.019] 
W0814 13:12:58.019] Usage:
W0814 13:12:58.019]   kubectl create -f FILENAME [options]
W0814 13:12:58.019] 
W0814 13:12:58.019] Use "kubectl <command> --help" for more information about a given command.
W0814 13:12:58.019] Use "kubectl options" for a list of global command-line options (applies to all commands).
I0814 13:12:58.127] +++ [0814 13:12:58] "kubectl create with empty string list returns error as expected: error: error validating "hack/testdata/invalid-rc-with-empty-args.yaml": error validating data: ValidationError(ReplicationController.spec.template.spec.containers[0].args): unknown object type "nil" in ReplicationController.spec.template.spec.containers[0].args[0]; if you choose to ignore these errors, turn validation off with --validate=false
W0814 13:12:58.228] kubectl convert is DEPRECATED and will be removed in a future version.
W0814 13:12:58.229] In order to convert, kubectl apply the object to the cluster, then kubectl get at the desired version.
I0814 13:12:58.329] +++ exit code: 0
I0814 13:12:58.330] Recording: run_kubectl_apply_tests
I0814 13:12:58.330] Running command: run_kubectl_apply_tests
I0814 13:12:58.337] 
... skipping 19 lines ...
W0814 13:13:00.247] I0814 13:13:00.246565   49644 client.go:354] parsed scheme: ""
W0814 13:13:00.248] I0814 13:13:00.246604   49644 client.go:354] scheme "" not registered, fallback to default scheme
W0814 13:13:00.248] I0814 13:13:00.246663   49644 asm_amd64.s:1337] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
W0814 13:13:00.248] I0814 13:13:00.246707   49644 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0814 13:13:00.248] I0814 13:13:00.247335   49644 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0814 13:13:00.249] I0814 13:13:00.249577   49644 controller.go:606] quota admission added evaluator for: resources.mygroup.example.com
W0814 13:13:00.329] Error from server (NotFound): resources.mygroup.example.com "myobj" not found
I0814 13:13:00.430] kind.mygroup.example.com/myobj serverside-applied (server dry run)
I0814 13:13:00.430] customresourcedefinition.apiextensions.k8s.io "resources.mygroup.example.com" deleted
I0814 13:13:00.430] +++ exit code: 0
I0814 13:13:00.460] Recording: run_kubectl_run_tests
I0814 13:13:00.460] Running command: run_kubectl_run_tests
I0814 13:13:00.481] 
... skipping 84 lines ...
I0814 13:13:02.777] Context "test" modified.
I0814 13:13:02.784] +++ [0814 13:13:02] Testing kubectl create filter
I0814 13:13:02.876] create.sh:30: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0814 13:13:03.043] pod/selector-test-pod created
I0814 13:13:03.143] create.sh:34: Successful get pods selector-test-pod {{.metadata.labels.name}}: selector-test-pod
I0814 13:13:03.227] Successful
I0814 13:13:03.228] message:Error from server (NotFound): pods "selector-test-pod-dont-apply" not found
I0814 13:13:03.228] has:pods "selector-test-pod-dont-apply" not found
I0814 13:13:03.305] pod "selector-test-pod" deleted
I0814 13:13:03.323] +++ exit code: 0
I0814 13:13:03.358] Recording: run_kubectl_apply_deployments_tests
I0814 13:13:03.358] Running command: run_kubectl_apply_deployments_tests
I0814 13:13:03.379] 
... skipping 42 lines ...
W0814 13:13:05.521] I0814 13:13:05.424691   53127 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1565788383-26214", Name:"nginx", UID:"f3dc8117-8833-428e-aab0-51ddedd3fa92", APIVersion:"apps/v1", ResourceVersion:"573", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-7dbc4d9f to 3
W0814 13:13:05.521] I0814 13:13:05.428971   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788383-26214", Name:"nginx-7dbc4d9f", UID:"606c19d9-3880-498c-90b8-1ada9e9d8fca", APIVersion:"apps/v1", ResourceVersion:"574", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-7dbc4d9f-t5xb9
W0814 13:13:05.522] I0814 13:13:05.433685   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788383-26214", Name:"nginx-7dbc4d9f", UID:"606c19d9-3880-498c-90b8-1ada9e9d8fca", APIVersion:"apps/v1", ResourceVersion:"574", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-7dbc4d9f-65m7n
W0814 13:13:05.522] I0814 13:13:05.433960   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788383-26214", Name:"nginx-7dbc4d9f", UID:"606c19d9-3880-498c-90b8-1ada9e9d8fca", APIVersion:"apps/v1", ResourceVersion:"574", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-7dbc4d9f-q75xc
I0814 13:13:05.623] apps.sh:148: Successful get deployment nginx {{.metadata.name}}: nginx
I0814 13:13:09.768] Successful
I0814 13:13:09.769] message:Error from server (Conflict): error when applying patch:
I0814 13:13:09.770] {"metadata":{"annotations":{"kubectl.kubernetes.io/last-applied-configuration":"{\"apiVersion\":\"apps/v1\",\"kind\":\"Deployment\",\"metadata\":{\"annotations\":{},\"labels\":{\"name\":\"nginx\"},\"name\":\"nginx\",\"namespace\":\"namespace-1565788383-26214\",\"resourceVersion\":\"99\"},\"spec\":{\"replicas\":3,\"selector\":{\"matchLabels\":{\"name\":\"nginx2\"}},\"template\":{\"metadata\":{\"labels\":{\"name\":\"nginx2\"}},\"spec\":{\"containers\":[{\"image\":\"k8s.gcr.io/nginx:test-cmd\",\"name\":\"nginx\",\"ports\":[{\"containerPort\":80}]}]}}}}\n"},"resourceVersion":"99"},"spec":{"selector":{"matchLabels":{"name":"nginx2"}},"template":{"metadata":{"labels":{"name":"nginx2"}}}}}
I0814 13:13:09.770] to:
I0814 13:13:09.770] Resource: "apps/v1, Resource=deployments", GroupVersionKind: "apps/v1, Kind=Deployment"
I0814 13:13:09.770] Name: "nginx", Namespace: "namespace-1565788383-26214"
I0814 13:13:09.774] Object: &{map["apiVersion":"apps/v1" "kind":"Deployment" "metadata":map["annotations":map["deployment.kubernetes.io/revision":"1" "kubectl.kubernetes.io/last-applied-configuration":"{\"apiVersion\":\"apps/v1\",\"kind\":\"Deployment\",\"metadata\":{\"annotations\":{},\"labels\":{\"name\":\"nginx\"},\"name\":\"nginx\",\"namespace\":\"namespace-1565788383-26214\"},\"spec\":{\"replicas\":3,\"selector\":{\"matchLabels\":{\"name\":\"nginx1\"}},\"template\":{\"metadata\":{\"labels\":{\"name\":\"nginx1\"}},\"spec\":{\"containers\":[{\"image\":\"k8s.gcr.io/nginx:test-cmd\",\"name\":\"nginx\",\"ports\":[{\"containerPort\":80}]}]}}}}\n"] "creationTimestamp":"2019-08-14T13:13:05Z" "generation":'\x01' "labels":map["name":"nginx"] "managedFields":[map["apiVersion":"apps/v1" "fields":map["f:metadata":map["f:annotations":map["f:deployment.kubernetes.io/revision":map[]]] "f:status":map["f:conditions":map[".":map[] "k:{\"type\":\"Available\"}":map[".":map[] "f:lastTransitionTime":map[] "f:lastUpdateTime":map[] "f:message":map[] "f:reason":map[] "f:status":map[] "f:type":map[]] "k:{\"type\":\"Progressing\"}":map[".":map[] "f:lastTransitionTime":map[] "f:lastUpdateTime":map[] "f:message":map[] "f:reason":map[] "f:status":map[] "f:type":map[]]] "f:observedGeneration":map[] "f:replicas":map[] "f:unavailableReplicas":map[] "f:updatedReplicas":map[]]] "manager":"kube-controller-manager" "operation":"Update" "time":"2019-08-14T13:13:05Z"] map["apiVersion":"apps/v1" "fields":map["f:metadata":map["f:annotations":map[".":map[] "f:kubectl.kubernetes.io/last-applied-configuration":map[]] "f:labels":map[".":map[] "f:name":map[]]] "f:spec":map["f:progressDeadlineSeconds":map[] "f:replicas":map[] "f:revisionHistoryLimit":map[] "f:selector":map["f:matchLabels":map[".":map[] "f:name":map[]]] "f:strategy":map["f:rollingUpdate":map[".":map[] "f:maxSurge":map[] "f:maxUnavailable":map[]] "f:type":map[]] "f:template":map["f:metadata":map["f:labels":map[".":map[] "f:name":map[]]] "f:spec":map["f:containers":map["k:{\"name\":\"nginx\"}":map[".":map[] "f:image":map[] "f:imagePullPolicy":map[] "f:name":map[] "f:ports":map[".":map[] "k:{\"containerPort\":80,\"protocol\":\"TCP\"}":map[".":map[] "f:containerPort":map[] "f:protocol":map[]]] "f:resources":map[] "f:terminationMessagePath":map[] "f:terminationMessagePolicy":map[]]] "f:dnsPolicy":map[] "f:restartPolicy":map[] "f:schedulerName":map[] "f:securityContext":map[] "f:terminationGracePeriodSeconds":map[]]]]] "manager":"kubectl" "operation":"Update" "time":"2019-08-14T13:13:05Z"]] "name":"nginx" "namespace":"namespace-1565788383-26214" "resourceVersion":"586" "uid":"f3dc8117-8833-428e-aab0-51ddedd3fa92"] "spec":map["progressDeadlineSeconds":'\u0258' "replicas":'\x03' "revisionHistoryLimit":'\n' "selector":map["matchLabels":map["name":"nginx1"]] "strategy":map["rollingUpdate":map["maxSurge":"25%" "maxUnavailable":"25%"] "type":"RollingUpdate"] "template":map["metadata":map["creationTimestamp":<nil> "labels":map["name":"nginx1"]] "spec":map["containers":[map["image":"k8s.gcr.io/nginx:test-cmd" "imagePullPolicy":"IfNotPresent" "name":"nginx" "ports":[map["containerPort":'P' "protocol":"TCP"]] "resources":map[] "terminationMessagePath":"/dev/termination-log" "terminationMessagePolicy":"File"]] "dnsPolicy":"ClusterFirst" "restartPolicy":"Always" "schedulerName":"default-scheduler" "securityContext":map[] "terminationGracePeriodSeconds":'\x1e']]] "status":map["conditions":[map["lastTransitionTime":"2019-08-14T13:13:05Z" "lastUpdateTime":"2019-08-14T13:13:05Z" 
"message":"Deployment does not have minimum availability." "reason":"MinimumReplicasUnavailable" "status":"False" "type":"Available"] map["lastTransitionTime":"2019-08-14T13:13:05Z" "lastUpdateTime":"2019-08-14T13:13:05Z" "message":"ReplicaSet \"nginx-7dbc4d9f\" is progressing." "reason":"ReplicaSetUpdated" "status":"True" "type":"Progressing"]] "observedGeneration":'\x01' "replicas":'\x03' "unavailableReplicas":'\x03' "updatedReplicas":'\x03']]}
I0814 13:13:09.774] for: "hack/testdata/deployment-label-change2.yaml": Operation cannot be fulfilled on deployments.apps "nginx": the object has been modified; please apply your changes to the latest version and try again
I0814 13:13:09.775] has:Error from server (Conflict)
W0814 13:13:12.235] I0814 13:13:12.235193   53127 horizontal.go:341] Horizontal Pod Autoscaler frontend has been deleted in namespace-1565788375-26272
I0814 13:13:14.979] deployment.apps/nginx configured
I0814 13:13:15.066] Successful
I0814 13:13:15.067] message:        "name": "nginx2"
I0814 13:13:15.067]           "name": "nginx2"
I0814 13:13:15.067] has:"name": "nginx2"
W0814 13:13:15.168] I0814 13:13:14.983719   53127 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1565788383-26214", Name:"nginx", UID:"01f4a22c-02f2-4ac7-a571-b6a06bd01ff4", APIVersion:"apps/v1", ResourceVersion:"611", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-594f77b9f6 to 3
W0814 13:13:15.168] I0814 13:13:14.992450   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788383-26214", Name:"nginx-594f77b9f6", UID:"834e0447-3b5e-4d4f-8468-11a7701dbe68", APIVersion:"apps/v1", ResourceVersion:"612", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-594f77b9f6-x7c4r
W0814 13:13:15.169] I0814 13:13:14.996940   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788383-26214", Name:"nginx-594f77b9f6", UID:"834e0447-3b5e-4d4f-8468-11a7701dbe68", APIVersion:"apps/v1", ResourceVersion:"612", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-594f77b9f6-qccgh
W0814 13:13:15.169] I0814 13:13:14.998535   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788383-26214", Name:"nginx-594f77b9f6", UID:"834e0447-3b5e-4d4f-8468-11a7701dbe68", APIVersion:"apps/v1", ResourceVersion:"612", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-594f77b9f6-4xjtt
W0814 13:13:19.311] E0814 13:13:19.310624   53127 replica_set.go:450] Sync "namespace-1565788383-26214/nginx-594f77b9f6" failed with replicasets.apps "nginx-594f77b9f6" not found
W0814 13:13:20.285] I0814 13:13:20.284633   53127 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1565788383-26214", Name:"nginx", UID:"7bdf9f08-3917-4e3a-9bed-d1940c340f50", APIVersion:"apps/v1", ResourceVersion:"646", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-594f77b9f6 to 3
W0814 13:13:20.291] I0814 13:13:20.290255   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788383-26214", Name:"nginx-594f77b9f6", UID:"99cad50d-ed47-4652-bc8d-e28d504230a0", APIVersion:"apps/v1", ResourceVersion:"647", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-594f77b9f6-wxktz
W0814 13:13:20.295] I0814 13:13:20.294426   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788383-26214", Name:"nginx-594f77b9f6", UID:"99cad50d-ed47-4652-bc8d-e28d504230a0", APIVersion:"apps/v1", ResourceVersion:"647", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-594f77b9f6-gr9j5
W0814 13:13:20.296] I0814 13:13:20.295026   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788383-26214", Name:"nginx-594f77b9f6", UID:"99cad50d-ed47-4652-bc8d-e28d504230a0", APIVersion:"apps/v1", ResourceVersion:"647", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-594f77b9f6-kbmnh
I0814 13:13:20.396] Successful
I0814 13:13:20.397] message:The Deployment "nginx" is invalid: spec.template.metadata.labels: Invalid value: map[string]string{"name":"nginx3"}: `selector` does not match template `labels`
... skipping 158 lines ...
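
The Invalid error surfaced just before the skipped block is ordinary Deployment validation: spec.selector must select the pod template's labels. A minimal sketch that trips the same check, with the nginx2/nginx3 labels taken from the log and the rest of the manifest assumed:

kubectl apply -f - <<'EOF'
apiVersion: apps/v1
kind: Deployment
metadata:
  name: nginx
spec:
  replicas: 3
  selector:
    matchLabels:
      name: nginx2      # the selector targets nginx2 ...
  template:
    metadata:
      labels:
        name: nginx3    # ... but the template is labeled nginx3, so the API rejects it
    spec:
      containers:
      - name: nginx
        image: k8s.gcr.io/nginx:test-cmd
EOF
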
I0814 13:13:22.115] +++ [0814 13:13:22] Creating namespace namespace-1565788402-32542
I0814 13:13:22.187] namespace/namespace-1565788402-32542 created
I0814 13:13:22.251] Context "test" modified.
I0814 13:13:22.259] +++ [0814 13:13:22] Testing kubectl get
I0814 13:13:22.338] get.sh:29: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0814 13:13:22.416] Successful
I0814 13:13:22.417] message:Error from server (NotFound): pods "abc" not found
I0814 13:13:22.417] has:pods "abc" not found
I0814 13:13:22.499] get.sh:37: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0814 13:13:22.580] Successful
I0814 13:13:22.580] message:Error from server (NotFound): pods "abc" not found
I0814 13:13:22.580] has:pods "abc" not found
I0814 13:13:22.659] get.sh:45: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0814 13:13:22.732] Successful
I0814 13:13:22.733] message:{
I0814 13:13:22.733]     "apiVersion": "v1",
I0814 13:13:22.733]     "items": [],
... skipping 23 lines ...
I0814 13:13:23.046] has not:No resources found
I0814 13:13:23.121] Successful
I0814 13:13:23.121] message:NAME
I0814 13:13:23.121] has not:No resources found
I0814 13:13:23.204] get.sh:73: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0814 13:13:23.292] Successful
I0814 13:13:23.293] message:error: the server doesn't have a resource type "foobar"
I0814 13:13:23.293] has not:No resources found
I0814 13:13:23.368] Successful
I0814 13:13:23.369] message:No resources found in namespace-1565788402-32542 namespace.
I0814 13:13:23.369] has:No resources found
I0814 13:13:23.449] Successful
I0814 13:13:23.450] message:
I0814 13:13:23.450] has not:No resources found
I0814 13:13:23.525] Successful
I0814 13:13:23.526] message:No resources found in namespace-1565788402-32542 namespace.
I0814 13:13:23.526] has:No resources found
I0814 13:13:23.607] get.sh:93: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0814 13:13:23.686] Successful
I0814 13:13:23.686] message:Error from server (NotFound): pods "abc" not found
I0814 13:13:23.686] has:pods "abc" not found
I0814 13:13:23.688] FAIL!
I0814 13:13:23.688] message:Error from server (NotFound): pods "abc" not found
I0814 13:13:23.688] has not:List
I0814 13:13:23.688] 99 /go/src/k8s.io/kubernetes/test/cmd/../../test/cmd/get.sh
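
This FAIL! is the one assertion in the run that did not hold: the harness expected the command output to contain the string "List" and printed the offending call site (get.sh line 99). In essence the checker works like the simplified sketch below; the real helper lives in hack/lib/test.sh and differs in its details:

# Simplified sketch of the test-cmd string assertion (names assumed).
if_has_string() {
  local message=$1 match=$2
  if grep -q "${match}" <<< "${message}"; then
    echo "Successful"
  else
    # Mirrors the failure format seen above: FAIL!, the message, then "has not:".
    echo "FAIL!"
    echo "message:${message}"
    echo "has not:${match}"
    return 1
  fi
}
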
I0814 13:13:23.788] Successful
I0814 13:13:23.789] message:I0814 13:13:23.746773   63697 loader.go:375] Config loaded from file:  /tmp/tmp.IiHxqzUzLg/.kube/config
I0814 13:13:23.789] I0814 13:13:23.748318   63697 round_trippers.go:471] GET http://127.0.0.1:8080/version?timeout=32s 200 OK in 1 milliseconds
I0814 13:13:23.789] I0814 13:13:23.767633   63697 round_trippers.go:471] GET http://127.0.0.1:8080/api/v1/namespaces/default/pods 200 OK in 1 milliseconds
... skipping 660 lines ...
I0814 13:13:29.273] Successful
I0814 13:13:29.273] message:NAME    DATA   AGE
I0814 13:13:29.273] one     0      0s
I0814 13:13:29.273] three   0      0s
I0814 13:13:29.274] two     0      0s
I0814 13:13:29.274] STATUS    REASON          MESSAGE
I0814 13:13:29.274] Failure   InternalError   an error on the server ("unable to decode an event from the watch stream: net/http: request canceled (Client.Timeout exceeded while reading body)") has prevented the request from succeeding
I0814 13:13:29.274] has not:watch is only supported on individual resources
I0814 13:13:30.354] Successful
I0814 13:13:30.354] message:STATUS    REASON          MESSAGE
I0814 13:13:30.354] Failure   InternalError   an error on the server ("unable to decode an event from the watch stream: net/http: request canceled (Client.Timeout exceeded while reading body)") has prevented the request from succeeding
I0814 13:13:30.354] has not:watch is only supported on individual resources
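
The Failure/InternalError rows above are the expected outcome: the test watches a whole collection with a short client-side timeout, and when the client severs the watch stream kubectl reports it as a trailing status row rather than the old "watch is only supported on individual resources" error. A hedged reproduction (the exact timeout value is assumed):

# Watch a collection with a short request timeout; once the client cuts the
# connection mid-stream, the Failure/InternalError status row is printed.
kubectl get configmaps --watch --request-timeout=1s
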
I0814 13:13:30.359] +++ [0814 13:13:30] Creating namespace namespace-1565788410-4030
I0814 13:13:30.424] namespace/namespace-1565788410-4030 created
I0814 13:13:30.492] Context "test" modified.
I0814 13:13:30.574] get.sh:157: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0814 13:13:30.718] pod/valid-pod created
... skipping 103 lines ...
I0814 13:13:30.804] }
I0814 13:13:30.878] get.sh:162: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
I0814 13:13:31.102] <no value>Successful
I0814 13:13:31.103] message:valid-pod:
I0814 13:13:31.103] has:valid-pod:
I0814 13:13:31.179] Successful
I0814 13:13:31.179] message:error: error executing jsonpath "{.missing}": Error executing template: missing is not found. Printing more information for debugging the template:
I0814 13:13:31.179] 	template was:
I0814 13:13:31.179] 		{.missing}
I0814 13:13:31.179] 	object given to jsonpath engine was:
I0814 13:13:31.181] 		map[string]interface {}{"apiVersion":"v1", "kind":"Pod", "metadata":map[string]interface {}{"creationTimestamp":"2019-08-14T13:13:30Z", "labels":map[string]interface {}{"name":"valid-pod"}, "managedFields":[]interface {}{map[string]interface {}{"apiVersion":"v1", "fields":map[string]interface {}{"f:metadata":map[string]interface {}{"f:labels":map[string]interface {}{".":map[string]interface {}{}, "f:name":map[string]interface {}{}}}, "f:spec":map[string]interface {}{"f:containers":map[string]interface {}{"k:{\"name\":\"kubernetes-serve-hostname\"}":map[string]interface {}{".":map[string]interface {}{}, "f:image":map[string]interface {}{}, "f:imagePullPolicy":map[string]interface {}{}, "f:name":map[string]interface {}{}, "f:resources":map[string]interface {}{".":map[string]interface {}{}, "f:limits":map[string]interface {}{".":map[string]interface {}{}, "f:cpu":map[string]interface {}{}, "f:memory":map[string]interface {}{}}, "f:requests":map[string]interface {}{".":map[string]interface {}{}, "f:cpu":map[string]interface {}{}, "f:memory":map[string]interface {}{}}}, "f:terminationMessagePath":map[string]interface {}{}, "f:terminationMessagePolicy":map[string]interface {}{}}}, "f:dnsPolicy":map[string]interface {}{}, "f:enableServiceLinks":map[string]interface {}{}, "f:priority":map[string]interface {}{}, "f:restartPolicy":map[string]interface {}{}, "f:schedulerName":map[string]interface {}{}, "f:securityContext":map[string]interface {}{}, "f:terminationGracePeriodSeconds":map[string]interface {}{}}}, "manager":"kubectl", "operation":"Update", "time":"2019-08-14T13:13:30Z"}}, "name":"valid-pod", "namespace":"namespace-1565788410-4030", "resourceVersion":"688", "uid":"67f72891-294d-49b9-bef9-d639d4aea080"}, "spec":map[string]interface {}{"containers":[]interface {}{map[string]interface {}{"image":"k8s.gcr.io/serve_hostname", "imagePullPolicy":"Always", "name":"kubernetes-serve-hostname", "resources":map[string]interface {}{"limits":map[string]interface {}{"cpu":"1", "memory":"512Mi"}, "requests":map[string]interface {}{"cpu":"1", "memory":"512Mi"}}, "terminationMessagePath":"/dev/termination-log", "terminationMessagePolicy":"File"}}, "dnsPolicy":"ClusterFirst", "enableServiceLinks":true, "priority":0, "restartPolicy":"Always", "schedulerName":"default-scheduler", "securityContext":map[string]interface {}{}, "terminationGracePeriodSeconds":30}, "status":map[string]interface {}{"phase":"Pending", "qosClass":"Guaranteed"}}
I0814 13:13:31.181] has:missing is not found
I0814 13:13:31.256] Successful
I0814 13:13:31.256] message:Error executing template: template: output:1:2: executing "output" at <.missing>: map has no entry for key "missing". Printing more information for debugging the template:
I0814 13:13:31.256] 	template was:
I0814 13:13:31.257] 		{{.missing}}
I0814 13:13:31.257] 	raw data was:
I0814 13:13:31.258] 		{"apiVersion":"v1","kind":"Pod","metadata":{"creationTimestamp":"2019-08-14T13:13:30Z","labels":{"name":"valid-pod"},"managedFields":[{"apiVersion":"v1","fields":{"f:metadata":{"f:labels":{".":{},"f:name":{}}},"f:spec":{"f:containers":{"k:{\"name\":\"kubernetes-serve-hostname\"}":{".":{},"f:image":{},"f:imagePullPolicy":{},"f:name":{},"f:resources":{".":{},"f:limits":{".":{},"f:cpu":{},"f:memory":{}},"f:requests":{".":{},"f:cpu":{},"f:memory":{}}},"f:terminationMessagePath":{},"f:terminationMessagePolicy":{}}},"f:dnsPolicy":{},"f:enableServiceLinks":{},"f:priority":{},"f:restartPolicy":{},"f:schedulerName":{},"f:securityContext":{},"f:terminationGracePeriodSeconds":{}}},"manager":"kubectl","operation":"Update","time":"2019-08-14T13:13:30Z"}],"name":"valid-pod","namespace":"namespace-1565788410-4030","resourceVersion":"688","uid":"67f72891-294d-49b9-bef9-d639d4aea080"},"spec":{"containers":[{"image":"k8s.gcr.io/serve_hostname","imagePullPolicy":"Always","name":"kubernetes-serve-hostname","resources":{"limits":{"cpu":"1","memory":"512Mi"},"requests":{"cpu":"1","memory":"512Mi"}},"terminationMessagePath":"/dev/termination-log","terminationMessagePolicy":"File"}],"dnsPolicy":"ClusterFirst","enableServiceLinks":true,"priority":0,"restartPolicy":"Always","schedulerName":"default-scheduler","securityContext":{},"terminationGracePeriodSeconds":30},"status":{"phase":"Pending","qosClass":"Guaranteed"}}
I0814 13:13:31.258] 	object given to template engine was:
I0814 13:13:31.259] 		map[apiVersion:v1 kind:Pod metadata:map[creationTimestamp:2019-08-14T13:13:30Z labels:map[name:valid-pod] managedFields:[map[apiVersion:v1 fields:map[f:metadata:map[f:labels:map[.:map[] f:name:map[]]] f:spec:map[f:containers:map[k:{"name":"kubernetes-serve-hostname"}:map[.:map[] f:image:map[] f:imagePullPolicy:map[] f:name:map[] f:resources:map[.:map[] f:limits:map[.:map[] f:cpu:map[] f:memory:map[]] f:requests:map[.:map[] f:cpu:map[] f:memory:map[]]] f:terminationMessagePath:map[] f:terminationMessagePolicy:map[]]] f:dnsPolicy:map[] f:enableServiceLinks:map[] f:priority:map[] f:restartPolicy:map[] f:schedulerName:map[] f:securityContext:map[] f:terminationGracePeriodSeconds:map[]]] manager:kubectl operation:Update time:2019-08-14T13:13:30Z]] name:valid-pod namespace:namespace-1565788410-4030 resourceVersion:688 uid:67f72891-294d-49b9-bef9-d639d4aea080] spec:map[containers:[map[image:k8s.gcr.io/serve_hostname imagePullPolicy:Always name:kubernetes-serve-hostname resources:map[limits:map[cpu:1 memory:512Mi] requests:map[cpu:1 memory:512Mi]] terminationMessagePath:/dev/termination-log terminationMessagePolicy:File]] dnsPolicy:ClusterFirst enableServiceLinks:true priority:0 restartPolicy:Always schedulerName:default-scheduler securityContext:map[] terminationGracePeriodSeconds:30] status:map[phase:Pending qosClass:Guaranteed]]
I0814 13:13:31.259] has:map has no entry for key "missing"
W0814 13:13:31.360] error: error executing template "{{.missing}}": template: output:1:2: executing "output" at <.missing>: map has no entry for key "missing"
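
The two failures above contrast the template engines on the same missing field: JSONPath reports "missing is not found", while Go templates report "map has no entry for key". Reproduced against the pod from the log:

# JSONPath and go-template surface a missing field differently:
kubectl get pod valid-pod -o jsonpath='{.missing}'        # error: missing is not found
kubectl get pod valid-pod -o go-template='{{.missing}}'   # error: map has no entry for key "missing"
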
I0814 13:13:32.338] Successful
I0814 13:13:32.338] message:NAME        READY   STATUS    RESTARTS   AGE
I0814 13:13:32.338] valid-pod   0/1     Pending   0          1s
I0814 13:13:32.338] STATUS      REASON          MESSAGE
I0814 13:13:32.338] Failure     InternalError   an error on the server ("unable to decode an event from the watch stream: net/http: request canceled (Client.Timeout exceeded while reading body)") has prevented the request from succeeding
I0814 13:13:32.338] has:STATUS
I0814 13:13:32.339] Successful
I0814 13:13:32.339] message:NAME        READY   STATUS    RESTARTS   AGE
I0814 13:13:32.339] valid-pod   0/1     Pending   0          1s
I0814 13:13:32.340] STATUS      REASON          MESSAGE
I0814 13:13:32.340] Failure     InternalError   an error on the server ("unable to decode an event from the watch stream: net/http: request canceled (Client.Timeout exceeded while reading body)") has prevented the request from succeeding
I0814 13:13:32.340] has:valid-pod
I0814 13:13:33.412] Successful
I0814 13:13:33.412] message:pod/valid-pod
I0814 13:13:33.412] has not:STATUS
I0814 13:13:33.414] Successful
I0814 13:13:33.415] message:pod/valid-pod
... skipping 142 lines ...
I0814 13:13:34.505] status:
I0814 13:13:34.505]   phase: Pending
I0814 13:13:34.505]   qosClass: Guaranteed
I0814 13:13:34.505] ---
I0814 13:13:34.505] has:name: valid-pod
I0814 13:13:34.577] Successful
I0814 13:13:34.578] message:Error from server (NotFound): pods "invalid-pod" not found
I0814 13:13:34.578] has:"invalid-pod" not found
I0814 13:13:34.648] pod "valid-pod" deleted
I0814 13:13:34.736] get.sh:200: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0814 13:13:34.882] pod/redis-master created
I0814 13:13:34.886] pod/valid-pod created
I0814 13:13:34.976] Successful
... skipping 31 lines ...
I0814 13:13:36.027] +++ command: run_kubectl_exec_pod_tests
I0814 13:13:36.039] +++ [0814 13:13:36] Creating namespace namespace-1565788416-23976
I0814 13:13:36.109] namespace/namespace-1565788416-23976 created
I0814 13:13:36.172] Context "test" modified.
I0814 13:13:36.178] +++ [0814 13:13:36] Testing kubectl exec POD COMMAND
I0814 13:13:36.253] Successful
I0814 13:13:36.254] message:Error from server (NotFound): pods "abc" not found
I0814 13:13:36.254] has:pods "abc" not found
W0814 13:13:36.354] I0814 13:13:35.505529   53127 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1565788410-4030", Name:"test-the-deployment", UID:"0f7c24d2-26ce-4dfe-9953-d77aadf10d84", APIVersion:"apps/v1", ResourceVersion:"705", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set test-the-deployment-55cf944b to 3
W0814 13:13:36.355] I0814 13:13:35.509018   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788410-4030", Name:"test-the-deployment-55cf944b", UID:"cc5368d1-6b81-4ea0-814c-4bbe034d6801", APIVersion:"apps/v1", ResourceVersion:"706", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test-the-deployment-55cf944b-r58qr
W0814 13:13:36.355] I0814 13:13:35.513095   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788410-4030", Name:"test-the-deployment-55cf944b", UID:"cc5368d1-6b81-4ea0-814c-4bbe034d6801", APIVersion:"apps/v1", ResourceVersion:"706", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test-the-deployment-55cf944b-2tpr8
W0814 13:13:36.356] I0814 13:13:35.513494   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788410-4030", Name:"test-the-deployment-55cf944b", UID:"cc5368d1-6b81-4ea0-814c-4bbe034d6801", APIVersion:"apps/v1", ResourceVersion:"706", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test-the-deployment-55cf944b-zr9ds
I0814 13:13:36.456] pod/test-pod created
I0814 13:13:36.508] Successful
I0814 13:13:36.508] message:Error from server (BadRequest): pod test-pod does not have a host assigned
I0814 13:13:36.508] has not:pods "test-pod" not found
I0814 13:13:36.509] Successful
I0814 13:13:36.510] message:Error from server (BadRequest): pod test-pod does not have a host assigned
I0814 13:13:36.510] has not:pod or type/name must be specified
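
The BadRequest responses above are the point of this test: exec needs a pod that a scheduler has bound to a node, and in this environment test-pod never gets one, so the API rejects the session rather than reporting the pod missing. A hedged form of the failing call (the command argument is assumed):

# Fails with BadRequest until spec.nodeName is set by a scheduler.
kubectl exec test-pod -- date
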
I0814 13:13:36.579] pod "test-pod" deleted
I0814 13:13:36.598] +++ exit code: 0
I0814 13:13:36.631] Recording: run_kubectl_exec_resource_name_tests
I0814 13:13:36.631] Running command: run_kubectl_exec_resource_name_tests
I0814 13:13:36.651] 
... skipping 2 lines ...
I0814 13:13:36.658] +++ command: run_kubectl_exec_resource_name_tests
I0814 13:13:36.670] +++ [0814 13:13:36] Creating namespace namespace-1565788416-10910
I0814 13:13:36.737] namespace/namespace-1565788416-10910 created
I0814 13:13:36.802] Context "test" modified.
I0814 13:13:36.808] +++ [0814 13:13:36] Testing kubectl exec TYPE/NAME COMMAND
I0814 13:13:36.899] Successful
I0814 13:13:36.899] message:error: the server doesn't have a resource type "foo"
I0814 13:13:36.900] has:error:
I0814 13:13:36.981] Successful
I0814 13:13:36.982] message:Error from server (NotFound): deployments.apps "bar" not found
I0814 13:13:36.982] has:"bar" not found
I0814 13:13:37.141] pod/test-pod created
I0814 13:13:37.295] replicaset.apps/frontend created
W0814 13:13:37.396] I0814 13:13:37.300438   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788416-10910", Name:"frontend", UID:"919102ef-5d0b-477d-a8f5-f8e51a0dcc95", APIVersion:"apps/v1", ResourceVersion:"741", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-5w9vz
W0814 13:13:37.397] I0814 13:13:37.303612   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788416-10910", Name:"frontend", UID:"919102ef-5d0b-477d-a8f5-f8e51a0dcc95", APIVersion:"apps/v1", ResourceVersion:"741", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-q5zmj
W0814 13:13:37.397] I0814 13:13:37.303942   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788416-10910", Name:"frontend", UID:"919102ef-5d0b-477d-a8f5-f8e51a0dcc95", APIVersion:"apps/v1", ResourceVersion:"741", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-jjwf6
I0814 13:13:37.498] configmap/test-set-env-config created
I0814 13:13:37.529] Successful
I0814 13:13:37.530] message:error: cannot attach to *v1.ConfigMap: selector for *v1.ConfigMap not implemented
I0814 13:13:37.530] has:not implemented
I0814 13:13:37.631] Successful
I0814 13:13:37.632] message:Error from server (BadRequest): pod test-pod does not have a host assigned
I0814 13:13:37.632] has not:not found
I0814 13:13:37.634] Successful
I0814 13:13:37.634] message:Error from server (BadRequest): pod test-pod does not have a host assigned
I0814 13:13:37.634] has not:pod or type/name must be specified
I0814 13:13:37.733] Successful
I0814 13:13:37.733] message:Error from server (BadRequest): pod frontend-5w9vz does not have a host assigned
I0814 13:13:37.733] has not:not found
I0814 13:13:37.735] Successful
I0814 13:13:37.735] message:Error from server (BadRequest): pod frontend-5w9vz does not have a host assigned
I0814 13:13:37.735] has not:pod or type/name must be specified
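
For TYPE/NAME, kubectl resolves a concrete pod through the resource's selector, which is why the ConfigMap is rejected outright ("selector for *v1.ConfigMap not implemented") while rs/frontend resolves to frontend-5w9vz and then fails on scheduling exactly as the bare pod did. Hedged forms of the two calls:

kubectl exec rs/frontend -- date                      # resolves a pod via the selector, then BadRequest: no host
kubectl exec configmap/test-set-env-config -- date    # error: a ConfigMap has no selector to resolve
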
I0814 13:13:37.805] pod "test-pod" deleted
I0814 13:13:37.880] replicaset.apps "frontend" deleted
I0814 13:13:37.956] configmap "test-set-env-config" deleted
I0814 13:13:37.973] +++ exit code: 0
I0814 13:13:38.008] Recording: run_create_secret_tests
I0814 13:13:38.008] Running command: run_create_secret_tests
I0814 13:13:38.028] 
I0814 13:13:38.030] +++ Running case: test-cmd.run_create_secret_tests 
I0814 13:13:38.032] +++ working dir: /go/src/k8s.io/kubernetes
I0814 13:13:38.034] +++ command: run_create_secret_tests
I0814 13:13:38.121] Successful
I0814 13:13:38.121] message:Error from server (NotFound): secrets "mysecret" not found
I0814 13:13:38.121] has:secrets "mysecret" not found
I0814 13:13:38.267] Successful
I0814 13:13:38.267] message:Error from server (NotFound): secrets "mysecret" not found
I0814 13:13:38.267] has:secrets "mysecret" not found
I0814 13:13:38.269] Successful
I0814 13:13:38.269] message:user-specified
I0814 13:13:38.269] has:user-specified
I0814 13:13:38.341] Successful
I0814 13:13:38.415] {"kind":"ConfigMap","apiVersion":"v1","metadata":{"name":"tester-update-cm","namespace":"default","uid":"eef8043c-a262-4474-b19a-79d0524be8e8","resourceVersion":"762","creationTimestamp":"2019-08-14T13:13:38Z"}}
... skipping 2 lines ...
I0814 13:13:38.576] has:uid
I0814 13:13:38.648] Successful
I0814 13:13:38.649] message:{"kind":"ConfigMap","apiVersion":"v1","metadata":{"name":"tester-update-cm","namespace":"default","uid":"eef8043c-a262-4474-b19a-79d0524be8e8","resourceVersion":"763","creationTimestamp":"2019-08-14T13:13:38Z","managedFields":[{"manager":"kubectl","operation":"Update","apiVersion":"v1","time":"2019-08-14T13:13:38Z","fields":{"f:data":{"f:key1":{},".":{}}}}]},"data":{"key1":"config1"}}
I0814 13:13:38.649] has:config1
I0814 13:13:38.718] {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Success","details":{"name":"tester-update-cm","kind":"configmaps","uid":"eef8043c-a262-4474-b19a-79d0524be8e8"}}
I0814 13:13:38.802] Successful
I0814 13:13:38.802] message:Error from server (NotFound): configmaps "tester-update-cm" not found
I0814 13:13:38.802] has:configmaps "tester-update-cm" not found
I0814 13:13:38.814] +++ exit code: 0
I0814 13:13:38.847] Recording: run_kubectl_create_kustomization_directory_tests
I0814 13:13:38.848] Running command: run_kubectl_create_kustomization_directory_tests
I0814 13:13:38.868] 
I0814 13:13:38.870] +++ Running case: test-cmd.run_kubectl_create_kustomization_directory_tests 
... skipping 157 lines ...
W0814 13:13:41.396] I0814 13:13:39.317213   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788416-10910", Name:"test-the-deployment-55cf944b", UID:"35d35746-6dd3-4ac3-8f9e-7d4636c7870d", APIVersion:"apps/v1", ResourceVersion:"771", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test-the-deployment-55cf944b-wr97d
W0814 13:13:41.396] I0814 13:13:39.318788   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788416-10910", Name:"test-the-deployment-55cf944b", UID:"35d35746-6dd3-4ac3-8f9e-7d4636c7870d", APIVersion:"apps/v1", ResourceVersion:"771", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test-the-deployment-55cf944b-dz4kk
I0814 13:13:42.373] Successful
I0814 13:13:42.374] message:NAME        READY   STATUS    RESTARTS   AGE
I0814 13:13:42.374] valid-pod   0/1     Pending   0          0s
I0814 13:13:42.374] STATUS      REASON          MESSAGE
I0814 13:13:42.374] Failure     InternalError   an error on the server ("unable to decode an event from the watch stream: net/http: request canceled (Client.Timeout exceeded while reading body)") has prevented the request from succeeding
I0814 13:13:42.374] has:Timeout exceeded while reading body
I0814 13:13:42.449] Successful
I0814 13:13:42.450] message:NAME        READY   STATUS    RESTARTS   AGE
I0814 13:13:42.450] valid-pod   0/1     Pending   0          1s
I0814 13:13:42.450] has:valid-pod
I0814 13:13:42.517] Successful
I0814 13:13:42.517] message:error: Invalid timeout value. Timeout must be a single integer in seconds, or an integer followed by a corresponding time unit (e.g. 1s | 2m | 3h)
I0814 13:13:42.518] has:Invalid timeout value
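
The parser accepts either bare seconds or an integer with a unit, exactly as the error text spells out. Hedged valid forms (the flag is assumed from the surrounding tests):

kubectl get pod valid-pod --request-timeout=30    # bare integer, read as seconds
kubectl get pod valid-pod --request-timeout=1m    # integer plus a time unit
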
I0814 13:13:42.591] pod "valid-pod" deleted
I0814 13:13:42.609] +++ exit code: 0
I0814 13:13:42.643] Recording: run_crd_tests
I0814 13:13:42.644] Running command: run_crd_tests
I0814 13:13:42.663] 
... skipping 225 lines ...
I0814 13:13:46.848] foo.company.com/test patched
I0814 13:13:46.934] crd.sh:236: Successful get foos/test {{.patched}}: value1
I0814 13:13:47.011] foo.company.com/test patched
I0814 13:13:47.097] crd.sh:238: Successful get foos/test {{.patched}}: value2
I0814 13:13:47.180] foo.company.com/test patched
I0814 13:13:47.268] crd.sh:240: Successful get foos/test {{.patched}}: <no value>
I0814 13:13:47.415] +++ [0814 13:13:47] "kubectl patch --local" returns error as expected for CustomResource: error: cannot apply strategic merge patch for company.com/v1, Kind=Foo locally, try --type merge
I0814 13:13:47.477] {
I0814 13:13:47.478]     "apiVersion": "company.com/v1",
I0814 13:13:47.478]     "kind": "Foo",
I0814 13:13:47.478]     "metadata": {
I0814 13:13:47.478]         "annotations": {
I0814 13:13:47.478]             "kubernetes.io/change-cause": "kubectl patch foos/test --server=http://127.0.0.1:8080 --match-server-version=true --patch={\"patched\":null} --type=merge --record=true"
... skipping 345 lines ...
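
Strategic merge patch needs the compiled-in Go struct schema, which custom resources do not carry, so kubectl refuses it locally and the test falls back to a JSON merge patch; the recorded change-cause above shows the working invocation. Its essential form (server flags dropped):

# A JSON merge patch works where strategic merge is rejected for CRs:
kubectl patch foos/test --type=merge --record=true -p '{"patched":null}'
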
I0814 13:14:18.161] crd.sh:455: Successful get bars {{len .items}}: 1
I0814 13:14:18.234] namespace "non-native-resources" deleted
I0814 13:14:23.427] crd.sh:458: Successful get bars {{len .items}}: 0
I0814 13:14:23.587] customresourcedefinition.apiextensions.k8s.io "foos.company.com" deleted
I0814 13:14:23.680] customresourcedefinition.apiextensions.k8s.io "bars.company.com" deleted
I0814 13:14:23.773] customresourcedefinition.apiextensions.k8s.io "resources.mygroup.example.com" deleted
W0814 13:14:23.874] Error from server (NotFound): namespaces "non-native-resources" not found
I0814 13:14:23.974] customresourcedefinition.apiextensions.k8s.io "validfoos.company.com" deleted
I0814 13:14:23.975] +++ exit code: 0
I0814 13:14:23.975] Recording: run_cmd_with_img_tests
I0814 13:14:23.975] Running command: run_cmd_with_img_tests
I0814 13:14:23.975] 
I0814 13:14:23.975] +++ Running case: test-cmd.run_cmd_with_img_tests 
... skipping 5 lines ...
I0814 13:14:24.111] +++ [0814 13:14:24] Testing cmd with image
I0814 13:14:24.197] Successful
I0814 13:14:24.198] message:deployment.apps/test1 created
I0814 13:14:24.198] has:deployment.apps/test1 created
I0814 13:14:24.269] deployment.apps "test1" deleted
I0814 13:14:24.348] Successful
I0814 13:14:24.349] message:error: Invalid image name "InvalidImageName": invalid reference format
I0814 13:14:24.349] has:error: Invalid image name "InvalidImageName": invalid reference format
I0814 13:14:24.362] +++ exit code: 0
I0814 13:14:24.399] +++ [0814 13:14:24] Testing recursive resources
I0814 13:14:24.405] +++ [0814 13:14:24] Creating namespace namespace-1565788464-9851
I0814 13:14:24.470] namespace/namespace-1565788464-9851 created
I0814 13:14:24.537] Context "test" modified.
I0814 13:14:24.619] generic-resources.sh:202: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0814 13:14:24.898] generic-resources.sh:206: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0814 13:14:24.900] Successful
I0814 13:14:24.900] message:pod/busybox0 created
I0814 13:14:24.901] pod/busybox1 created
I0814 13:14:24.901] error: error validating "hack/testdata/recursive/pod/pod/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
I0814 13:14:24.901] has:error validating data: kind not set
I0814 13:14:24.982] generic-resources.sh:211: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0814 13:14:25.175] generic-resources.sh:220: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: busybox:busybox:
I0814 13:14:25.178] Successful
I0814 13:14:25.178] message:error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I0814 13:14:25.178] has:Object 'Kind' is missing
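
The fixture tree is broken on purpose: busybox-broken.yaml spells the kind field as "ind", so the decoder reports Object 'Kind' is missing while busybox0 and busybox1 are still created. The pattern under test, in hedged form (exact flags assumed):

# Recursive create over a directory containing one deliberately broken
# manifest: valid files are applied, the broken one yields the decode error.
kubectl create -f hack/testdata/recursive/pod --recursive
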
I0814 13:14:25.264] generic-resources.sh:227: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0814 13:14:25.545] generic-resources.sh:231: Successful get pods {{range.items}}{{.metadata.labels.status}}:{{end}}: replaced:replaced:
I0814 13:14:25.546] Successful
I0814 13:14:25.547] message:pod/busybox0 replaced
I0814 13:14:25.547] pod/busybox1 replaced
I0814 13:14:25.547] error: error validating "hack/testdata/recursive/pod-modify/pod/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
I0814 13:14:25.547] has:error validating data: kind not set
I0814 13:14:25.630] generic-resources.sh:236: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0814 13:14:25.716] Successful
I0814 13:14:25.716] message:Name:         busybox0
I0814 13:14:25.716] Namespace:    namespace-1565788464-9851
I0814 13:14:25.716] Priority:     0
I0814 13:14:25.716] Node:         <none>
... skipping 159 lines ...
I0814 13:14:25.730] has:Object 'Kind' is missing
I0814 13:14:25.805] generic-resources.sh:246: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0814 13:14:25.969] generic-resources.sh:250: Successful get pods {{range.items}}{{.metadata.annotations.annotatekey}}:{{end}}: annotatevalue:annotatevalue:
I0814 13:14:25.972] Successful
I0814 13:14:25.972] message:pod/busybox0 annotated
I0814 13:14:25.972] pod/busybox1 annotated
I0814 13:14:25.972] error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I0814 13:14:25.972] has:Object 'Kind' is missing
I0814 13:14:26.058] generic-resources.sh:255: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0814 13:14:26.314] generic-resources.sh:259: Successful get pods {{range.items}}{{.metadata.labels.status}}:{{end}}: replaced:replaced:
I0814 13:14:26.317] Successful
I0814 13:14:26.317] message:Warning: kubectl apply should be used on resource created by either kubectl create --save-config or kubectl apply
I0814 13:14:26.317] pod/busybox0 configured
I0814 13:14:26.318] Warning: kubectl apply should be used on resource created by either kubectl create --save-config or kubectl apply
I0814 13:14:26.318] pod/busybox1 configured
I0814 13:14:26.318] error: error validating "hack/testdata/recursive/pod-modify/pod/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
I0814 13:14:26.318] has:error validating data: kind not set
I0814 13:14:26.401] generic-resources.sh:265: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
I0814 13:14:26.538] deployment.apps/nginx created
I0814 13:14:26.632] generic-resources.sh:269: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx:
I0814 13:14:26.712] generic-resources.sh:270: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
I0814 13:14:26.860] generic-resources.sh:274: Successful get deployment nginx {{ .apiVersion }}: apps/v1
I0814 13:14:26.863] Successful
... skipping 42 lines ...
I0814 13:14:26.935] deployment.apps "nginx" deleted
I0814 13:14:27.029] generic-resources.sh:281: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0814 13:14:27.181] generic-resources.sh:285: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0814 13:14:27.183] Successful
I0814 13:14:27.183] message:kubectl convert is DEPRECATED and will be removed in a future version.
I0814 13:14:27.183] In order to convert, kubectl apply the object to the cluster, then kubectl get at the desired version.
I0814 13:14:27.183] error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I0814 13:14:27.184] has:Object 'Kind' is missing
I0814 13:14:27.269] generic-resources.sh:290: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0814 13:14:27.347] Successful
I0814 13:14:27.347] message:busybox0:busybox1:error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I0814 13:14:27.347] has:busybox0:busybox1:
I0814 13:14:27.350] Successful
I0814 13:14:27.350] message:busybox0:busybox1:error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I0814 13:14:27.350] has:Object 'Kind' is missing
I0814 13:14:27.430] generic-resources.sh:299: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0814 13:14:27.518] pod/busybox0 labeled
I0814 13:14:27.518] pod/busybox1 labeled
I0814 13:14:27.519] error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I0814 13:14:27.600] generic-resources.sh:304: Successful get pods {{range.items}}{{.metadata.labels.mylabel}}:{{end}}: myvalue:myvalue:
I0814 13:14:27.602] Successful
I0814 13:14:27.602] message:pod/busybox0 labeled
I0814 13:14:27.603] pod/busybox1 labeled
I0814 13:14:27.603] error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I0814 13:14:27.604] has:Object 'Kind' is missing
I0814 13:14:27.694] generic-resources.sh:309: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0814 13:14:27.778] pod/busybox0 patched
I0814 13:14:27.778] pod/busybox1 patched
I0814 13:14:27.779] error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I0814 13:14:27.862] generic-resources.sh:314: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: prom/busybox:prom/busybox:
I0814 13:14:27.865] Successful
I0814 13:14:27.865] message:pod/busybox0 patched
I0814 13:14:27.865] pod/busybox1 patched
I0814 13:14:27.866] error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I0814 13:14:27.866] has:Object 'Kind' is missing
I0814 13:14:27.947] generic-resources.sh:319: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0814 13:14:28.112] generic-resources.sh:323: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0814 13:14:28.114] Successful
I0814 13:14:28.114] message:warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
I0814 13:14:28.114] pod "busybox0" force deleted
I0814 13:14:28.114] pod "busybox1" force deleted
I0814 13:14:28.115] error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I0814 13:14:28.115] has:Object 'Kind' is missing
I0814 13:14:28.193] generic-resources.sh:328: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
I0814 13:14:28.327] replicationcontroller/busybox0 created
I0814 13:14:28.331] replicationcontroller/busybox1 created
I0814 13:14:28.427] generic-resources.sh:332: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0814 13:14:28.512] generic-resources.sh:337: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0814 13:14:28.592] generic-resources.sh:338: Successful get rc busybox0 {{.spec.replicas}}: 1
I0814 13:14:28.678] generic-resources.sh:339: Successful get rc busybox1 {{.spec.replicas}}: 1
I0814 13:14:28.843] generic-resources.sh:344: Successful get hpa busybox0 {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 1 2 80
I0814 13:14:28.923] generic-resources.sh:345: Successful get hpa busybox1 {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 1 2 80
I0814 13:14:28.925] Successful
I0814 13:14:28.925] message:horizontalpodautoscaler.autoscaling/busybox0 autoscaled
I0814 13:14:28.925] horizontalpodautoscaler.autoscaling/busybox1 autoscaled
I0814 13:14:28.926] error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I0814 13:14:28.926] has:Object 'Kind' is missing
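
The HPA checks above (min 1, max 2, target 80%) come from autoscaling the same recursive rc tree, with the broken manifest again failing to decode. A hedged form of the command:

# Autoscale every replication controller found under the tree (flags assumed):
kubectl autoscale -f hack/testdata/recursive/rc --recursive --min=1 --max=2 --cpu-percent=80
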
I0814 13:14:28.999] horizontalpodautoscaler.autoscaling "busybox0" deleted
I0814 13:14:29.072] horizontalpodautoscaler.autoscaling "busybox1" deleted
I0814 13:14:29.157] generic-resources.sh:353: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0814 13:14:29.240] generic-resources.sh:354: Successful get rc busybox0 {{.spec.replicas}}: 1
I0814 13:14:29.423] generic-resources.sh:355: Successful get rc busybox1 {{.spec.replicas}}: 1
I0814 13:14:29.596] generic-resources.sh:359: Successful get service busybox0 {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: <no value> 80
I0814 13:14:29.678] generic-resources.sh:360: Successful get service busybox1 {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: <no value> 80
I0814 13:14:29.681] Successful
I0814 13:14:29.681] message:service/busybox0 exposed
I0814 13:14:29.681] service/busybox1 exposed
I0814 13:14:29.681] error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I0814 13:14:29.682] has:Object 'Kind' is missing
I0814 13:14:29.766] generic-resources.sh:366: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0814 13:14:29.846] generic-resources.sh:367: Successful get rc busybox0 {{.spec.replicas}}: 1
I0814 13:14:29.929] generic-resources.sh:368: Successful get rc busybox1 {{.spec.replicas}}: 1
I0814 13:14:30.109] generic-resources.sh:372: Successful get rc busybox0 {{.spec.replicas}}: 2
I0814 13:14:30.192] generic-resources.sh:373: Successful get rc busybox1 {{.spec.replicas}}: 2
I0814 13:14:30.194] Successful
I0814 13:14:30.194] message:replicationcontroller/busybox0 scaled
I0814 13:14:30.194] replicationcontroller/busybox1 scaled
I0814 13:14:30.194] error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I0814 13:14:30.195] has:Object 'Kind' is missing
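
The replica counts flip from 1 to 2 above because scale also walks the tree and applies the same change to both rcs. A hedged form of the invocation:

# Scale both rcs in one recursive call; --current-replicas guards against races (flags assumed):
kubectl scale --current-replicas=1 --replicas=2 -f hack/testdata/recursive/rc --recursive
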
I0814 13:14:30.279] generic-resources.sh:378: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0814 13:14:30.446] generic-resources.sh:382: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
I0814 13:14:30.448] Successful
I0814 13:14:30.449] message:warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
I0814 13:14:30.449] replicationcontroller "busybox0" force deleted
I0814 13:14:30.449] replicationcontroller "busybox1" force deleted
I0814 13:14:30.449] error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I0814 13:14:30.450] has:Object 'Kind' is missing
I0814 13:14:30.530] generic-resources.sh:387: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
I0814 13:14:30.677] deployment.apps/nginx1-deployment created
I0814 13:14:30.683] deployment.apps/nginx0-deployment created
W0814 13:14:30.784] kubectl run --generator=deployment/apps.v1 is DEPRECATED and will be removed in a future version. Use kubectl run --generator=run-pod/v1 or kubectl create instead.
W0814 13:14:30.784] I0814 13:14:24.187497   53127 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1565788463-4208", Name:"test1", UID:"20a7103f-1e7e-49c5-b0ed-11c0f2783155", APIVersion:"apps/v1", ResourceVersion:"921", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set test1-9797f89d8 to 1
W0814 13:14:30.784] I0814 13:14:24.195419   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788463-4208", Name:"test1-9797f89d8", UID:"97c4bb59-1f67-4cba-a156-57eaaeeb962a", APIVersion:"apps/v1", ResourceVersion:"922", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test1-9797f89d8-2sq5g
W0814 13:14:30.785] W0814 13:14:24.597941   49644 cacher.go:154] Terminating all watchers from cacher *unstructured.Unstructured
W0814 13:14:30.785] E0814 13:14:24.599633   53127 reflector.go:282] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to watch *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:14:30.785] W0814 13:14:24.689933   49644 cacher.go:154] Terminating all watchers from cacher *unstructured.Unstructured
W0814 13:14:30.785] E0814 13:14:24.691080   53127 reflector.go:282] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to watch *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:14:30.785] W0814 13:14:24.784895   49644 cacher.go:154] Terminating all watchers from cacher *unstructured.Unstructured
W0814 13:14:30.785] E0814 13:14:24.786659   53127 reflector.go:282] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to watch *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:14:30.786] W0814 13:14:24.886208   49644 cacher.go:154] Terminating all watchers from cacher *unstructured.Unstructured
W0814 13:14:30.786] E0814 13:14:24.888028   53127 reflector.go:282] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to watch *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:14:30.786] E0814 13:14:25.600936   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:14:30.786] E0814 13:14:25.692483   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:14:30.786] E0814 13:14:25.788202   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:14:30.787] E0814 13:14:25.889112   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:14:30.787] I0814 13:14:26.543009   53127 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1565788464-9851", Name:"nginx", UID:"9799ba50-34e2-4856-a03d-4c2b890ab6cc", APIVersion:"apps/v1", ResourceVersion:"947", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-bbbbb95b5 to 3
W0814 13:14:30.787] I0814 13:14:26.547250   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788464-9851", Name:"nginx-bbbbb95b5", UID:"7f3d976c-4cf6-4c81-9f9b-72f7f466f55d", APIVersion:"apps/v1", ResourceVersion:"948", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-bbbbb95b5-5h5vc
W0814 13:14:30.787] I0814 13:14:26.550062   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788464-9851", Name:"nginx-bbbbb95b5", UID:"7f3d976c-4cf6-4c81-9f9b-72f7f466f55d", APIVersion:"apps/v1", ResourceVersion:"948", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-bbbbb95b5-mdhfb
W0814 13:14:30.788] I0814 13:14:26.551169   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788464-9851", Name:"nginx-bbbbb95b5", UID:"7f3d976c-4cf6-4c81-9f9b-72f7f466f55d", APIVersion:"apps/v1", ResourceVersion:"948", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-bbbbb95b5-phxhf
W0814 13:14:30.788] E0814 13:14:26.602243   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:14:30.788] E0814 13:14:26.694060   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:14:30.788] kubectl convert is DEPRECATED and will be removed in a future version.
W0814 13:14:30.788] In order to convert, kubectl apply the object to the cluster, then kubectl get at the desired version.
W0814 13:14:30.789] E0814 13:14:26.789952   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:14:30.789] E0814 13:14:26.890448   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:14:30.789] E0814 13:14:27.603617   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:14:30.789] E0814 13:14:27.695270   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:14:30.789] E0814 13:14:27.791273   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:14:30.790] E0814 13:14:27.891709   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:14:30.790] I0814 13:14:28.331171   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1565788464-9851", Name:"busybox0", UID:"e55a0a5a-5b71-49ad-87c7-5482510ab743", APIVersion:"v1", ResourceVersion:"978", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox0-w6drt
W0814 13:14:30.790] error: error validating "hack/testdata/recursive/rc/rc/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
W0814 13:14:30.790] I0814 13:14:28.334998   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1565788464-9851", Name:"busybox1", UID:"74a1f8b6-b15e-4339-aab3-074640b93194", APIVersion:"v1", ResourceVersion:"980", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox1-cfjdd
W0814 13:14:30.791] E0814 13:14:28.604779   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:14:30.791] E0814 13:14:28.696773   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:14:30.791] E0814 13:14:28.792470   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:14:30.791] E0814 13:14:28.893005   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:14:30.791] E0814 13:14:29.606175   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:14:30.792] E0814 13:14:29.698214   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:14:30.792] E0814 13:14:29.793688   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:14:30.792] E0814 13:14:29.894205   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:14:30.792] I0814 13:14:30.017353   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1565788464-9851", Name:"busybox0", UID:"e55a0a5a-5b71-49ad-87c7-5482510ab743", APIVersion:"v1", ResourceVersion:"999", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox0-2v7sf
W0814 13:14:30.793] I0814 13:14:30.025565   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1565788464-9851", Name:"busybox1", UID:"74a1f8b6-b15e-4339-aab3-074640b93194", APIVersion:"v1", ResourceVersion:"1003", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox1-s4x8z
W0814 13:14:30.793] E0814 13:14:30.607661   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:14:30.793] I0814 13:14:30.680916   53127 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1565788464-9851", Name:"nginx1-deployment", UID:"310c7e81-979a-4780-b8f3-9defa7833870", APIVersion:"apps/v1", ResourceVersion:"1019", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx1-deployment-84f7f49fb7 to 2
W0814 13:14:30.793] I0814 13:14:30.684692   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788464-9851", Name:"nginx1-deployment-84f7f49fb7", UID:"9b1a28e6-87e5-4446-99cb-836a20ec6411", APIVersion:"apps/v1", ResourceVersion:"1020", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx1-deployment-84f7f49fb7-llm5l
W0814 13:14:30.794] error: error validating "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
W0814 13:14:30.794] I0814 13:14:30.686337   53127 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1565788464-9851", Name:"nginx0-deployment", UID:"af7c361c-4b5a-4982-b076-dd21818d9a5f", APIVersion:"apps/v1", ResourceVersion:"1021", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx0-deployment-57475bf54d to 2
W0814 13:14:30.794] I0814 13:14:30.691014   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788464-9851", Name:"nginx1-deployment-84f7f49fb7", UID:"9b1a28e6-87e5-4446-99cb-836a20ec6411", APIVersion:"apps/v1", ResourceVersion:"1020", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx1-deployment-84f7f49fb7-4dwds
W0814 13:14:30.795] I0814 13:14:30.697030   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788464-9851", Name:"nginx0-deployment-57475bf54d", UID:"3ee42d99-34eb-4d62-a1aa-b589571b4b4a", APIVersion:"apps/v1", ResourceVersion:"1025", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx0-deployment-57475bf54d-dhjzt
W0814 13:14:30.795] E0814 13:14:30.700874   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:14:30.795] I0814 13:14:30.701017   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788464-9851", Name:"nginx0-deployment-57475bf54d", UID:"3ee42d99-34eb-4d62-a1aa-b589571b4b4a", APIVersion:"apps/v1", ResourceVersion:"1025", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx0-deployment-57475bf54d-xxcx6
W0814 13:14:30.795] E0814 13:14:30.794916   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:14:30.896] E0814 13:14:30.895767   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0814 13:14:30.997] generic-resources.sh:391: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx0-deployment:nginx1-deployment:
I0814 13:14:30.997] generic-resources.sh:392: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:k8s.gcr.io/nginx:1.7.9:
I0814 13:14:31.061] generic-resources.sh:396: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:k8s.gcr.io/nginx:1.7.9:
I0814 13:14:31.063] Successful
I0814 13:14:31.063] message:deployment.apps/nginx1-deployment skipped rollback (current template already matches revision 1)
I0814 13:14:31.064] deployment.apps/nginx0-deployment skipped rollback (current template already matches revision 1)
I0814 13:14:31.064] error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
I0814 13:14:31.064] has:Object 'Kind' is missing
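
Each "Successful / message: ... / has: ..." block in this log is a test-cmd substring assertion: the harness captures a kubectl command's output as "message:" and passes if it contains the "has:" string. The failures above are deliberate: nginx-broken.yaml spells the required "kind" key as "ind". A minimal sketch of the two code paths being exercised, using the fixture directory from the log (the exact flags are an assumption):

  # Client-side validation rejects the manifest outright ("kind not set") ...
  kubectl apply -f hack/testdata/recursive/deployment -R
  # ... while disabling validation lets the valid manifests through and surfaces
  # a decode error ("Object 'Kind' is missing") only for the broken file.
  kubectl apply -f hack/testdata/recursive/deployment -R --validate=false
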
I0814 13:14:31.144] deployment.apps/nginx1-deployment paused
I0814 13:14:31.149] deployment.apps/nginx0-deployment paused
I0814 13:14:31.243] generic-resources.sh:404: Successful get deployment {{range.items}}{{.spec.paused}}:{{end}}: true:true:
I0814 13:14:31.245] Successful
I0814 13:14:31.245] message:unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
... skipping 10 lines ...
I0814 13:14:31.526] 1         <none>
I0814 13:14:31.526] 
I0814 13:14:31.527] deployment.apps/nginx0-deployment 
I0814 13:14:31.527] REVISION  CHANGE-CAUSE
I0814 13:14:31.527] 1         <none>
I0814 13:14:31.527] 
I0814 13:14:31.527] error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
I0814 13:14:31.527] has:nginx0-deployment
I0814 13:14:31.529] Successful
I0814 13:14:31.529] message:deployment.apps/nginx1-deployment 
I0814 13:14:31.529] REVISION  CHANGE-CAUSE
I0814 13:14:31.529] 1         <none>
I0814 13:14:31.529] 
I0814 13:14:31.529] deployment.apps/nginx0-deployment 
I0814 13:14:31.529] REVISION  CHANGE-CAUSE
I0814 13:14:31.529] 1         <none>
I0814 13:14:31.529] 
I0814 13:14:31.530] error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
I0814 13:14:31.530] has:nginx1-deployment
I0814 13:14:31.532] Successful
I0814 13:14:31.532] message:deployment.apps/nginx1-deployment 
I0814 13:14:31.532] REVISION  CHANGE-CAUSE
I0814 13:14:31.532] 1         <none>
I0814 13:14:31.532] 
I0814 13:14:31.532] deployment.apps/nginx0-deployment 
I0814 13:14:31.532] REVISION  CHANGE-CAUSE
I0814 13:14:31.532] 1         <none>
I0814 13:14:31.532] 
I0814 13:14:31.533] error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
I0814 13:14:31.533] has:Object 'Kind' is missing
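
The REVISION/CHANGE-CAUSE tables above come from kubectl rollout history, which prints one table per matched deployment; CHANGE-CAUSE is read from the kubernetes.io/change-cause annotation and falls back to <none> when unset, as here. The single-resource form of the same query:

  kubectl rollout history deployment/nginx1-deployment
  # deployment.apps/nginx1-deployment
  # REVISION  CHANGE-CAUSE
  # 1         <none>
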
I0814 13:14:31.605] deployment.apps "nginx1-deployment" force deleted
I0814 13:14:31.609] deployment.apps "nginx0-deployment" force deleted
W0814 13:14:31.709] warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
W0814 13:14:31.710] E0814 13:14:31.608487   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:14:31.710] error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
W0814 13:14:31.711] E0814 13:14:31.702257   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:14:31.797] E0814 13:14:31.796241   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:14:31.897] E0814 13:14:31.896972   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:14:32.610] E0814 13:14:32.610111   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:14:32.703] E0814 13:14:32.703486   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:14:32.798] E0814 13:14:32.797643   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:14:32.853] error: error validating "hack/testdata/recursive/rc/rc/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
W0814 13:14:32.854] I0814 13:14:32.853873   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1565788464-9851", Name:"busybox0", UID:"0a4191c0-0af4-49ba-b2c8-18c2ae45b3e3", APIVersion:"v1", ResourceVersion:"1068", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox0-v2d4h
W0814 13:14:32.857] I0814 13:14:32.857093   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1565788464-9851", Name:"busybox1", UID:"899b96fe-7bcf-4561-a328-df085de54d06", APIVersion:"v1", ResourceVersion:"1070", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox1-rdfjt
W0814 13:14:32.899] E0814 13:14:32.898437   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0814 13:14:32.999] generic-resources.sh:426: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
I0814 13:14:33.000] replicationcontroller/busybox0 created
I0814 13:14:33.000] replicationcontroller/busybox1 created
I0814 13:14:33.000] generic-resources.sh:430: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0814 13:14:33.031] Successful
I0814 13:14:33.031] message:no rollbacker has been implemented for "ReplicationController"
... skipping 4 lines ...
I0814 13:14:33.034] message:no rollbacker has been implemented for "ReplicationController"
I0814 13:14:33.034] no rollbacker has been implemented for "ReplicationController"
I0814 13:14:33.035] unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I0814 13:14:33.035] has:Object 'Kind' is missing
I0814 13:14:33.122] Successful
I0814 13:14:33.123] message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I0814 13:14:33.123] error: replicationcontrollers "busybox0" pausing is not supported
I0814 13:14:33.123] error: replicationcontrollers "busybox1" pausing is not supported
I0814 13:14:33.123] has:Object 'Kind' is missing
I0814 13:14:33.125] Successful
I0814 13:14:33.125] message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I0814 13:14:33.125] error: replicationcontrollers "busybox0" pausing is not supported
I0814 13:14:33.126] error: replicationcontrollers "busybox1" pausing is not supported
I0814 13:14:33.126] has:replicationcontrollers "busybox0" pausing is not supported
I0814 13:14:33.127] Successful
I0814 13:14:33.127] message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I0814 13:14:33.127] error: replicationcontrollers "busybox0" pausing is not supported
I0814 13:14:33.128] error: replicationcontrollers "busybox1" pausing is not supported
I0814 13:14:33.128] has:replicationcontrollers "busybox1" pausing is not supported
I0814 13:14:33.215] Successful
I0814 13:14:33.216] message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I0814 13:14:33.216] error: replicationcontrollers "busybox0" resuming is not supported
I0814 13:14:33.216] error: replicationcontrollers "busybox1" resuming is not supported
I0814 13:14:33.216] has:Object 'Kind' is missing
I0814 13:14:33.217] Successful
I0814 13:14:33.218] message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I0814 13:14:33.218] error: replicationcontrollers "busybox0" resuming is not supported
I0814 13:14:33.218] error: replicationcontrollers "busybox1" resuming is not supported
I0814 13:14:33.218] has:replicationcontrollers "busybox0" resuming is not supported
I0814 13:14:33.220] Successful
I0814 13:14:33.221] message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I0814 13:14:33.221] error: replicationcontrollers "busybox0" resuming is not supported
I0814 13:14:33.221] error: replicationcontrollers "busybox1" resuming is not supported
I0814 13:14:33.221] has:replicationcontrollers "busybox0" resuming is not supported
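
kubectl rollout pause/resume only works for kinds that implement the corresponding interface (Deployments in this run); for ReplicationControllers every object gets its own "pausing/resuming is not supported" error, which is what the assertions above match. The contrast, using names from the log:

  kubectl rollout pause deployment/nginx1-deployment   # deployment.apps/nginx1-deployment paused
  kubectl rollout pause rc/busybox0                    # error: replicationcontrollers "busybox0" pausing is not supported
  kubectl rollout resume rc/busybox0                   # error: replicationcontrollers "busybox0" resuming is not supported
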
I0814 13:14:33.294] replicationcontroller "busybox0" force deleted
I0814 13:14:33.299] replicationcontroller "busybox1" force deleted
W0814 13:14:33.399] warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
W0814 13:14:33.400] error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
W0814 13:14:33.612] E0814 13:14:33.612005   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:14:33.705] E0814 13:14:33.705011   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:14:33.799] E0814 13:14:33.798816   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:14:33.900] E0814 13:14:33.900216   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
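
The "Immediate deletion does not wait..." warning above is kubectl's standard response to a forced deletion, which removes the API object without waiting for the workload to terminate. The likely shape of the command (the flag combination is an assumption):

  kubectl delete rc busybox0 busybox1 --force --grace-period=0
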
I0814 13:14:34.306] Recording: run_namespace_tests
I0814 13:14:34.306] Running command: run_namespace_tests
I0814 13:14:34.329] 
I0814 13:14:34.330] +++ Running case: test-cmd.run_namespace_tests 
I0814 13:14:34.333] +++ working dir: /go/src/k8s.io/kubernetes
I0814 13:14:34.335] +++ command: run_namespace_tests
I0814 13:14:34.345] +++ [0814 13:14:34] Testing kubectl(v1:namespaces)
I0814 13:14:34.416] namespace/my-namespace created
I0814 13:14:34.504] core.sh:1308: Successful get namespaces/my-namespace {{.metadata.name}}: my-namespace
I0814 13:14:34.577] namespace "my-namespace" deleted
W0814 13:14:34.678] E0814 13:14:34.613320   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
... skipping 22 lines (the same reflector.go:125 error repeating roughly every 100ms through 13:14:39.806) ...
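
The reflector error that recurs throughout this log comes from metadata informers (metadatainformer/informer.go:89) in what appears to be the controller-manager process (pid 53127): each relist gets a 404 because the test apiserver does not serve the requested group/version. Two generic probes for this class of failure, not commands the test itself runs:

  kubectl api-resources --verbs=list -o name   # resources the server actually serves
  kubectl get --raw /apis | head               # raw group/version discovery document
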
I0814 13:14:39.907] namespace/my-namespace condition met
I0814 13:14:39.908] Successful
I0814 13:14:39.908] message:Error from server (NotFound): namespaces "my-namespace" not found
I0814 13:14:39.908] has: not found
I0814 13:14:39.908] namespace/my-namespace created
I0814 13:14:39.908] core.sh:1317: Successful get namespaces/my-namespace {{.metadata.name}}: my-namespace
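
"condition met" followed by NotFound is the create / delete / wait-for-deletion flow. A plausible reconstruction (the wait timeout is an assumption):

  kubectl create namespace my-namespace
  kubectl delete namespace my-namespace
  kubectl wait --for=delete ns/my-namespace --timeout=60s   # namespace/my-namespace condition met
  kubectl get ns my-namespace                               # Error from server (NotFound)
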
I0814 13:14:40.056] Successful
I0814 13:14:40.057] message:warning: deleting cluster-scoped resources, not scoped to the provided namespace
I0814 13:14:40.057] namespace "kube-node-lease" deleted
... skipping 29 lines ...
I0814 13:14:40.060] namespace "namespace-1565788419-8650" deleted
I0814 13:14:40.060] namespace "namespace-1565788420-8874" deleted
I0814 13:14:40.060] namespace "namespace-1565788422-14741" deleted
I0814 13:14:40.060] namespace "namespace-1565788423-29812" deleted
I0814 13:14:40.060] namespace "namespace-1565788463-4208" deleted
I0814 13:14:40.060] namespace "namespace-1565788464-9851" deleted
I0814 13:14:40.060] Error from server (Forbidden): namespaces "default" is forbidden: this namespace may not be deleted
I0814 13:14:40.060] Error from server (Forbidden): namespaces "kube-public" is forbidden: this namespace may not be deleted
I0814 13:14:40.060] Error from server (Forbidden): namespaces "kube-system" is forbidden: this namespace may not be deleted
I0814 13:14:40.061] has:warning: deleting cluster-scoped resources
I0814 13:14:40.061] Successful
I0814 13:14:40.061] message:warning: deleting cluster-scoped resources, not scoped to the provided namespace
I0814 13:14:40.061] namespace "kube-node-lease" deleted
I0814 13:14:40.061] namespace "my-namespace" deleted
I0814 13:14:40.061] namespace "namespace-1565788331-8989" deleted
... skipping 27 lines ...
I0814 13:14:40.064] namespace "namespace-1565788419-8650" deleted
I0814 13:14:40.064] namespace "namespace-1565788420-8874" deleted
I0814 13:14:40.064] namespace "namespace-1565788422-14741" deleted
I0814 13:14:40.064] namespace "namespace-1565788423-29812" deleted
I0814 13:14:40.064] namespace "namespace-1565788463-4208" deleted
I0814 13:14:40.064] namespace "namespace-1565788464-9851" deleted
I0814 13:14:40.064] Error from server (Forbidden): namespaces "default" is forbidden: this namespace may not be deleted
I0814 13:14:40.064] Error from server (Forbidden): namespaces "kube-public" is forbidden: this namespace may not be deleted
I0814 13:14:40.064] Error from server (Forbidden): namespaces "kube-system" is forbidden: this namespace may not be deleted
I0814 13:14:40.065] has:namespace "my-namespace" deleted
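
Deleting every namespace trips two guards visible above: the "deleting cluster-scoped resources" warning, and Forbidden errors for default, kube-public, and kube-system, which the NamespaceLifecycle admission plugin refuses to delete. The invocation was probably something like:

  kubectl delete namespaces --all
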
I0814 13:14:40.161] core.sh:1329: Successful get namespaces {{range.items}}{{ if eq .metadata.name \"other\" }}found{{end}}{{end}}:: :
I0814 13:14:40.232] namespace/other created
I0814 13:14:40.321] core.sh:1333: Successful get namespaces/other {{.metadata.name}}: other
I0814 13:14:40.403] core.sh:1337: Successful get pods --namespace=other {{range.items}}{{.metadata.name}}:{{end}}: 
I0814 13:14:40.557] pod/valid-pod created
I0814 13:14:40.653] core.sh:1341: Successful get pods --namespace=other {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
I0814 13:14:40.740] core.sh:1343: Successful get pods -n other {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
I0814 13:14:40.815] Successful
I0814 13:14:40.815] message:error: a resource cannot be retrieved by name across all namespaces
I0814 13:14:40.815] has:a resource cannot be retrieved by name across all namespaces
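
Fetching a named object with --all-namespaces is rejected because a name is only unique within one namespace; the request has to be scoped instead:

  kubectl get pods valid-pod --all-namespaces    # error: a resource cannot be retrieved by name across all namespaces
  kubectl get pods valid-pod --namespace=other   # resolves the name within a single namespace
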
I0814 13:14:40.900] core.sh:1350: Successful get pods --namespace=other {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
I0814 13:14:40.972] pod "valid-pod" force deleted
I0814 13:14:41.059] core.sh:1354: Successful get pods --namespace=other {{range.items}}{{.metadata.name}}:{{end}}: 
I0814 13:14:41.129] namespace "other" deleted
W0814 13:14:41.230] E0814 13:14:39.908943   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:14:41.230] E0814 13:14:40.621681   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:14:41.231] E0814 13:14:40.716299   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:14:41.231] E0814 13:14:40.807873   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:14:41.231] E0814 13:14:40.910393   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:14:41.231] warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
W0814 13:14:41.623] E0814 13:14:41.623107   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:14:41.718] E0814 13:14:41.717975   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:14:41.809] E0814 13:14:41.808659   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:14:41.912] E0814 13:14:41.911690   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:14:42.608] I0814 13:14:42.608114   53127 controller_utils.go:1029] Waiting for caches to sync for garbage collector controller
W0814 13:14:42.624] E0814 13:14:42.624088   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:14:42.674] I0814 13:14:42.673621   53127 controller_utils.go:1029] Waiting for caches to sync for resource quota controller
W0814 13:14:42.709] I0814 13:14:42.708492   53127 controller_utils.go:1036] Caches are synced for garbage collector controller
W0814 13:14:42.719] E0814 13:14:42.719428   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:14:42.774] I0814 13:14:42.773940   53127 controller_utils.go:1036] Caches are synced for resource quota controller
W0814 13:14:42.810] E0814 13:14:42.810062   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:14:42.913] E0814 13:14:42.913144   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:14:43.626] E0814 13:14:43.625687   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:14:43.721] E0814 13:14:43.720885   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:14:43.754] I0814 13:14:43.753571   53127 horizontal.go:341] Horizontal Pod Autoscaler busybox0 has been deleted in namespace-1565788464-9851
W0814 13:14:43.757] I0814 13:14:43.757262   53127 horizontal.go:341] Horizontal Pod Autoscaler busybox1 has been deleted in namespace-1565788464-9851
W0814 13:14:43.811] E0814 13:14:43.811157   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
... skipping 9 lines (the same reflector.go:125 error repeating through 13:14:45.917) ...
I0814 13:14:46.227] +++ exit code: 0
I0814 13:14:46.258] Recording: run_secrets_test
I0814 13:14:46.259] Running command: run_secrets_test
I0814 13:14:46.277] 
I0814 13:14:46.278] +++ Running case: test-cmd.run_secrets_test 
I0814 13:14:46.280] +++ working dir: /go/src/k8s.io/kubernetes
... skipping 58 lines ...
I0814 13:14:48.050] secret "test-secret" deleted
I0814 13:14:48.125] secret/test-secret created
I0814 13:14:48.206] core.sh:773: Successful get secret/test-secret --namespace=test-secrets {{.metadata.name}}: test-secret
I0814 13:14:48.284] core.sh:774: Successful get secret/test-secret --namespace=test-secrets {{.type}}: kubernetes.io/tls
I0814 13:14:48.353] secret "test-secret" deleted
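
The kubernetes.io/tls type checked by core.sh:774 is produced by kubectl create secret tls, which takes a certificate/key pair; the paths below are placeholders, not taken from the log:

  kubectl create secret tls test-secret --namespace=test-secrets \
    --cert=/path/to/tls.crt --key=/path/to/tls.key
  kubectl get secret/test-secret --namespace=test-secrets -o go-template='{{.type}}'   # kubernetes.io/tls
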
W0814 13:14:48.454] I0814 13:14:46.494250   70117 loader.go:375] Config loaded from file:  /tmp/tmp.IiHxqzUzLg/.kube/config
W0814 13:14:48.454] E0814 13:14:46.629760   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
... skipping 7 lines (the same reflector.go:125 error repeating through 13:14:47.920) ...
I0814 13:14:48.556] secret/secret-string-data created
I0814 13:14:48.576] core.sh:796: Successful get secret/secret-string-data --namespace=test-secrets  {{.data}}: map[k1:djE= k2:djI=]
I0814 13:14:48.660] core.sh:797: Successful get secret/secret-string-data --namespace=test-secrets  {{.data}}: map[k1:djE= k2:djI=]
I0814 13:14:48.742] core.sh:798: Successful get secret/secret-string-data --namespace=test-secrets  {{.stringData}}: <no value>
I0814 13:14:48.814] secret "secret-string-data" deleted
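
stringData is a write-only convenience: the apiserver base64-encodes it into .data (djE= is "v1", djI= is "v2") and never persists .stringData, hence the <no value> in core.sh:798. A sketch of the manifest shape behind this check:

kubectl apply -f - <<'EOF'
apiVersion: v1
kind: Secret
metadata:
  name: secret-string-data
  namespace: test-secrets
stringData:
  k1: v1
  k2: v2
EOF
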
I0814 13:14:48.904] core.sh:807: Successful get secrets --namespace=test-secrets {{range.items}}{{.metadata.name}}:{{end}}: 
I0814 13:14:49.051] secret "test-secret" deleted
I0814 13:14:49.128] namespace "test-secrets" deleted
W0814 13:14:49.229] E0814 13:14:48.632759   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
... skipping 23 lines (the same reflector.go:125 error repeating through 13:14:53.928) ...
I0814 13:14:54.238] +++ exit code: 0
I0814 13:14:54.272] Recording: run_configmap_tests
I0814 13:14:54.272] Running command: run_configmap_tests
I0814 13:14:54.294] 
I0814 13:14:54.296] +++ Running case: test-cmd.run_configmap_tests 
I0814 13:14:54.299] +++ working dir: /go/src/k8s.io/kubernetes
I0814 13:14:54.301] +++ command: run_configmap_tests
I0814 13:14:54.314] +++ [0814 13:14:54] Creating namespace namespace-1565788494-7167
I0814 13:14:54.386] namespace/namespace-1565788494-7167 created
I0814 13:14:54.453] Context "test" modified.
I0814 13:14:54.459] +++ [0814 13:14:54] Testing configmaps
W0814 13:14:54.642] E0814 13:14:54.641886   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:14:54.738] E0814 13:14:54.737882   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:14:54.825] E0814 13:14:54.824484   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0814 13:14:54.926] configmap/test-configmap created
I0814 13:14:54.926] core.sh:28: Successful get configmap/test-configmap {{.metadata.name}}: test-configmap
I0814 13:14:54.926] configmap "test-configmap" deleted
I0814 13:14:54.927] core.sh:33: Successful get namespaces {{range.items}}{{ if eq .metadata.name \"test-configmaps\" }}found{{end}}{{end}}:: :
I0814 13:14:54.986] namespace/test-configmaps created
I0814 13:14:55.074] core.sh:37: Successful get namespaces/test-configmaps {{.metadata.name}}: test-configmaps
... skipping 3 lines ...
I0814 13:14:55.378] configmap/test-binary-configmap created
I0814 13:14:55.463] core.sh:48: Successful get configmap/test-configmap --namespace=test-configmaps {{.metadata.name}}: test-configmap
I0814 13:14:55.548] core.sh:49: Successful get configmap/test-binary-configmap --namespace=test-configmaps {{.metadata.name}}: test-binary-configmap
I0814 13:14:55.770] configmap "test-configmap" deleted
I0814 13:14:55.845] configmap "test-binary-configmap" deleted
I0814 13:14:55.919] namespace "test-configmaps" deleted
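
The two configmap variants map onto kubectl create configmap: literal keys populate .data, while a file with non-UTF-8 content lands in .binaryData (presumably how test-binary-configmap gets its name). The key and file path below are placeholders:

  kubectl create configmap test-configmap --namespace=test-configmaps --from-literal=key1=value1
  kubectl create configmap test-binary-configmap --namespace=test-configmaps --from-file=/path/to/binary.bin
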
W0814 13:14:56.020] E0814 13:14:54.929595   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
... skipping 24 lines (the same reflector.go:125 error repeating through 13:15:00.937) ...
I0814 13:15:01.038] +++ exit code: 0
I0814 13:15:01.056] Recording: run_client_config_tests
I0814 13:15:01.056] Running command: run_client_config_tests
I0814 13:15:01.075] 
I0814 13:15:01.077] +++ Running case: test-cmd.run_client_config_tests 
I0814 13:15:01.079] +++ working dir: /go/src/k8s.io/kubernetes
I0814 13:15:01.081] +++ command: run_client_config_tests
I0814 13:15:01.093] +++ [0814 13:15:01] Creating namespace namespace-1565788501-20126
I0814 13:15:01.162] namespace/namespace-1565788501-20126 created
I0814 13:15:01.227] Context "test" modified.
I0814 13:15:01.235] +++ [0814 13:15:01] Testing client config
I0814 13:15:01.302] Successful
I0814 13:15:01.302] message:error: stat missing: no such file or directory
I0814 13:15:01.303] has:missing: no such file or directory
I0814 13:15:01.366] Successful
I0814 13:15:01.367] message:error: stat missing: no such file or directory
I0814 13:15:01.367] has:missing: no such file or directory
I0814 13:15:01.432] Successful
I0814 13:15:01.432] message:error: stat missing: no such file or directory
I0814 13:15:01.432] has:missing: no such file or directory
I0814 13:15:01.500] Successful
I0814 13:15:01.501] message:Error in configuration: context was not found for specified context: missing-context
I0814 13:15:01.501] has:context was not found for specified context: missing-context
I0814 13:15:01.574] Successful
I0814 13:15:01.574] message:error: no server found for cluster "missing-cluster"
I0814 13:15:01.574] has:no server found for cluster "missing-cluster"
I0814 13:15:01.644] Successful
I0814 13:15:01.645] message:error: auth info "missing-user" does not exist
I0814 13:15:01.645] has:auth info "missing-user" does not exist
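
Each failure above corresponds to one client-config override flag pointing at something that does not exist; the error strings are matched verbatim. Reproduced with an arbitrary subcommand (get pods is an assumption):

  kubectl get pods --kubeconfig=missing         # error: stat missing: no such file or directory
  kubectl get pods --context=missing-context    # context was not found for specified context
  kubectl get pods --cluster=missing-cluster    # error: no server found for cluster "missing-cluster"
  kubectl get pods --user=missing-user          # error: auth info "missing-user" does not exist
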
W0814 13:15:01.745] E0814 13:15:01.652118   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:01.749] E0814 13:15:01.748715   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:01.835] E0814 13:15:01.835047   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0814 13:15:01.936] Successful
I0814 13:15:01.936] message:error: error loading config file "/tmp/newconfig.yaml": no kind "Config" is registered for version "v-1" in scheme "k8s.io/client-go/tools/clientcmd/api/latest/latest.go:50"
I0814 13:15:01.936] has:error loading config file
I0814 13:15:01.937] Successful
I0814 13:15:01.937] message:error: stat missing-config: no such file or directory
I0814 13:15:01.937] has:no such file or directory
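
The "no kind \"Config\" is registered for version \"v-1\"" failure means the kubeconfig file itself declared an unrecognized apiVersion. A minimal file reproducing it (contents inferred from the error message):

cat > /tmp/newconfig.yaml <<'EOF'
apiVersion: v-1
kind: Config
EOF
kubectl get pods --kubeconfig=/tmp/newconfig.yaml   # error: error loading config file "/tmp/newconfig.yaml"
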
I0814 13:15:01.937] +++ exit code: 0
I0814 13:15:01.937] Recording: run_service_accounts_tests
I0814 13:15:01.937] Running command: run_service_accounts_tests
I0814 13:15:01.937] 
I0814 13:15:01.937] +++ Running case: test-cmd.run_service_accounts_tests 
... skipping 7 lines ...
I0814 13:15:02.252] namespace/test-service-accounts created
I0814 13:15:02.347] core.sh:832: Successful get namespaces/test-service-accounts {{.metadata.name}}: test-service-accounts
I0814 13:15:02.417] serviceaccount/test-service-account created
I0814 13:15:02.510] core.sh:838: Successful get serviceaccount/test-service-account --namespace=test-service-accounts {{.metadata.name}}: test-service-account
I0814 13:15:02.583] serviceaccount "test-service-account" deleted
I0814 13:15:02.657] namespace "test-service-accounts" deleted
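
The service-account flow above in command form (names and namespace from the log):

  kubectl create namespace test-service-accounts
  kubectl create serviceaccount test-service-account --namespace=test-service-accounts
  kubectl get serviceaccount/test-service-account --namespace=test-service-accounts -o go-template='{{.metadata.name}}'
  kubectl delete serviceaccount test-service-account --namespace=test-service-accounts
  kubectl delete namespace test-service-accounts
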
W0814 13:15:02.758] E0814 13:15:01.939582   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
... skipping 23 lines (the same reflector.go:125 error repeating through 13:15:07.843) ...
I0814 13:15:07.944] +++ exit code: 0
I0814 13:15:07.945] Recording: run_job_tests
I0814 13:15:07.945] Running command: run_job_tests
I0814 13:15:07.945] 
I0814 13:15:07.945] +++ Running case: test-cmd.run_job_tests 
I0814 13:15:07.945] +++ working dir: /go/src/k8s.io/kubernetes
... skipping 14 lines ...
I0814 13:15:08.518] Labels:                        run=pi
I0814 13:15:08.518] Annotations:                   <none>
I0814 13:15:08.518] Schedule:                      59 23 31 2 *
I0814 13:15:08.518] Concurrency Policy:            Allow
I0814 13:15:08.518] Suspend:                       False
I0814 13:15:08.518] Successful Job History Limit:  3
I0814 13:15:08.519] Failed Job History Limit:      1
I0814 13:15:08.519] Starting Deadline Seconds:     <unset>
I0814 13:15:08.519] Selector:                      <unset>
I0814 13:15:08.519] Parallelism:                   <unset>
I0814 13:15:08.519] Completions:                   <unset>
I0814 13:15:08.519] Pod Template:
I0814 13:15:08.519]   Labels:  run=pi
... skipping 32 lines ...
I0814 13:15:08.966]                 run=pi
I0814 13:15:08.967] Annotations:    cronjob.kubernetes.io/instantiate: manual
I0814 13:15:08.967] Controlled By:  CronJob/pi
I0814 13:15:08.967] Parallelism:    1
I0814 13:15:08.967] Completions:    1
I0814 13:15:08.967] Start Time:     Wed, 14 Aug 2019 13:15:08 +0000
I0814 13:15:08.967] Pods Statuses:  1 Running / 0 Succeeded / 0 Failed
I0814 13:15:08.967] Pod Template:
I0814 13:15:08.967]   Labels:  controller-uid=4784183c-0bd3-425c-bf58-2b700fbc74e7
I0814 13:15:08.967]            job-name=test-job
I0814 13:15:08.967]            run=pi
I0814 13:15:08.967]   Containers:
I0814 13:15:08.968]    pi:
... skipping 15 lines ...
I0814 13:15:08.969]   Type    Reason            Age   From            Message
I0814 13:15:08.969]   ----    ------            ----  ----            -------
I0814 13:15:08.969]   Normal  SuccessfulCreate  0s    job-controller  Created pod: test-job-8s6s7
I0814 13:15:09.060] job.batch "test-job" deleted
I0814 13:15:09.131] cronjob.batch "pi" deleted
I0814 13:15:09.201] namespace "test-jobs" deleted
W0814 13:15:09.302] E0814 13:15:07.948924   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:09.302] kubectl run --generator=cronjob/v1beta1 is DEPRECATED and will be removed in a future version. Use kubectl run --generator=run-pod/v1 or kubectl create instead.
W0814 13:15:09.303] E0814 13:15:08.661025   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:09.303] I0814 13:15:08.743423   53127 event.go:255] Event(v1.ObjectReference{Kind:"Job", Namespace:"test-jobs", Name:"test-job", UID:"4784183c-0bd3-425c-bf58-2b700fbc74e7", APIVersion:"batch/v1", ResourceVersion:"1349", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test-job-8s6s7
W0814 13:15:09.303] E0814 13:15:08.758130   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:09.303] E0814 13:15:08.845493   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:09.304] E0814 13:15:08.950021   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:09.663] E0814 13:15:09.662603   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
... skipping 19 lines (the same reflector error repeated) ...
I0814 13:15:14.298] +++ exit code: 0
I0814 13:15:14.328] Recording: run_create_job_tests
I0814 13:15:14.328] Running command: run_create_job_tests
I0814 13:15:14.346] 
I0814 13:15:14.347] +++ Running case: test-cmd.run_create_job_tests 
I0814 13:15:14.350] +++ working dir: /go/src/k8s.io/kubernetes
... skipping 28 lines ...
I0814 13:15:15.656] core.sh:1415: Successful get podtemplates {{range.items}}{{.metadata.name}}:{{end}}: 
I0814 13:15:15.799] podtemplate/nginx created
I0814 13:15:15.893] core.sh:1419: Successful get podtemplates {{range.items}}{{.metadata.name}}:{{end}}: nginx:
I0814 13:15:15.961] NAME    CONTAINERS   IMAGES   POD LABELS
I0814 13:15:15.961] nginx   nginx        nginx    name=nginx
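The core.sh assertions above ("Successful get podtemplates {{range.items}}{{.metadata.name}}:{{end}}: nginx:") work by rendering the API list response through kubectl's -o go-template output, which is Go's text/template. A self-contained approximation, with a mock list standing in for the real API response:

package main

import (
	"os"
	"text/template"
)

func main() {
	// Mock stand-in for the list object the apiserver returns.
	list := map[string]interface{}{
		"items": []interface{}{
			map[string]interface{}{
				"metadata": map[string]interface{}{"name": "nginx"},
			},
		},
	}
	tmpl := template.Must(template.New("get").Parse(
		"{{range .items}}{{.metadata.name}}:{{end}}"))
	// Prints "nginx:", the value the core.sh:1419 check asserts on.
	if err := tmpl.Execute(os.Stdout, list); err != nil {
		panic(err)
	}
}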
W0814 13:15:16.062] I0814 13:15:14.559873   53127 event.go:255] Event(v1.ObjectReference{Kind:"Job", Namespace:"namespace-1565788514-16623", Name:"test-job", UID:"ec525405-0422-4e05-825c-5ca20ed3d73d", APIVersion:"batch/v1", ResourceVersion:"1366", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test-job-zgm74
W0814 13:15:16.062] E0814 13:15:14.670752   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:16.063] E0814 13:15:14.765968   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:16.063] I0814 13:15:14.782634   53127 event.go:255] Event(v1.ObjectReference{Kind:"Job", Namespace:"namespace-1565788514-16623", Name:"test-job-pi", UID:"10a4a0a7-816e-434f-a1f1-8eeed74ee644", APIVersion:"batch/v1", ResourceVersion:"1371", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test-job-pi-tgpxb
W0814 13:15:16.064] E0814 13:15:14.853451   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:16.064] E0814 13:15:14.956913   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:16.064] kubectl run --generator=cronjob/v1beta1 is DEPRECATED and will be removed in a future version. Use kubectl run --generator=run-pod/v1 or kubectl create instead.
W0814 13:15:16.065] I0814 13:15:15.104995   53127 event.go:255] Event(v1.ObjectReference{Kind:"Job", Namespace:"namespace-1565788514-16623", Name:"my-pi", UID:"7b224999-4a0d-49c2-b5bb-50f8c72181fb", APIVersion:"batch/v1", ResourceVersion:"1382", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: my-pi-fszpn
W0814 13:15:16.065] E0814 13:15:15.672098   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:16.065] E0814 13:15:15.767537   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:16.065] I0814 13:15:15.797226   49644 controller.go:606] quota admission added evaluator for: podtemplates
W0814 13:15:16.066] E0814 13:15:15.854727   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:16.066] E0814 13:15:15.958291   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0814 13:15:16.166] core.sh:1427: Successful get podtemplates {{range.items}}{{.metadata.name}}:{{end}}: nginx:
I0814 13:15:16.197] podtemplate "nginx" deleted
I0814 13:15:16.287] core.sh:1431: Successful get podtemplate {{range.items}}{{.metadata.name}}:{{end}}: 
I0814 13:15:16.300] +++ exit code: 0
I0814 13:15:16.335] Recording: run_service_tests
I0814 13:15:16.335] Running command: run_service_tests
... skipping 65 lines ...
I0814 13:15:17.139] Port:              <unset>  6379/TCP
I0814 13:15:17.139] TargetPort:        6379/TCP
I0814 13:15:17.139] Endpoints:         <none>
I0814 13:15:17.139] Session Affinity:  None
I0814 13:15:17.139] Events:            <none>
I0814 13:15:17.140]
W0814 13:15:17.240] E0814 13:15:16.673117   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:17.240] E0814 13:15:16.769100   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:17.241] E0814 13:15:16.855911   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:17.241] E0814 13:15:16.959454   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0814 13:15:17.341] Successful describe services:
I0814 13:15:17.341] Name:              kubernetes
I0814 13:15:17.341] Namespace:         default
I0814 13:15:17.342] Labels:            component=apiserver
I0814 13:15:17.342]                    provider=kubernetes
I0814 13:15:17.342] Annotations:       <none>
... skipping 236 lines ...
I0814 13:15:18.135]   selector:
I0814 13:15:18.136]     role: padawan
I0814 13:15:18.136]   sessionAffinity: None
I0814 13:15:18.136]   type: ClusterIP
I0814 13:15:18.136] status:
I0814 13:15:18.136]   loadBalancer: {}
W0814 13:15:18.237] E0814 13:15:17.674900   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:18.238] E0814 13:15:17.770475   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:18.238] E0814 13:15:17.857628   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:18.238] E0814 13:15:17.960782   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:18.238] error: you must specify resources by --filename when --local is set.
W0814 13:15:18.238] Example resource specifications include:
W0814 13:15:18.238]    '-f rsrc.yaml'
W0814 13:15:18.238]    '--filename=rsrc.json'
I0814 13:15:18.339] core.sh:898: Successful get services redis-master {{range.spec.selector}}{{.}}:{{end}}: redis:master:backend:
I0814 13:15:18.430] core.sh:905: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:
I0814 13:15:18.508] service "redis-master" deleted
... skipping 8 lines ...
I0814 13:15:19.511] core.sh:952: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:service-v1-test:
I0814 13:15:19.589] service "redis-master" deleted
I0814 13:15:19.669] service "service-v1-test" deleted
I0814 13:15:19.761] core.sh:960: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
I0814 13:15:19.852] core.sh:964: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
I0814 13:15:20.008] service/redis-master created
W0814 13:15:20.109] E0814 13:15:18.676014   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
... skipping 7 lines (the same reflector error repeated) ...
I0814 13:15:20.212] service/redis-slave created
I0814 13:15:20.242] core.sh:969: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:redis-slave:
I0814 13:15:20.323] Successful
I0814 13:15:20.324] message:NAME           RSRC
I0814 13:15:20.324] kubernetes     144
I0814 13:15:20.324] redis-master   1416
... skipping 84 lines ...
I0814 13:15:25.051] apps.sh:84: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
I0814 13:15:25.139] apps.sh:85: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 2
I0814 13:15:25.229] daemonset.apps/bind rolled back
I0814 13:15:25.322] apps.sh:88: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:2.0:
I0814 13:15:25.408] apps.sh:89: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
I0814 13:15:25.504] Successful
I0814 13:15:25.504] message:error: unable to find specified revision 1000000 in history
I0814 13:15:25.505] has:unable to find specified revision
I0814 13:15:25.585] apps.sh:93: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:2.0:
I0814 13:15:25.668] apps.sh:94: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
I0814 13:15:25.762] daemonset.apps/bind rolled back
I0814 13:15:25.854] apps.sh:97: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:latest:
I0814 13:15:25.944] apps.sh:98: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
... skipping 22 lines ...
I0814 13:15:27.237] Namespace:    namespace-1565788526-12452
I0814 13:15:27.237] Selector:     app=guestbook,tier=frontend
I0814 13:15:27.237] Labels:       app=guestbook
I0814 13:15:27.238]               tier=frontend
I0814 13:15:27.238] Annotations:  <none>
I0814 13:15:27.238] Replicas:     3 current / 3 desired
I0814 13:15:27.238] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0814 13:15:27.238] Pod Template:
I0814 13:15:27.238]   Labels:  app=guestbook
I0814 13:15:27.238]            tier=frontend
I0814 13:15:27.238]   Containers:
I0814 13:15:27.238]    php-redis:
I0814 13:15:27.238]     Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 17 lines ...
I0814 13:15:27.344] Namespace:    namespace-1565788526-12452
I0814 13:15:27.344] Selector:     app=guestbook,tier=frontend
I0814 13:15:27.344] Labels:       app=guestbook
I0814 13:15:27.344]               tier=frontend
I0814 13:15:27.344] Annotations:  <none>
I0814 13:15:27.344] Replicas:     3 current / 3 desired
I0814 13:15:27.344] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0814 13:15:27.344] Pod Template:
I0814 13:15:27.344]   Labels:  app=guestbook
I0814 13:15:27.344]            tier=frontend
I0814 13:15:27.345]   Containers:
I0814 13:15:27.345]    php-redis:
I0814 13:15:27.345]     Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 18 lines ...
I0814 13:15:27.446] Namespace:    namespace-1565788526-12452
I0814 13:15:27.446] Selector:     app=guestbook,tier=frontend
I0814 13:15:27.446] Labels:       app=guestbook
I0814 13:15:27.446]               tier=frontend
I0814 13:15:27.447] Annotations:  <none>
I0814 13:15:27.447] Replicas:     3 current / 3 desired
I0814 13:15:27.447] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0814 13:15:27.447] Pod Template:
I0814 13:15:27.447]   Labels:  app=guestbook
I0814 13:15:27.447]            tier=frontend
I0814 13:15:27.447]   Containers:
I0814 13:15:27.447]    php-redis:
I0814 13:15:27.447]     Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 12 lines ...
I0814 13:15:27.547] Namespace:    namespace-1565788526-12452
I0814 13:15:27.547] Selector:     app=guestbook,tier=frontend
I0814 13:15:27.548] Labels:       app=guestbook
I0814 13:15:27.548]               tier=frontend
I0814 13:15:27.548] Annotations:  <none>
I0814 13:15:27.548] Replicas:     3 current / 3 desired
I0814 13:15:27.548] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0814 13:15:27.548] Pod Template:
I0814 13:15:27.548]   Labels:  app=guestbook
I0814 13:15:27.548]            tier=frontend
I0814 13:15:27.548]   Containers:
I0814 13:15:27.548]    php-redis:
I0814 13:15:27.548]     Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 10 lines ...
I0814 13:15:27.549]   Type    Reason            Age   From                    Message
I0814 13:15:27.549]   ----    ------            ----  ----                    -------
I0814 13:15:27.549]   Normal  SuccessfulCreate  0s    replication-controller  Created pod: frontend-dzh2z
I0814 13:15:27.550]   Normal  SuccessfulCreate  0s    replication-controller  Created pod: frontend-qtzmx
I0814 13:15:27.550]   Normal  SuccessfulCreate  0s    replication-controller  Created pod: frontend-crvqs
I0814 13:15:27.550]
W0814 13:15:27.650] E0814 13:15:20.678412   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:27.651] E0814 13:15:20.774563   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:27.651] E0814 13:15:20.862170   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:27.651] E0814 13:15:20.965667   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:27.651] kubectl run --generator=deployment/apps.v1 is DEPRECATED and will be removed in a future version. Use kubectl run --generator=run-pod/v1 or kubectl create instead.
W0814 13:15:27.652] I0814 13:15:21.258223   53127 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"default", Name:"testmetadata", UID:"e108afd4-f3e7-4c8d-b94c-ca586005ff17", APIVersion:"apps/v1", ResourceVersion:"1431", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set testmetadata-6cdd84c77d to 2
W0814 13:15:27.652] I0814 13:15:21.265934   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"default", Name:"testmetadata-6cdd84c77d", UID:"cdaa7c99-9968-42d7-aa41-5b7de71b1c35", APIVersion:"apps/v1", ResourceVersion:"1432", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: testmetadata-6cdd84c77d-8qztv
W0814 13:15:27.652] I0814 13:15:21.270527   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"default", Name:"testmetadata-6cdd84c77d", UID:"cdaa7c99-9968-42d7-aa41-5b7de71b1c35", APIVersion:"apps/v1", ResourceVersion:"1432", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: testmetadata-6cdd84c77d-cxdpl
W0814 13:15:27.653] E0814 13:15:21.679756   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:27.653] E0814 13:15:21.776102   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:27.653] E0814 13:15:21.863641   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:27.653] E0814 13:15:21.967004   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:27.653] I0814 13:15:22.317597   49644 controller.go:606] quota admission added evaluator for: daemonsets.apps
W0814 13:15:27.653] I0814 13:15:22.328602   49644 controller.go:606] quota admission added evaluator for: controllerrevisions.apps
W0814 13:15:27.654] E0814 13:15:22.681181   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
... skipping 11 lines (the same reflector error repeated) ...
W0814 13:15:27.660] E0814 13:15:25.248014   53127 daemon_controller.go:302] namespace-1565788523-9339/bind failed with : error storing status for daemon set &v1.DaemonSet{TypeMeta:v1.TypeMeta{Kind:"", APIVersion:""}, ObjectMeta:v1.ObjectMeta{Name:"bind", GenerateName:"", Namespace:"namespace-1565788523-9339", SelfLink:"", UID:"1f4bc959-bb1d-4f5a-a9d5-b8c64358de81", ResourceVersion:"1497", Generation:3, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63701385323, loc:(*time.Location)(0x7213220)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"service":"bind"}, Annotations:map[string]string{"deprecated.daemonset.template.generation":"3", "kubectl.kubernetes.io/last-applied-configuration":"{\"apiVersion\":\"apps/v1\",\"kind\":\"DaemonSet\",\"metadata\":{\"annotations\":{\"kubernetes.io/change-cause\":\"kubectl apply --filename=hack/testdata/rollingupdate-daemonset-rv2.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true\"},\"labels\":{\"service\":\"bind\"},\"name\":\"bind\",\"namespace\":\"namespace-1565788523-9339\"},\"spec\":{\"selector\":{\"matchLabels\":{\"service\":\"bind\"}},\"template\":{\"metadata\":{\"labels\":{\"service\":\"bind\"}},\"spec\":{\"affinity\":{\"podAntiAffinity\":{\"requiredDuringSchedulingIgnoredDuringExecution\":[{\"labelSelector\":{\"matchExpressions\":[{\"key\":\"service\",\"operator\":\"In\",\"values\":[\"bind\"]}]},\"namespaces\":[],\"topologyKey\":\"kubernetes.io/hostname\"}]}},\"containers\":[{\"image\":\"k8s.gcr.io/pause:latest\",\"name\":\"kubernetes-pause\"},{\"image\":\"k8s.gcr.io/nginx:test-cmd\",\"name\":\"app\"}]}},\"updateStrategy\":{\"rollingUpdate\":{\"maxUnavailable\":\"10%\"},\"type\":\"RollingUpdate\"}}}\n", "kubernetes.io/change-cause":"kubectl apply --filename=hack/testdata/rollingupdate-daemonset-rv2.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true"}, OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry{v1.ManagedFieldsEntry{Manager:"kubectl", Operation:"Update", APIVersion:"apps/v1", Time:(*v1.Time)(0xc001e2d940), Fields:(*v1.Fields)(0xc001e2d960)}, v1.ManagedFieldsEntry{Manager:"kube-controller-manager", Operation:"Update", APIVersion:"apps/v1", Time:(*v1.Time)(0xc001e2d980), Fields:(*v1.Fields)(0xc001e2d9a0)}, v1.ManagedFieldsEntry{Manager:"kubectl", Operation:"Update", APIVersion:"apps/v1", Time:(*v1.Time)(0xc001e2d9c0), Fields:(*v1.Fields)(0xc001e2d9e0)}, v1.ManagedFieldsEntry{Manager:"kubectl", Operation:"Update", APIVersion:"apps/v1", Time:(*v1.Time)(0xc001e2da00), Fields:(*v1.Fields)(0xc001e2da20)}}}, Spec:v1.DaemonSetSpec{Selector:(*v1.LabelSelector)(0xc001e2da60), Template:v1.PodTemplateSpec{ObjectMeta:v1.ObjectMeta{Name:"", GenerateName:"", Namespace:"", SelfLink:"", UID:"", ResourceVersion:"", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:0, loc:(*time.Location)(nil)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"service":"bind"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v1.PodSpec{Volumes:[]v1.Volume(nil), InitContainers:[]v1.Container(nil), Containers:[]v1.Container{v1.Container{Name:"kubernetes-pause", Image:"k8s.gcr.io/pause:2.0", Command:[]string(nil), Args:[]string(nil), WorkingDir:"", Ports:[]v1.ContainerPort(nil), 
EnvFrom:[]v1.EnvFromSource(nil), Env:[]v1.EnvVar(nil), Resources:v1.ResourceRequirements{Limits:v1.ResourceList(nil), Requests:v1.ResourceList(nil)}, VolumeMounts:[]v1.VolumeMount(nil), VolumeDevices:[]v1.VolumeDevice(nil), LivenessProbe:(*v1.Probe)(nil), ReadinessProbe:(*v1.Probe)(nil), Lifecycle:(*v1.Lifecycle)(nil), TerminationMessagePath:"/dev/termination-log", TerminationMessagePolicy:"File", ImagePullPolicy:"IfNotPresent", SecurityContext:(*v1.SecurityContext)(nil), Stdin:false, StdinOnce:false, TTY:false}}, EphemeralContainers:[]v1.EphemeralContainer(nil), RestartPolicy:"Always", TerminationGracePeriodSeconds:(*int64)(0xc00217fe48), ActiveDeadlineSeconds:(*int64)(nil), DNSPolicy:"ClusterFirst", NodeSelector:map[string]string(nil), ServiceAccountName:"", DeprecatedServiceAccount:"", AutomountServiceAccountToken:(*bool)(nil), NodeName:"", HostNetwork:false, HostPID:false, HostIPC:false, ShareProcessNamespace:(*bool)(nil), SecurityContext:(*v1.PodSecurityContext)(0xc001e5d080), ImagePullSecrets:[]v1.LocalObjectReference(nil), Hostname:"", Subdomain:"", Affinity:(*v1.Affinity)(0xc001e2daa0), SchedulerName:"default-scheduler", Tolerations:[]v1.Toleration(nil), HostAliases:[]v1.HostAlias(nil), PriorityClassName:"", Priority:(*int32)(nil), DNSConfig:(*v1.PodDNSConfig)(nil), ReadinessGates:[]v1.PodReadinessGate(nil), RuntimeClassName:(*string)(nil), EnableServiceLinks:(*bool)(nil), PreemptionPolicy:(*v1.PreemptionPolicy)(nil), Overhead:v1.ResourceList(nil), TopologySpreadConstraints:[]v1.TopologySpreadConstraint(nil)}}, UpdateStrategy:v1.DaemonSetUpdateStrategy{Type:"RollingUpdate", RollingUpdate:(*v1.RollingUpdateDaemonSet)(0xc0007a83c0)}, MinReadySeconds:0, RevisionHistoryLimit:(*int32)(0xc00217fe9c)}, Status:v1.DaemonSetStatus{CurrentNumberScheduled:0, NumberMisscheduled:0, DesiredNumberScheduled:0, NumberReady:0, ObservedGeneration:2, UpdatedNumberScheduled:0, NumberAvailable:0, NumberUnavailable:0, CollisionCount:(*int32)(nil), Conditions:[]v1.DaemonSetCondition(nil)}}: Operation cannot be fulfilled on daemonsets.apps "bind": the object has been modified; please apply your changes to the latest version and try again
W0814 13:15:27.660] E0814 13:15:25.685693   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:27.660] E0814 13:15:25.780656   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:27.664] E0814 13:15:25.781380   53127 daemon_controller.go:302] namespace-1565788523-9339/bind failed with : error storing status for daemon set &v1.DaemonSet{TypeMeta:v1.TypeMeta{Kind:"", APIVersion:""}, ObjectMeta:v1.ObjectMeta{Name:"bind", GenerateName:"", Namespace:"namespace-1565788523-9339", SelfLink:"", UID:"1f4bc959-bb1d-4f5a-a9d5-b8c64358de81", ResourceVersion:"1500", Generation:4, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63701385323, loc:(*time.Location)(0x7213220)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"service":"bind"}, Annotations:map[string]string{"deprecated.daemonset.template.generation":"4", "kubectl.kubernetes.io/last-applied-configuration":"{\"apiVersion\":\"apps/v1\",\"kind\":\"DaemonSet\",\"metadata\":{\"annotations\":{\"kubernetes.io/change-cause\":\"kubectl apply --filename=hack/testdata/rollingupdate-daemonset-rv2.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true\"},\"labels\":{\"service\":\"bind\"},\"name\":\"bind\",\"namespace\":\"namespace-1565788523-9339\"},\"spec\":{\"selector\":{\"matchLabels\":{\"service\":\"bind\"}},\"template\":{\"metadata\":{\"labels\":{\"service\":\"bind\"}},\"spec\":{\"affinity\":{\"podAntiAffinity\":{\"requiredDuringSchedulingIgnoredDuringExecution\":[{\"labelSelector\":{\"matchExpressions\":[{\"key\":\"service\",\"operator\":\"In\",\"values\":[\"bind\"]}]},\"namespaces\":[],\"topologyKey\":\"kubernetes.io/hostname\"}]}},\"containers\":[{\"image\":\"k8s.gcr.io/pause:latest\",\"name\":\"kubernetes-pause\"},{\"image\":\"k8s.gcr.io/nginx:test-cmd\",\"name\":\"app\"}]}},\"updateStrategy\":{\"rollingUpdate\":{\"maxUnavailable\":\"10%\"},\"type\":\"RollingUpdate\"}}}\n", "kubernetes.io/change-cause":"kubectl apply --filename=hack/testdata/rollingupdate-daemonset-rv2.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true"}, OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry{v1.ManagedFieldsEntry{Manager:"kubectl", Operation:"Update", APIVersion:"apps/v1", Time:(*v1.Time)(0xc001e61a00), Fields:(*v1.Fields)(0xc001e61a40)}, v1.ManagedFieldsEntry{Manager:"kubectl", Operation:"Update", APIVersion:"apps/v1", Time:(*v1.Time)(0xc001e61a80), Fields:(*v1.Fields)(0xc001e61ac0)}, v1.ManagedFieldsEntry{Manager:"kube-controller-manager", Operation:"Update", APIVersion:"apps/v1", Time:(*v1.Time)(0xc001e61b00), Fields:(*v1.Fields)(0xc001e61b40)}, v1.ManagedFieldsEntry{Manager:"kubectl", Operation:"Update", APIVersion:"apps/v1", Time:(*v1.Time)(0xc001e61b80), Fields:(*v1.Fields)(0xc001e61bc0)}}}, Spec:v1.DaemonSetSpec{Selector:(*v1.LabelSelector)(0xc001e61c00), Template:v1.PodTemplateSpec{ObjectMeta:v1.ObjectMeta{Name:"", GenerateName:"", Namespace:"", SelfLink:"", UID:"", ResourceVersion:"", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:0, loc:(*time.Location)(nil)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"service":"bind"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v1.PodSpec{Volumes:[]v1.Volume(nil), InitContainers:[]v1.Container(nil), Containers:[]v1.Container{v1.Container{Name:"kubernetes-pause", Image:"k8s.gcr.io/pause:latest", Command:[]string(nil), Args:[]string(nil), WorkingDir:"", Ports:[]v1.ContainerPort(nil), 
EnvFrom:[]v1.EnvFromSource(nil), Env:[]v1.EnvVar(nil), Resources:v1.ResourceRequirements{Limits:v1.ResourceList(nil), Requests:v1.ResourceList(nil)}, VolumeMounts:[]v1.VolumeMount(nil), VolumeDevices:[]v1.VolumeDevice(nil), LivenessProbe:(*v1.Probe)(nil), ReadinessProbe:(*v1.Probe)(nil), Lifecycle:(*v1.Lifecycle)(nil), TerminationMessagePath:"/dev/termination-log", TerminationMessagePolicy:"File", ImagePullPolicy:"IfNotPresent", SecurityContext:(*v1.SecurityContext)(nil), Stdin:false, StdinOnce:false, TTY:false}, v1.Container{Name:"app", Image:"k8s.gcr.io/nginx:test-cmd", Command:[]string(nil), Args:[]string(nil), WorkingDir:"", Ports:[]v1.ContainerPort(nil), EnvFrom:[]v1.EnvFromSource(nil), Env:[]v1.EnvVar(nil), Resources:v1.ResourceRequirements{Limits:v1.ResourceList(nil), Requests:v1.ResourceList(nil)}, VolumeMounts:[]v1.VolumeMount(nil), VolumeDevices:[]v1.VolumeDevice(nil), LivenessProbe:(*v1.Probe)(nil), ReadinessProbe:(*v1.Probe)(nil), Lifecycle:(*v1.Lifecycle)(nil), TerminationMessagePath:"/dev/termination-log", TerminationMessagePolicy:"File", ImagePullPolicy:"IfNotPresent", SecurityContext:(*v1.SecurityContext)(nil), Stdin:false, StdinOnce:false, TTY:false}}, EphemeralContainers:[]v1.EphemeralContainer(nil), RestartPolicy:"Always", TerminationGracePeriodSeconds:(*int64)(0xc001f2a218), ActiveDeadlineSeconds:(*int64)(nil), DNSPolicy:"ClusterFirst", NodeSelector:map[string]string(nil), ServiceAccountName:"", DeprecatedServiceAccount:"", AutomountServiceAccountToken:(*bool)(nil), NodeName:"", HostNetwork:false, HostPID:false, HostIPC:false, ShareProcessNamespace:(*bool)(nil), SecurityContext:(*v1.PodSecurityContext)(0xc001e4f860), ImagePullSecrets:[]v1.LocalObjectReference(nil), Hostname:"", Subdomain:"", Affinity:(*v1.Affinity)(0xc001e61c40), SchedulerName:"default-scheduler", Tolerations:[]v1.Toleration(nil), HostAliases:[]v1.HostAlias(nil), PriorityClassName:"", Priority:(*int32)(nil), DNSConfig:(*v1.PodDNSConfig)(nil), ReadinessGates:[]v1.PodReadinessGate(nil), RuntimeClassName:(*string)(nil), EnableServiceLinks:(*bool)(nil), PreemptionPolicy:(*v1.PreemptionPolicy)(nil), Overhead:v1.ResourceList(nil), TopologySpreadConstraints:[]v1.TopologySpreadConstraint(nil)}}, UpdateStrategy:v1.DaemonSetUpdateStrategy{Type:"RollingUpdate", RollingUpdate:(*v1.RollingUpdateDaemonSet)(0xc001ede378)}, MinReadySeconds:0, RevisionHistoryLimit:(*int32)(0xc001f2a2ec)}, Status:v1.DaemonSetStatus{CurrentNumberScheduled:0, NumberMisscheduled:0, DesiredNumberScheduled:0, NumberReady:0, ObservedGeneration:3, UpdatedNumberScheduled:0, NumberAvailable:0, NumberUnavailable:0, CollisionCount:(*int32)(nil), Conditions:[]v1.DaemonSetCondition(nil)}}: Operation cannot be fulfilled on daemonsets.apps "bind": the object has been modified; please apply your changes to the latest version and try again
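The two daemon_controller.go:302 failures above are optimistic-concurrency conflicts: the controller tried to write DaemonSet status with a stale ResourceVersion while kubectl apply was bumping the object's generation. The conventional client-go remedy is to re-read the latest object and retry the write, roughly as in this sketch (modern client-go signatures assumed; namespace and name are taken from the log, the rest is illustrative, not the controller's actual code):

package main

import (
	"context"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
	"k8s.io/client-go/util/retry"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	cs := kubernetes.NewForConfigOrDie(cfg)
	ns, name := "namespace-1565788523-9339", "bind"
	err = retry.RetryOnConflict(retry.DefaultRetry, func() error {
		// Re-read so the write carries the latest ResourceVersion.
		ds, err := cs.AppsV1().DaemonSets(ns).Get(context.TODO(), name, metav1.GetOptions{})
		if err != nil {
			return err
		}
		ds.Status.ObservedGeneration = ds.Generation
		_, err = cs.AppsV1().DaemonSets(ns).UpdateStatus(context.TODO(), ds, metav1.UpdateOptions{})
		return err // a Conflict here makes RetryOnConflict try again
	})
	if err != nil {
		panic(err)
	}
}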
W0814 13:15:27.665] E0814 13:15:25.869612   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:27.665] E0814 13:15:25.972072   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:27.665] I0814 13:15:26.601525   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1565788526-12452", Name:"frontend", UID:"248d79c6-d2de-4d58-a6ca-fc4ab2acc958", APIVersion:"v1", ResourceVersion:"1509", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-d44qk
W0814 13:15:27.665] I0814 13:15:26.604676   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1565788526-12452", Name:"frontend", UID:"248d79c6-d2de-4d58-a6ca-fc4ab2acc958", APIVersion:"v1", ResourceVersion:"1509", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-kbk9h
W0814 13:15:27.666] I0814 13:15:26.606560   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1565788526-12452", Name:"frontend", UID:"248d79c6-d2de-4d58-a6ca-fc4ab2acc958", APIVersion:"v1", ResourceVersion:"1509", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-ppcb7
W0814 13:15:27.666] E0814 13:15:26.687318   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:27.666] E0814 13:15:26.782114   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:27.666] E0814 13:15:26.871207   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:27.667] E0814 13:15:26.973905   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:27.667] I0814 13:15:27.011142   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1565788526-12452", Name:"frontend", UID:"f10147d0-37bd-4459-aa87-9bdb2c8cc4a6", APIVersion:"v1", ResourceVersion:"1525", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-dzh2z
W0814 13:15:27.667] I0814 13:15:27.014956   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1565788526-12452", Name:"frontend", UID:"f10147d0-37bd-4459-aa87-9bdb2c8cc4a6", APIVersion:"v1", ResourceVersion:"1525", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-qtzmx
W0814 13:15:27.667] I0814 13:15:27.018614   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1565788526-12452", Name:"frontend", UID:"f10147d0-37bd-4459-aa87-9bdb2c8cc4a6", APIVersion:"v1", ResourceVersion:"1525", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-crvqs
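The event.go:255 lines interleaved above are controllers emitting Kubernetes Events through client-go's record package. The wiring looks roughly like this sketch (component name and object identity mirror the log; the kubeconfig handling and the rest are assumptions):

package main

import (
	corev1 "k8s.io/api/core/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/kubernetes/scheme"
	typedcorev1 "k8s.io/client-go/kubernetes/typed/core/v1"
	"k8s.io/client-go/tools/clientcmd"
	"k8s.io/client-go/tools/record"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	cs := kubernetes.NewForConfigOrDie(cfg)

	broadcaster := record.NewBroadcaster()
	defer broadcaster.Shutdown()
	broadcaster.StartRecordingToSink(&typedcorev1.EventSinkImpl{
		Interface: cs.CoreV1().Events(""),
	})
	recorder := broadcaster.NewRecorder(scheme.Scheme,
		corev1.EventSource{Component: "replication-controller"})

	// Only the identity matters for the ObjectReference in the log line.
	rc := &corev1.ReplicationController{}
	rc.Name, rc.Namespace = "frontend", "namespace-1565788526-12452"
	recorder.Eventf(rc, corev1.EventTypeNormal, "SuccessfulCreate",
		"Created pod: %s", "frontend-crvqs")
}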
W0814 13:15:27.689] E0814 13:15:27.688805   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:27.783] E0814 13:15:27.783350   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:27.872] E0814 13:15:27.872340   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0814 13:15:27.973] Successful describe rc:
I0814 13:15:27.973] Name:         frontend
I0814 13:15:27.973] Namespace:    namespace-1565788526-12452
I0814 13:15:27.974] Selector:     app=guestbook,tier=frontend
I0814 13:15:27.974] Labels:       app=guestbook
I0814 13:15:27.974]               tier=frontend
I0814 13:15:27.974] Annotations:  <none>
I0814 13:15:27.974] Replicas:     3 current / 3 desired
I0814 13:15:27.974] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0814 13:15:27.974] Pod Template:
I0814 13:15:27.974]   Labels:  app=guestbook
I0814 13:15:27.974]            tier=frontend
I0814 13:15:27.974]   Containers:
I0814 13:15:27.974]    php-redis:
I0814 13:15:27.975]     Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 17 lines ...
I0814 13:15:27.976] Namespace:    namespace-1565788526-12452
I0814 13:15:27.976] Selector:     app=guestbook,tier=frontend
I0814 13:15:27.976] Labels:       app=guestbook
I0814 13:15:27.976]               tier=frontend
I0814 13:15:27.976] Annotations:  <none>
I0814 13:15:27.977] Replicas:     3 current / 3 desired
I0814 13:15:27.977] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0814 13:15:27.977] Pod Template:
I0814 13:15:27.977]   Labels:  app=guestbook
I0814 13:15:27.977]            tier=frontend
I0814 13:15:27.977]   Containers:
I0814 13:15:27.977]    php-redis:
I0814 13:15:27.977]     Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 17 lines ...
I0814 13:15:27.979] Namespace:    namespace-1565788526-12452
I0814 13:15:27.979] Selector:     app=guestbook,tier=frontend
I0814 13:15:27.979] Labels:       app=guestbook
I0814 13:15:27.979]               tier=frontend
I0814 13:15:27.979] Annotations:  <none>
I0814 13:15:27.979] Replicas:     3 current / 3 desired
I0814 13:15:27.979] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0814 13:15:27.979] Pod Template:
I0814 13:15:27.979]   Labels:  app=guestbook
I0814 13:15:27.979]            tier=frontend
I0814 13:15:27.979]   Containers:
I0814 13:15:27.980]    php-redis:
I0814 13:15:27.980]     Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 11 lines ...
I0814 13:15:27.981] Namespace:    namespace-1565788526-12452
I0814 13:15:27.981] Selector:     app=guestbook,tier=frontend
I0814 13:15:27.981] Labels:       app=guestbook
I0814 13:15:27.981]               tier=frontend
I0814 13:15:27.981] Annotations:  <none>
I0814 13:15:27.981] Replicas:     3 current / 3 desired
I0814 13:15:27.981] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0814 13:15:27.981] Pod Template:
I0814 13:15:27.981]   Labels:  app=guestbook
I0814 13:15:27.981]            tier=frontend
I0814 13:15:27.981]   Containers:
I0814 13:15:27.981]    php-redis:
I0814 13:15:27.981]     Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 21 lines ...
I0814 13:15:28.632] replicationcontroller/frontend scaled
I0814 13:15:28.722] core.sh:1099: Successful get rc frontend {{.spec.replicas}}: 3
I0814 13:15:28.802] core.sh:1103: Successful get rc frontend {{.spec.replicas}}: 3
I0814 13:15:28.878] replicationcontroller/frontend scaled
I0814 13:15:28.966] core.sh:1107: Successful get rc frontend {{.spec.replicas}}: 2
I0814 13:15:29.038] replicationcontroller "frontend" deleted
W0814 13:15:29.138] E0814 13:15:27.976213   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:29.139] I0814 13:15:28.147669   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1565788526-12452", Name:"frontend", UID:"f10147d0-37bd-4459-aa87-9bdb2c8cc4a6", APIVersion:"v1", ResourceVersion:"1535", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: frontend-dzh2z
W0814 13:15:29.139] error: Expected replicas to be 3, was 2
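The "error: Expected replicas to be 3, was 2" just above is kubectl scale's --current-replicas precondition failing: the live replica count is compared before the new count is written. Approximately, via the scale subresource (a hypothetical helper, not kubectl's actual code; modern client-go signatures assumed):

package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

// scaleRCWithPrecondition mirrors what `kubectl scale --current-replicas=N`
// checks before writing the new replica count.
func scaleRCWithPrecondition(cs kubernetes.Interface, ns, name string, current, desired int32) error {
	sc, err := cs.CoreV1().ReplicationControllers(ns).GetScale(context.TODO(), name, metav1.GetOptions{})
	if err != nil {
		return err
	}
	if sc.Spec.Replicas != current {
		return fmt.Errorf("Expected replicas to be %d, was %d", current, sc.Spec.Replicas)
	}
	sc.Spec.Replicas = desired
	_, err = cs.CoreV1().ReplicationControllers(ns).UpdateScale(context.TODO(), name, sc, metav1.UpdateOptions{})
	return err
}

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	cs := kubernetes.NewForConfigOrDie(cfg)
	// Values from the failing check above: precondition 3, live count 2.
	if err := scaleRCWithPrecondition(cs, "namespace-1565788526-12452", "frontend", 3, 2); err != nil {
		fmt.Println(err)
	}
}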
W0814 13:15:29.140] I0814 13:15:28.636207   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1565788526-12452", Name:"frontend", UID:"f10147d0-37bd-4459-aa87-9bdb2c8cc4a6", APIVersion:"v1", ResourceVersion:"1541", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-p7zck
W0814 13:15:29.140] E0814 13:15:28.690159   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:29.140] E0814 13:15:28.784591   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:29.141] E0814 13:15:28.873817   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:29.141] I0814 13:15:28.883248   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1565788526-12452", Name:"frontend", UID:"f10147d0-37bd-4459-aa87-9bdb2c8cc4a6", APIVersion:"v1", ResourceVersion:"1546", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: frontend-p7zck
W0814 13:15:29.141] E0814 13:15:28.978090   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:29.199] I0814 13:15:29.198423   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1565788526-12452", Name:"redis-master", UID:"02409de2-0eb9-404b-88e2-c8587d348c22", APIVersion:"v1", ResourceVersion:"1557", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-master-kf68d
I0814 13:15:29.299] replicationcontroller/redis-master created
I0814 13:15:29.346] replicationcontroller/redis-slave created
I0814 13:15:29.437] replicationcontroller/redis-master scaled
I0814 13:15:29.440] replicationcontroller/redis-slave scaled
I0814 13:15:29.534] core.sh:1117: Successful get rc redis-master {{.spec.replicas}}: 4
... skipping 4 lines ...
W0814 13:15:29.789] I0814 13:15:29.355162   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1565788526-12452", Name:"redis-slave", UID:"80b319ff-0b61-4cc3-990d-230e6e573fa7", APIVersion:"v1", ResourceVersion:"1562", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-slave-lmx9m
W0814 13:15:29.789] I0814 13:15:29.442092   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1565788526-12452", Name:"redis-master", UID:"02409de2-0eb9-404b-88e2-c8587d348c22", APIVersion:"v1", ResourceVersion:"1569", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-master-gc8ph
W0814 13:15:29.790] I0814 13:15:29.447726   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1565788526-12452", Name:"redis-master", UID:"02409de2-0eb9-404b-88e2-c8587d348c22", APIVersion:"v1", ResourceVersion:"1569", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-master-sx2v5
W0814 13:15:29.790] I0814 13:15:29.450046   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1565788526-12452", Name:"redis-master", UID:"02409de2-0eb9-404b-88e2-c8587d348c22", APIVersion:"v1", ResourceVersion:"1569", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-master-v9nhx
W0814 13:15:29.790] I0814 13:15:29.450268   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1565788526-12452", Name:"redis-slave", UID:"80b319ff-0b61-4cc3-990d-230e6e573fa7", APIVersion:"v1", ResourceVersion:"1571", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-slave-v2kvm
W0814 13:15:29.791] I0814 13:15:29.456533   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1565788526-12452", Name:"redis-slave", UID:"80b319ff-0b61-4cc3-990d-230e6e573fa7", APIVersion:"v1", ResourceVersion:"1571", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-slave-mtnwt
W0814 13:15:29.791] E0814 13:15:29.694257   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:29.791] E0814 13:15:29.786063   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:29.860] I0814 13:15:29.859450   53127 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1565788526-12452", Name:"nginx-deployment", UID:"93bbb63d-aa47-4e48-a16f-4e05154347d2", APIVersion:"apps/v1", ResourceVersion:"1603", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-66987bfc58 to 3
W0814 13:15:29.863] I0814 13:15:29.863389   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788526-12452", Name:"nginx-deployment-66987bfc58", UID:"7ae47557-11ba-4332-a7f5-79cec9c1c0f7", APIVersion:"apps/v1", ResourceVersion:"1604", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-66987bfc58-5x8pz
W0814 13:15:29.868] I0814 13:15:29.867802   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788526-12452", Name:"nginx-deployment-66987bfc58", UID:"7ae47557-11ba-4332-a7f5-79cec9c1c0f7", APIVersion:"apps/v1", ResourceVersion:"1604", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-66987bfc58-x9v4h
W0814 13:15:29.868] I0814 13:15:29.868482   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788526-12452", Name:"nginx-deployment-66987bfc58", UID:"7ae47557-11ba-4332-a7f5-79cec9c1c0f7", APIVersion:"apps/v1", ResourceVersion:"1604", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-66987bfc58-kbf5q
W0814 13:15:29.877] E0814 13:15:29.876596   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:29.950] I0814 13:15:29.950110   53127 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1565788526-12452", Name:"nginx-deployment", UID:"93bbb63d-aa47-4e48-a16f-4e05154347d2", APIVersion:"apps/v1", ResourceVersion:"1618", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-66987bfc58 to 1
W0814 13:15:29.956] I0814 13:15:29.955998   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788526-12452", Name:"nginx-deployment-66987bfc58", UID:"7ae47557-11ba-4332-a7f5-79cec9c1c0f7", APIVersion:"apps/v1", ResourceVersion:"1619", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-66987bfc58-5x8pz
W0814 13:15:29.957] I0814 13:15:29.956786   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788526-12452", Name:"nginx-deployment-66987bfc58", UID:"7ae47557-11ba-4332-a7f5-79cec9c1c0f7", APIVersion:"apps/v1", ResourceVersion:"1619", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-66987bfc58-x9v4h
W0814 13:15:29.979] E0814 13:15:29.979102   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0814 13:15:30.080] deployment.apps/nginx-deployment created
I0814 13:15:30.081] deployment.apps/nginx-deployment scaled
I0814 13:15:30.081] core.sh:1127: Successful get deployment nginx-deployment {{.spec.replicas}}: 1
I0814 13:15:30.111] deployment.apps "nginx-deployment" deleted
I0814 13:15:30.207] Successful
I0814 13:15:30.207] message:service/expose-test-deployment exposed
I0814 13:15:30.207] has:service/expose-test-deployment exposed
I0814 13:15:30.284] service "expose-test-deployment" deleted
I0814 13:15:30.369] Successful
I0814 13:15:30.370] message:error: couldn't retrieve selectors via --selector flag or introspection: invalid deployment: no selectors, therefore cannot be exposed
I0814 13:15:30.370] See 'kubectl expose -h' for help and examples
I0814 13:15:30.370] has:invalid deployment: no selectors
I0814 13:15:30.519] deployment.apps/nginx-deployment created
I0814 13:15:30.614] core.sh:1146: Successful get deployment nginx-deployment {{.spec.replicas}}: 3
I0814 13:15:30.696] service/nginx-deployment exposed
I0814 13:15:30.785] core.sh:1150: Successful get service nginx-deployment {{(index .spec.ports 0).port}}: 80
I0814 13:15:30.857] deployment.apps "nginx-deployment" deleted
I0814 13:15:30.867] service "nginx-deployment" deleted
W0814 13:15:30.969] I0814 13:15:30.524729   53127 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1565788526-12452", Name:"nginx-deployment", UID:"43807dea-7457-4f17-8ce3-e67539b2d77a", APIVersion:"apps/v1", ResourceVersion:"1642", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-66987bfc58 to 3
W0814 13:15:30.969] I0814 13:15:30.527296   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788526-12452", Name:"nginx-deployment-66987bfc58", UID:"af271ee8-6751-4e78-bd8a-c88ad5f13f8f", APIVersion:"apps/v1", ResourceVersion:"1643", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-66987bfc58-mp8fg
W0814 13:15:30.970] I0814 13:15:30.530516   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788526-12452", Name:"nginx-deployment-66987bfc58", UID:"af271ee8-6751-4e78-bd8a-c88ad5f13f8f", APIVersion:"apps/v1", ResourceVersion:"1643", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-66987bfc58-mnhfr
W0814 13:15:30.971] I0814 13:15:30.532235   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788526-12452", Name:"nginx-deployment-66987bfc58", UID:"af271ee8-6751-4e78-bd8a-c88ad5f13f8f", APIVersion:"apps/v1", ResourceVersion:"1643", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-66987bfc58-w4lzx
W0814 13:15:30.971] E0814 13:15:30.695345   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:30.971] E0814 13:15:30.787491   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:30.971] E0814 13:15:30.882190   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:30.981] E0814 13:15:30.980374   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:31.023] I0814 13:15:31.021974   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1565788526-12452", Name:"frontend", UID:"aa870ac5-c96b-4b95-b220-4153c8855bd1", APIVersion:"v1", ResourceVersion:"1670", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-s5f6k
W0814 13:15:31.026] I0814 13:15:31.025782   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1565788526-12452", Name:"frontend", UID:"aa870ac5-c96b-4b95-b220-4153c8855bd1", APIVersion:"v1", ResourceVersion:"1670", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-gpl8r
W0814 13:15:31.026] I0814 13:15:31.025971   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1565788526-12452", Name:"frontend", UID:"aa870ac5-c96b-4b95-b220-4153c8855bd1", APIVersion:"v1", ResourceVersion:"1670", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-dsg9q
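[Note: the event.go:255 lines above come from the controller-manager's EventRecorder. A minimal, self-contained sketch of how such a 'SuccessfulCreate' event is emitted with client-go's record package; the wiring (component name, function shape) is assumed, not this repo's actual controller code.]

package events

import (
	corev1 "k8s.io/api/core/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/kubernetes/scheme"
	typedcorev1 "k8s.io/client-go/kubernetes/typed/core/v1"
	"k8s.io/client-go/tools/record"
)

// emitCreateEvent records a Normal/SuccessfulCreate event against a
// ReplicationController; a broadcaster fans the event out to sinks, which is
// how the "Created pod: ..." lines above end up both in the API and the log.
func emitCreateEvent(clientset kubernetes.Interface, rc *corev1.ReplicationController, podName string) {
	broadcaster := record.NewBroadcaster()
	// Persist events as Event objects in rc's namespace.
	broadcaster.StartRecordingToSink(&typedcorev1.EventSinkImpl{
		Interface: clientset.CoreV1().Events(rc.Namespace),
	})
	recorder := broadcaster.NewRecorder(scheme.Scheme, corev1.EventSource{Component: "replication-controller"})
	recorder.Eventf(rc, corev1.EventTypeNormal, "SuccessfulCreate", "Created pod: %s", podName)
}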
I0814 13:15:31.127] replicationcontroller/frontend created
I0814 13:15:31.127] core.sh:1157: Successful get rc frontend {{.spec.replicas}}: 3
I0814 13:15:31.197] service/frontend exposed
... skipping 11 lines ...
I0814 13:15:32.321] service "frontend" deleted
I0814 13:15:32.328] service "frontend-2" deleted
I0814 13:15:32.335] service "frontend-3" deleted
I0814 13:15:32.342] service "frontend-4" deleted
I0814 13:15:32.349] service "frontend-5" deleted
I0814 13:15:32.443] Successful
I0814 13:15:32.443] message:error: cannot expose a Node
I0814 13:15:32.443] has:cannot expose
I0814 13:15:32.533] Successful
I0814 13:15:32.534] message:The Service "invalid-large-service-name-that-has-more-than-sixty-three-characters" is invalid: metadata.name: Invalid value: "invalid-large-service-name-that-has-more-than-sixty-three-characters": must be no more than 63 characters
I0814 13:15:32.534] has:metadata.name: Invalid value
I0814 13:15:32.622] Successful
I0814 13:15:32.622] message:service/kubernetes-serve-hostname-testing-sixty-three-characters-in-len exposed
... skipping 7 lines ...
I0814 13:15:33.053] service "etcd-server" deleted
I0814 13:15:33.149] core.sh:1215: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: frontend:
I0814 13:15:33.226] replicationcontroller "frontend" deleted
I0814 13:15:33.317] core.sh:1219: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
I0814 13:15:33.401] core.sh:1223: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
I0814 13:15:33.573] replicationcontroller/frontend created
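[Note: the "Successful get ... {{range.items}}{{.metadata.name}}:{{end}}" assertions above render kubectl's go-template output. A standalone Go illustration (not the test harness itself) of how that exact template evaluates, assuming a list with a single item named "frontend"; an empty items list would render as the empty string seen in the empty-list checks.]

package main

import (
	"os"
	"text/template"
)

func main() {
	// The same template expression used by the assertions above.
	tmpl := template.Must(template.New("names").Parse(`{{range.items}}{{.metadata.name}}:{{end}}`))
	// Stand-in for the JSON list kubectl fetches from the server.
	data := map[string]interface{}{
		"items": []interface{}{
			map[string]interface{}{
				"metadata": map[string]interface{}{"name": "frontend"},
			},
		},
	}
	if err := tmpl.Execute(os.Stdout, data); err != nil {
		panic(err)
	}
	// Output: frontend:
}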
W0814 13:15:33.674] E0814 13:15:31.696578   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:33.675] E0814 13:15:31.788912   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:33.676] E0814 13:15:31.883760   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:33.677] E0814 13:15:31.981797   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:33.677] E0814 13:15:32.697586   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:33.678] E0814 13:15:32.789821   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:33.679] E0814 13:15:32.885516   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:33.680] E0814 13:15:32.983332   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:33.680] I0814 13:15:33.577896   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1565788526-12452", Name:"frontend", UID:"5501c6ec-4fa6-4b8e-bb6e-ff058e72bc68", APIVersion:"v1", ResourceVersion:"1734", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-7478l
W0814 13:15:33.681] I0814 13:15:33.581525   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1565788526-12452", Name:"frontend", UID:"5501c6ec-4fa6-4b8e-bb6e-ff058e72bc68", APIVersion:"v1", ResourceVersion:"1734", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-vk5sw
W0814 13:15:33.681] I0814 13:15:33.582030   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1565788526-12452", Name:"frontend", UID:"5501c6ec-4fa6-4b8e-bb6e-ff058e72bc68", APIVersion:"v1", ResourceVersion:"1734", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-bhddp
W0814 13:15:33.700] E0814 13:15:33.699294   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:33.741] I0814 13:15:33.740895   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1565788526-12452", Name:"redis-slave", UID:"8a1c2b41-54eb-4778-963d-517a8e9f3033", APIVersion:"v1", ResourceVersion:"1743", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-slave-f6w6b
W0814 13:15:33.746] I0814 13:15:33.745738   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1565788526-12452", Name:"redis-slave", UID:"8a1c2b41-54eb-4778-963d-517a8e9f3033", APIVersion:"v1", ResourceVersion:"1743", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-slave-k7mqx
W0814 13:15:33.792] E0814 13:15:33.791551   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:33.887] E0814 13:15:33.887141   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:33.985] E0814 13:15:33.984692   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0814 13:15:34.086] replicationcontroller/redis-slave created
I0814 13:15:34.086] core.sh:1228: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: frontend:redis-slave:
I0814 13:15:34.087] core.sh:1232: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: frontend:redis-slave:
I0814 13:15:34.087] replicationcontroller "frontend" deleted
I0814 13:15:34.087] replicationcontroller "redis-slave" deleted
I0814 13:15:34.133] core.sh:1236: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
... skipping 6 lines ...
I0814 13:15:34.776] horizontalpodautoscaler.autoscaling/frontend autoscaled
I0814 13:15:34.863] core.sh:1250: Successful get hpa frontend {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 2 3 80
I0814 13:15:34.937] horizontalpodautoscaler.autoscaling "frontend" deleted
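[Note: the hpa lines above check an autoscaler with min=2, max=3, target CPU 80%, presumably created by a kubectl autoscale invocation in core.sh. A hedged Go sketch of the equivalent autoscaling/v1 object created through client-go; the function shape is assumed and the signatures follow current client-go.]

package hpa

import (
	"context"

	autoscalingv1 "k8s.io/api/autoscaling/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
)

// createFrontendHPA builds the 2/3/80 autoscaler asserted above, targeting
// the "frontend" ReplicationController.
func createFrontendHPA(ctx context.Context, cs kubernetes.Interface, ns string) error {
	min := int32(2)
	cpu := int32(80)
	hpa := &autoscalingv1.HorizontalPodAutoscaler{
		ObjectMeta: metav1.ObjectMeta{Name: "frontend", Namespace: ns},
		Spec: autoscalingv1.HorizontalPodAutoscalerSpec{
			ScaleTargetRef: autoscalingv1.CrossVersionObjectReference{
				Kind: "ReplicationController", Name: "frontend", APIVersion: "v1",
			},
			MinReplicas:                    &min,
			MaxReplicas:                    3,
			TargetCPUUtilizationPercentage: &cpu,
		},
	}
	_, err := cs.AutoscalingV1().HorizontalPodAutoscalers(ns).Create(ctx, hpa, metav1.CreateOptions{})
	return err
}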
W0814 13:15:35.037] I0814 13:15:34.361535   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1565788526-12452", Name:"frontend", UID:"9d8f1a40-b10c-4068-9af7-f590630b6dc1", APIVersion:"v1", ResourceVersion:"1763", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-4jxxj
W0814 13:15:35.038] I0814 13:15:34.365163   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1565788526-12452", Name:"frontend", UID:"9d8f1a40-b10c-4068-9af7-f590630b6dc1", APIVersion:"v1", ResourceVersion:"1763", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-pbbwx
W0814 13:15:35.038] I0814 13:15:34.367623   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1565788526-12452", Name:"frontend", UID:"9d8f1a40-b10c-4068-9af7-f590630b6dc1", APIVersion:"v1", ResourceVersion:"1763", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-9cf65
W0814 13:15:35.039] E0814 13:15:34.700523   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:35.039] E0814 13:15:34.792891   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:35.039] E0814 13:15:34.888546   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:35.039] E0814 13:15:34.986007   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:35.039] Error: required flag(s) "max" not set
W0814 13:15:35.039] 
W0814 13:15:35.039] 
W0814 13:15:35.040] Examples:
W0814 13:15:35.040]   # Auto scale a deployment "foo", with the number of pods between 2 and 10, no target CPU utilization specified so a default autoscaling policy will be used:
W0814 13:15:35.040]   kubectl autoscale deployment foo --min=2 --max=10
W0814 13:15:35.040]   
... skipping 54 lines ...
I0814 13:15:35.260]           limits:
I0814 13:15:35.260]             cpu: 300m
I0814 13:15:35.260]           requests:
I0814 13:15:35.260]             cpu: 300m
I0814 13:15:35.260]       terminationGracePeriodSeconds: 0
I0814 13:15:35.260] status: {}
W0814 13:15:35.360] Error from server (NotFound): deployments.apps "nginx-deployment-resources" not found
I0814 13:15:35.492] deployment.apps/nginx-deployment-resources created
I0814 13:15:35.589] core.sh:1265: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx-deployment-resources:
I0814 13:15:35.672] core.sh:1266: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
I0814 13:15:35.753] core.sh:1267: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
I0814 13:15:35.837] deployment.apps/nginx-deployment-resources resource requirements updated
I0814 13:15:35.927] core.sh:1270: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).resources.limits.cpu}}:{{end}}: 100m:
I0814 13:15:36.008] core.sh:1271: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.limits.cpu}}:{{end}}: 100m:
I0814 13:15:36.180] deployment.apps/nginx-deployment-resources resource requirements updated
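[Note: the "resource requirements updated" lines above correspond to kubectl set resources edits that the later core.sh assertions verify (limits moving from 100m to 200m/300m). A hedged Go sketch of the same kind of change via a strategic-merge patch; container and deployment names are parameters here, the helper is hypothetical, and the signatures follow current client-go.]

package resources

import (
	"context"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/apimachinery/pkg/types"
	"k8s.io/client-go/kubernetes"
)

// setCPULimit rewrites one container's CPU limit. Strategic merge patches
// merge list entries in spec.template.spec.containers by their "name" key,
// so only the named container is touched.
func setCPULimit(ctx context.Context, cs kubernetes.Interface, ns, deploy, container, cpu string) error {
	patch := `{"spec":{"template":{"spec":{"containers":[{"name":"` + container +
		`","resources":{"limits":{"cpu":"` + cpu + `"}}}]}}}}`
	_, err := cs.AppsV1().Deployments(ns).Patch(ctx, deploy,
		types.StrategicMergePatchType, []byte(patch), metav1.PatchOptions{})
	return err
}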
W0814 13:15:36.281] I0814 13:15:35.499661   53127 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1565788526-12452", Name:"nginx-deployment-resources", UID:"ddaa62b1-4f05-4379-8e08-b75c460a020a", APIVersion:"apps/v1", ResourceVersion:"1784", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-resources-6dbb5769d7 to 3
W0814 13:15:36.281] I0814 13:15:35.503997   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788526-12452", Name:"nginx-deployment-resources-6dbb5769d7", UID:"9b70a662-c64a-4efe-b660-4cc41590d032", APIVersion:"apps/v1", ResourceVersion:"1785", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-6dbb5769d7-g8554
W0814 13:15:36.282] I0814 13:15:35.507120   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788526-12452", Name:"nginx-deployment-resources-6dbb5769d7", UID:"9b70a662-c64a-4efe-b660-4cc41590d032", APIVersion:"apps/v1", ResourceVersion:"1785", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-6dbb5769d7-sdrm8
W0814 13:15:36.282] I0814 13:15:35.507548   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788526-12452", Name:"nginx-deployment-resources-6dbb5769d7", UID:"9b70a662-c64a-4efe-b660-4cc41590d032", APIVersion:"apps/v1", ResourceVersion:"1785", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-6dbb5769d7-wsdbg
W0814 13:15:36.282] E0814 13:15:35.702125   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:36.282] E0814 13:15:35.794536   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:36.283] I0814 13:15:35.842203   53127 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1565788526-12452", Name:"nginx-deployment-resources", UID:"ddaa62b1-4f05-4379-8e08-b75c460a020a", APIVersion:"apps/v1", ResourceVersion:"1798", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-resources-58d7fb85cf to 1
W0814 13:15:36.283] I0814 13:15:35.845418   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788526-12452", Name:"nginx-deployment-resources-58d7fb85cf", UID:"47efad59-0109-492b-b32a-bec0cec8ac70", APIVersion:"apps/v1", ResourceVersion:"1799", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-58d7fb85cf-7286j
W0814 13:15:36.283] E0814 13:15:35.889666   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:36.284] E0814 13:15:35.987355   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:36.284] error: unable to find container named redis
W0814 13:15:36.284] I0814 13:15:36.196196   53127 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1565788526-12452", Name:"nginx-deployment-resources", UID:"ddaa62b1-4f05-4379-8e08-b75c460a020a", APIVersion:"apps/v1", ResourceVersion:"1808", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-resources-58d7fb85cf to 0
W0814 13:15:36.284] I0814 13:15:36.201356   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788526-12452", Name:"nginx-deployment-resources-58d7fb85cf", UID:"47efad59-0109-492b-b32a-bec0cec8ac70", APIVersion:"apps/v1", ResourceVersion:"1812", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-resources-58d7fb85cf-7286j
W0814 13:15:36.285] I0814 13:15:36.220018   53127 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1565788526-12452", Name:"nginx-deployment-resources", UID:"ddaa62b1-4f05-4379-8e08-b75c460a020a", APIVersion:"apps/v1", ResourceVersion:"1811", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-resources-5cd64dc74f to 1
W0814 13:15:36.285] I0814 13:15:36.228026   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788526-12452", Name:"nginx-deployment-resources-5cd64dc74f", UID:"0e68a408-64df-4d0d-bed6-ad169b42026a", APIVersion:"apps/v1", ResourceVersion:"1817", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-5cd64dc74f-trqlp
I0814 13:15:36.385] core.sh:1276: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).resources.limits.cpu}}:{{end}}: 200m:
I0814 13:15:36.386] core.sh:1277: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.limits.cpu}}:{{end}}: 100m:
... skipping 201 lines ...
I0814 13:15:36.827]   unavailableReplicas: 4
I0814 13:15:36.827]   updatedReplicas: 1
W0814 13:15:36.928] I0814 13:15:36.478852   53127 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1565788526-12452", Name:"nginx-deployment-resources", UID:"ddaa62b1-4f05-4379-8e08-b75c460a020a", APIVersion:"apps/v1", ResourceVersion:"1828", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-resources-6dbb5769d7 to 2
W0814 13:15:36.929] I0814 13:15:36.484152   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788526-12452", Name:"nginx-deployment-resources-6dbb5769d7", UID:"9b70a662-c64a-4efe-b660-4cc41590d032", APIVersion:"apps/v1", ResourceVersion:"1832", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-resources-6dbb5769d7-g8554
W0814 13:15:36.929] I0814 13:15:36.497871   53127 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1565788526-12452", Name:"nginx-deployment-resources", UID:"ddaa62b1-4f05-4379-8e08-b75c460a020a", APIVersion:"apps/v1", ResourceVersion:"1831", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-resources-8586dd678 to 1
W0814 13:15:36.929] I0814 13:15:36.501350   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788526-12452", Name:"nginx-deployment-resources-8586dd678", UID:"b5c11975-1b75-4c33-8e5f-4ff9dda90de5", APIVersion:"apps/v1", ResourceVersion:"1839", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-8586dd678-svm7w
W0814 13:15:36.930] E0814 13:15:36.703428   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:36.930] E0814 13:15:36.795756   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:36.930] error: you must specify resources by --filename when --local is set.
W0814 13:15:36.930] Example resource specifications include:
W0814 13:15:36.930]    '-f rsrc.yaml'
W0814 13:15:36.930]    '--filename=rsrc.json'
W0814 13:15:36.930] E0814 13:15:36.891087   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:36.989] E0814 13:15:36.988669   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0814 13:15:37.090] core.sh:1286: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).resources.limits.cpu}}:{{end}}: 200m:
I0814 13:15:37.090] core.sh:1287: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.limits.cpu}}:{{end}}: 300m:
I0814 13:15:37.129] core.sh:1288: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.requests.cpu}}:{{end}}: 300m:
I0814 13:15:37.202] deployment.apps "nginx-deployment-resources" deleted
I0814 13:15:37.222] +++ exit code: 0
I0814 13:15:37.261] Recording: run_deployment_tests
... skipping 22 lines ...
I0814 13:15:38.079] has:10
I0814 13:15:38.157] Successful
I0814 13:15:38.157] message:apps/v1
I0814 13:15:38.157] has:apps/v1
W0814 13:15:38.258] I0814 13:15:37.520349   53127 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1565788537-25904", Name:"test-nginx-extensions", UID:"e02097b7-7fea-4209-a0d8-683797beda90", APIVersion:"apps/v1", ResourceVersion:"1863", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set test-nginx-extensions-574b6dd4f9 to 1
W0814 13:15:38.258] I0814 13:15:37.526594   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788537-25904", Name:"test-nginx-extensions-574b6dd4f9", UID:"19bf1b55-22b6-43b1-a26a-19a44afa3bd1", APIVersion:"apps/v1", ResourceVersion:"1864", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test-nginx-extensions-574b6dd4f9-6qj5s
W0814 13:15:38.258] E0814 13:15:37.704960   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:38.259] E0814 13:15:37.797032   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:38.259] E0814 13:15:37.892426   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:38.259] I0814 13:15:37.914721   53127 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1565788537-25904", Name:"test-nginx-apps", UID:"2fd847d7-aabd-4818-ae6d-1e2d7dc399d8", APIVersion:"apps/v1", ResourceVersion:"1877", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set test-nginx-apps-7fb7df9785 to 1
W0814 13:15:38.260] I0814 13:15:37.919283   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788537-25904", Name:"test-nginx-apps-7fb7df9785", UID:"d488053d-58bc-4be3-b6d7-333c34679e54", APIVersion:"apps/v1", ResourceVersion:"1878", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test-nginx-apps-7fb7df9785-p9xjw
W0814 13:15:38.260] E0814 13:15:37.989961   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0814 13:15:38.360] Successful describe rs:
I0814 13:15:38.361] Name:           test-nginx-apps-7fb7df9785
I0814 13:15:38.361] Namespace:      namespace-1565788537-25904
I0814 13:15:38.361] Selector:       app=test-nginx-apps,pod-template-hash=7fb7df9785
I0814 13:15:38.361] Labels:         app=test-nginx-apps
I0814 13:15:38.361]                 pod-template-hash=7fb7df9785
I0814 13:15:38.361] Annotations:    deployment.kubernetes.io/desired-replicas: 1
I0814 13:15:38.361]                 deployment.kubernetes.io/max-replicas: 2
I0814 13:15:38.361]                 deployment.kubernetes.io/revision: 1
I0814 13:15:38.361] Controlled By:  Deployment/test-nginx-apps
I0814 13:15:38.361] Replicas:       1 current / 1 desired
I0814 13:15:38.362] Pods Status:    0 Running / 1 Waiting / 0 Succeeded / 0 Failed
I0814 13:15:38.362] Pod Template:
I0814 13:15:38.362]   Labels:  app=test-nginx-apps
I0814 13:15:38.362]            pod-template-hash=7fb7df9785
I0814 13:15:38.362]   Containers:
I0814 13:15:38.362]    nginx:
I0814 13:15:38.362]     Image:        k8s.gcr.io/nginx:test-cmd
... skipping 49 lines ...
I0814 13:15:39.960] apps.sh:247: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
I0814 13:15:40.036] deployment.apps/nginx-deployment created
I0814 13:15:40.127] apps.sh:251: Successful get rs {{range.items}}{{.spec.replicas}}{{end}}: 1
I0814 13:15:40.203] deployment.apps "nginx-deployment" deleted
W0814 13:15:40.304] I0814 13:15:38.654240   53127 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1565788537-25904", Name:"nginx-with-command", UID:"3e3589ba-2220-40d8-b4c3-c5818da2ee27", APIVersion:"apps/v1", ResourceVersion:"1892", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-with-command-78b77c48d8 to 1
W0814 13:15:40.305] I0814 13:15:38.658682   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788537-25904", Name:"nginx-with-command-78b77c48d8", UID:"01b829d8-9584-4c4c-ae7e-c4da558f4f6a", APIVersion:"apps/v1", ResourceVersion:"1893", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-with-command-78b77c48d8-6pwg2
W0814 13:15:40.305] E0814 13:15:38.706094   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:40.305] E0814 13:15:38.798186   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:40.305] E0814 13:15:38.893541   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:40.306] E0814 13:15:38.991605   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:40.306] I0814 13:15:39.056976   53127 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1565788537-25904", Name:"deployment-with-unixuserid", UID:"245f13f8-b2d3-43f5-85ea-d446ed9165ab", APIVersion:"apps/v1", ResourceVersion:"1906", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set deployment-with-unixuserid-6f4d54669 to 1
W0814 13:15:40.306] I0814 13:15:39.061021   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788537-25904", Name:"deployment-with-unixuserid-6f4d54669", UID:"c3236e78-84de-4a85-a27c-0107777e2722", APIVersion:"apps/v1", ResourceVersion:"1907", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: deployment-with-unixuserid-6f4d54669-cbmmq
W0814 13:15:40.307] I0814 13:15:39.452449   53127 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1565788537-25904", Name:"nginx-deployment", UID:"a3bf8221-87df-4385-9008-9a61095307ba", APIVersion:"apps/v1", ResourceVersion:"1920", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-66987bfc58 to 3
W0814 13:15:40.307] I0814 13:15:39.455897   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788537-25904", Name:"nginx-deployment-66987bfc58", UID:"6cbfe506-c2fe-4104-8b48-ac62dbbdc752", APIVersion:"apps/v1", ResourceVersion:"1921", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-66987bfc58-bztx4
W0814 13:15:40.307] I0814 13:15:39.460422   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788537-25904", Name:"nginx-deployment-66987bfc58", UID:"6cbfe506-c2fe-4104-8b48-ac62dbbdc752", APIVersion:"apps/v1", ResourceVersion:"1921", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-66987bfc58-2kczb
W0814 13:15:40.308] I0814 13:15:39.461033   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788537-25904", Name:"nginx-deployment-66987bfc58", UID:"6cbfe506-c2fe-4104-8b48-ac62dbbdc752", APIVersion:"apps/v1", ResourceVersion:"1921", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-66987bfc58-hhh9d
W0814 13:15:40.308] E0814 13:15:39.707253   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:40.308] E0814 13:15:39.799160   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:40.308] E0814 13:15:39.894626   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:40.309] E0814 13:15:39.992987   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:40.309] I0814 13:15:40.040642   53127 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1565788537-25904", Name:"nginx-deployment", UID:"3c0fd3d5-41b6-4dcb-bddd-d1910f0bc77c", APIVersion:"apps/v1", ResourceVersion:"1943", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-6fd788478f to 1
W0814 13:15:40.309] I0814 13:15:40.045176   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788537-25904", Name:"nginx-deployment-6fd788478f", UID:"c41ef201-38e3-40db-8e25-8cacc2469dc8", APIVersion:"apps/v1", ResourceVersion:"1944", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6fd788478f-sr5wb
I0814 13:15:40.410] apps.sh:256: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
I0814 13:15:40.410] apps.sh:257: Successful get rs {{range.items}}{{.spec.replicas}}{{end}}: 1
I0814 13:15:40.555] replicaset.apps "nginx-deployment-6fd788478f" deleted
I0814 13:15:40.641] apps.sh:265: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
... skipping 11 lines ...
I0814 13:15:41.805] apps.sh:287: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
I0814 13:15:41.955] deployment.apps/nginx configured
I0814 13:15:42.051] apps.sh:290: Successful get deployment.apps {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
I0814 13:15:42.140]     Image:	k8s.gcr.io/nginx:test-cmd
I0814 13:15:42.221] apps.sh:293: Successful get deployment.apps {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
I0814 13:15:42.313] deployment.apps/nginx rolled back
W0814 13:15:42.413] E0814 13:15:40.708789   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:42.414] I0814 13:15:40.793191   53127 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1565788537-25904", Name:"nginx-deployment", UID:"8d065911-1de6-4611-92f6-1461b2c17125", APIVersion:"apps/v1", ResourceVersion:"1961", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-66987bfc58 to 3
W0814 13:15:42.414] I0814 13:15:40.797039   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788537-25904", Name:"nginx-deployment-66987bfc58", UID:"755dfe01-3a75-432f-847c-d9bb5c53b4f6", APIVersion:"apps/v1", ResourceVersion:"1962", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-66987bfc58-hrt4g
W0814 13:15:42.415] I0814 13:15:40.799656   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788537-25904", Name:"nginx-deployment-66987bfc58", UID:"755dfe01-3a75-432f-847c-d9bb5c53b4f6", APIVersion:"apps/v1", ResourceVersion:"1962", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-66987bfc58-fml9w
W0814 13:15:42.415] E0814 13:15:40.800759   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:42.415] I0814 13:15:40.802276   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788537-25904", Name:"nginx-deployment-66987bfc58", UID:"755dfe01-3a75-432f-847c-d9bb5c53b4f6", APIVersion:"apps/v1", ResourceVersion:"1962", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-66987bfc58-m5p2k
W0814 13:15:42.415] E0814 13:15:40.895810   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:42.416] E0814 13:15:40.994252   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:42.416] I0814 13:15:41.457271   53127 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1565788537-25904", Name:"nginx", UID:"3f7373fc-69b5-41b3-a113-7da8dda8baea", APIVersion:"apps/v1", ResourceVersion:"1985", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-bbbbb95b5 to 3
W0814 13:15:42.416] I0814 13:15:41.460796   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788537-25904", Name:"nginx-bbbbb95b5", UID:"695a7c54-cd70-45cc-88a3-de4ae1233030", APIVersion:"apps/v1", ResourceVersion:"1986", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-bbbbb95b5-sq57t
W0814 13:15:42.417] I0814 13:15:41.463197   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788537-25904", Name:"nginx-bbbbb95b5", UID:"695a7c54-cd70-45cc-88a3-de4ae1233030", APIVersion:"apps/v1", ResourceVersion:"1986", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-bbbbb95b5-6cl6n
W0814 13:15:42.417] I0814 13:15:41.465110   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788537-25904", Name:"nginx-bbbbb95b5", UID:"695a7c54-cd70-45cc-88a3-de4ae1233030", APIVersion:"apps/v1", ResourceVersion:"1986", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-bbbbb95b5-5lqqr
W0814 13:15:42.417] E0814 13:15:41.709998   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:42.417] E0814 13:15:41.802024   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:42.418] E0814 13:15:41.897176   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:42.418] Warning: kubectl apply should be used on resource created by either kubectl create --save-config or kubectl apply
W0814 13:15:42.418] I0814 13:15:41.959492   53127 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1565788537-25904", Name:"nginx", UID:"3f7373fc-69b5-41b3-a113-7da8dda8baea", APIVersion:"apps/v1", ResourceVersion:"1999", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-56b84d547f to 1
W0814 13:15:42.418] I0814 13:15:41.965480   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788537-25904", Name:"nginx-56b84d547f", UID:"6100ceb5-b268-498f-a414-d1c1b72c2d8f", APIVersion:"apps/v1", ResourceVersion:"2000", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-56b84d547f-ng22l
W0814 13:15:42.419] E0814 13:15:41.995451   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:42.712] E0814 13:15:42.711498   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:42.804] E0814 13:15:42.803468   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:42.899] E0814 13:15:42.898351   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:42.998] E0814 13:15:42.997310   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0814 13:15:43.408] apps.sh:297: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
I0814 13:15:43.584] apps.sh:300: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
I0814 13:15:43.674] deployment.apps/nginx rolled back
W0814 13:15:43.775] error: unable to find specified revision 1000000 in history
W0814 13:15:43.775] E0814 13:15:43.712504   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:43.805] E0814 13:15:43.804790   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:43.900] E0814 13:15:43.900103   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:43.999] E0814 13:15:43.998562   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:44.714] E0814 13:15:44.713995   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:44.806] E0814 13:15:44.806109   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:44.901] E0814 13:15:44.901193   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:44.950] error: you cannot rollback a paused deployment; resume it first with 'kubectl rollout resume deployment/nginx' and try again
W0814 13:15:45.000] E0814 13:15:44.999770   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:45.034] error: deployments.apps "nginx" can't restart paused deployment (run rollout resume first)
I0814 13:15:45.134] apps.sh:304: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
I0814 13:15:45.135] deployment.apps/nginx paused
I0814 13:15:45.135] deployment.apps/nginx resumed
I0814 13:15:45.230] deployment.apps/nginx rolled back
I0814 13:15:45.424]     deployment.kubernetes.io/revision-history: 1,3
W0814 13:15:45.607] error: desired revision (3) is different from the running revision (5)
W0814 13:15:45.707] I0814 13:15:45.707113   53127 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1565788537-25904", Name:"nginx", UID:"3f7373fc-69b5-41b3-a113-7da8dda8baea", APIVersion:"apps/v1", ResourceVersion:"2030", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-56b84d547f to 0
W0814 13:15:45.712] I0814 13:15:45.711668   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788537-25904", Name:"nginx-56b84d547f", UID:"6100ceb5-b268-498f-a414-d1c1b72c2d8f", APIVersion:"apps/v1", ResourceVersion:"2034", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-56b84d547f-ng22l
W0814 13:15:45.715] E0814 13:15:45.715081   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:45.718] I0814 13:15:45.717724   53127 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1565788537-25904", Name:"nginx", UID:"3f7373fc-69b5-41b3-a113-7da8dda8baea", APIVersion:"apps/v1", ResourceVersion:"2033", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-6bf6759bb4 to 1
W0814 13:15:45.723] I0814 13:15:45.722474   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788537-25904", Name:"nginx-6bf6759bb4", UID:"f63ada88-5bc1-4f4f-b63d-dc6f87acf1c0", APIVersion:"apps/v1", ResourceVersion:"2040", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-6bf6759bb4-6gsfd
W0814 13:15:45.808] E0814 13:15:45.807682   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:45.903] E0814 13:15:45.902952   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:46.002] E0814 13:15:46.001913   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0814 13:15:46.103] deployment.apps/nginx restarted
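[Note: the paused/resumed/rolled back/restarted lines above are all mutations of Deployment fields; in particular, kubectl rollout restart works by stamping the pod template with a kubectl.kubernetes.io/restartedAt annotation, which changes the template hash and triggers a new ReplicaSet rollout (hence the scale-up/scale-down events that follow). A hedged Go sketch of that mechanism; the helper is hypothetical and the signatures follow current client-go.]

package rollout

import (
	"context"
	"fmt"
	"time"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/apimachinery/pkg/types"
	"k8s.io/client-go/kubernetes"
)

// restartDeployment mimics `kubectl rollout restart deployment/<name>`:
// updating the restartedAt annotation alters the pod template, so the
// deployment controller rolls out a fresh ReplicaSet. This fails on a
// paused deployment only at the controller level; kubectl additionally
// rejects it up front, as the "can't restart paused deployment" error shows.
func restartDeployment(ctx context.Context, cs kubernetes.Interface, ns, name string) error {
	patch := fmt.Sprintf(
		`{"spec":{"template":{"metadata":{"annotations":{"kubectl.kubernetes.io/restartedAt":%q}}}}}`,
		time.Now().Format(time.RFC3339))
	_, err := cs.AppsV1().Deployments(ns).Patch(ctx, name,
		types.StrategicMergePatchType, []byte(patch), metav1.PatchOptions{})
	return err
}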
W0814 13:15:46.717] E0814 13:15:46.716385   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:46.809] E0814 13:15:46.808884   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:46.904] E0814 13:15:46.904254   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:47.003] E0814 13:15:47.003149   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:47.020] I0814 13:15:47.019539   53127 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1565788537-25904", Name:"nginx2", UID:"b88bce84-45b5-4a43-b594-57ca3b81ebdc", APIVersion:"apps/v1", ResourceVersion:"2051", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx2-659456565 to 3
W0814 13:15:47.025] I0814 13:15:47.024530   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788537-25904", Name:"nginx2-659456565", UID:"2d9f78e0-bc47-4d9b-b258-09a7763fece2", APIVersion:"apps/v1", ResourceVersion:"2052", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx2-659456565-hwdfw
W0814 13:15:47.029] I0814 13:15:47.028208   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788537-25904", Name:"nginx2-659456565", UID:"2d9f78e0-bc47-4d9b-b258-09a7763fece2", APIVersion:"apps/v1", ResourceVersion:"2052", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx2-659456565-xjqcx
W0814 13:15:47.032] I0814 13:15:47.031402   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788537-25904", Name:"nginx2-659456565", UID:"2d9f78e0-bc47-4d9b-b258-09a7763fece2", APIVersion:"apps/v1", ResourceVersion:"2052", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx2-659456565-29w8c
I0814 13:15:47.132] Successful
I0814 13:15:47.133] message:apiVersion: apps/v1
... skipping 149 lines ...
I0814 13:15:49.482] apps.sh:371: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
I0814 13:15:49.618] deployment.apps/nginx-deployment created
W0814 13:15:49.719] I0814 13:15:47.414591   53127 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1565788537-25904", Name:"nginx-deployment", UID:"e6ad219e-d021-4383-8547-9e988ba5a7ae", APIVersion:"apps/v1", ResourceVersion:"2085", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-75547f8f9b to 3
W0814 13:15:49.720] I0814 13:15:47.418705   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788537-25904", Name:"nginx-deployment-75547f8f9b", UID:"f5f6699d-d92d-4ce1-b985-1484aa195f28", APIVersion:"apps/v1", ResourceVersion:"2086", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-75547f8f9b-x7bz7
W0814 13:15:49.720] I0814 13:15:47.420854   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788537-25904", Name:"nginx-deployment-75547f8f9b", UID:"f5f6699d-d92d-4ce1-b985-1484aa195f28", APIVersion:"apps/v1", ResourceVersion:"2086", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-75547f8f9b-crwvj
W0814 13:15:49.721] I0814 13:15:47.421516   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788537-25904", Name:"nginx-deployment-75547f8f9b", UID:"f5f6699d-d92d-4ce1-b985-1484aa195f28", APIVersion:"apps/v1", ResourceVersion:"2086", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-75547f8f9b-8xfqb
W0814 13:15:49.721] E0814 13:15:47.717891   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:49.721] I0814 13:15:47.770284   53127 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1565788537-25904", Name:"nginx-deployment", UID:"e6ad219e-d021-4383-8547-9e988ba5a7ae", APIVersion:"apps/v1", ResourceVersion:"2099", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-6ccccffbb4 to 1
W0814 13:15:49.721] I0814 13:15:47.773964   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788537-25904", Name:"nginx-deployment-6ccccffbb4", UID:"ad514a23-ccf9-4d78-90dc-b2b19b63a2fc", APIVersion:"apps/v1", ResourceVersion:"2100", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6ccccffbb4-qjzpz
W0814 13:15:49.722] E0814 13:15:47.810278   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:49.722] E0814 13:15:47.905624   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:49.722] E0814 13:15:48.004667   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:49.722] error: unable to find container named "redis"
W0814 13:15:49.722] E0814 13:15:48.719274   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:49.723] E0814 13:15:48.811392   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:49.723] I0814 13:15:48.892057   53127 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1565788537-25904", Name:"nginx-deployment", UID:"e6ad219e-d021-4383-8547-9e988ba5a7ae", APIVersion:"apps/v1", ResourceVersion:"2118", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-6ccccffbb4 to 0
W0814 13:15:49.723] I0814 13:15:48.896289   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788537-25904", Name:"nginx-deployment-6ccccffbb4", UID:"ad514a23-ccf9-4d78-90dc-b2b19b63a2fc", APIVersion:"apps/v1", ResourceVersion:"2122", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-6ccccffbb4-qjzpz
W0814 13:15:49.723] E0814 13:15:48.906851   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:49.724] I0814 13:15:48.912446   53127 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1565788537-25904", Name:"nginx-deployment", UID:"e6ad219e-d021-4383-8547-9e988ba5a7ae", APIVersion:"apps/v1", ResourceVersion:"2120", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-6596c84b98 to 1
W0814 13:15:49.724] I0814 13:15:48.916230   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788537-25904", Name:"nginx-deployment-6596c84b98", UID:"5b783804-2a6c-4138-bb59-edf5b18630e4", APIVersion:"apps/v1", ResourceVersion:"2129", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6596c84b98-s5mtl
W0814 13:15:49.724] E0814 13:15:49.006131   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:49.724] I0814 13:15:49.526683   53127 horizontal.go:341] Horizontal Pod Autoscaler frontend has been deleted in namespace-1565788526-12452
W0814 13:15:49.725] I0814 13:15:49.622867   53127 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1565788537-25904", Name:"nginx-deployment", UID:"1e93733e-2aa2-4da7-a1c1-5fc809e032a4", APIVersion:"apps/v1", ResourceVersion:"2150", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-75547f8f9b to 3
W0814 13:15:49.725] I0814 13:15:49.627067   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788537-25904", Name:"nginx-deployment-75547f8f9b", UID:"3e66b3dc-84be-4d0d-a468-c2474d1199b9", APIVersion:"apps/v1", ResourceVersion:"2151", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-75547f8f9b-c67sn
W0814 13:15:49.725] I0814 13:15:49.630649   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788537-25904", Name:"nginx-deployment-75547f8f9b", UID:"3e66b3dc-84be-4d0d-a468-c2474d1199b9", APIVersion:"apps/v1", ResourceVersion:"2151", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-75547f8f9b-pk8bk
W0814 13:15:49.726] I0814 13:15:49.632167   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788537-25904", Name:"nginx-deployment-75547f8f9b", UID:"3e66b3dc-84be-4d0d-a468-c2474d1199b9", APIVersion:"apps/v1", ResourceVersion:"2151", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-75547f8f9b-4xvtg
W0814 13:15:49.726] E0814 13:15:49.720651   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:49.813] E0814 13:15:49.812665   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:49.909] E0814 13:15:49.908422   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:50.007] E0814 13:15:50.007088   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0814 13:15:50.108] configmap/test-set-env-config created
I0814 13:15:50.108] secret/test-set-env-secret created
I0814 13:15:50.108] apps.sh:376: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx-deployment:
I0814 13:15:50.108] apps.sh:378: Successful get configmaps/test-set-env-config {{.metadata.name}}: test-set-env-config
I0814 13:15:50.183] apps.sh:379: Successful get secret {{range.items}}{{.metadata.name}}:{{end}}: test-set-env-secret:
I0814 13:15:50.278] deployment.apps/nginx-deployment env updated
... skipping 5 lines ...
W0814 13:15:50.828] I0814 13:15:50.283369   53127 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1565788537-25904", Name:"nginx-deployment", UID:"1e93733e-2aa2-4da7-a1c1-5fc809e032a4", APIVersion:"apps/v1", ResourceVersion:"2167", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-586b69f9d8 to 1
W0814 13:15:50.829] I0814 13:15:50.288229   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788537-25904", Name:"nginx-deployment-586b69f9d8", UID:"4bc00cf5-7348-4526-9573-2320e782b1e8", APIVersion:"apps/v1", ResourceVersion:"2168", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-586b69f9d8-72zwf
W0814 13:15:50.830] I0814 13:15:50.557100   53127 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1565788537-25904", Name:"nginx-deployment", UID:"1e93733e-2aa2-4da7-a1c1-5fc809e032a4", APIVersion:"apps/v1", ResourceVersion:"2177", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-75547f8f9b to 2
W0814 13:15:50.830] I0814 13:15:50.561990   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788537-25904", Name:"nginx-deployment-75547f8f9b", UID:"3e66b3dc-84be-4d0d-a468-c2474d1199b9", APIVersion:"apps/v1", ResourceVersion:"2181", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-75547f8f9b-c67sn
W0814 13:15:50.831] I0814 13:15:50.573969   53127 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1565788537-25904", Name:"nginx-deployment", UID:"1e93733e-2aa2-4da7-a1c1-5fc809e032a4", APIVersion:"apps/v1", ResourceVersion:"2180", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-f48fcfc54 to 1
W0814 13:15:50.831] I0814 13:15:50.578388   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788537-25904", Name:"nginx-deployment-f48fcfc54", UID:"0ef41036-5e2b-4f99-80ef-c2b0fcd800cd", APIVersion:"apps/v1", ResourceVersion:"2188", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-f48fcfc54-5wptq
W0814 13:15:50.832] E0814 13:15:50.722160   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:50.832] I0814 13:15:50.743911   53127 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1565788537-25904", Name:"nginx-deployment", UID:"1e93733e-2aa2-4da7-a1c1-5fc809e032a4", APIVersion:"apps/v1", ResourceVersion:"2197", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-75547f8f9b to 1
W0814 13:15:50.832] I0814 13:15:50.748968   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788537-25904", Name:"nginx-deployment-75547f8f9b", UID:"3e66b3dc-84be-4d0d-a468-c2474d1199b9", APIVersion:"apps/v1", ResourceVersion:"2201", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-75547f8f9b-4xvtg
W0814 13:15:50.833] I0814 13:15:50.756058   53127 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1565788537-25904", Name:"nginx-deployment", UID:"1e93733e-2aa2-4da7-a1c1-5fc809e032a4", APIVersion:"apps/v1", ResourceVersion:"2200", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-7dc985db9b to 1
W0814 13:15:50.833] I0814 13:15:50.767197   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788537-25904", Name:"nginx-deployment-7dc985db9b", UID:"647a87c5-f587-493c-a4e8-8423ef71248e", APIVersion:"apps/v1", ResourceVersion:"2207", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-7dc985db9b-4nql6
W0814 13:15:50.834] E0814 13:15:50.813402   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:50.873] I0814 13:15:50.873063   53127 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1565788537-25904", Name:"nginx-deployment", UID:"1e93733e-2aa2-4da7-a1c1-5fc809e032a4", APIVersion:"apps/v1", ResourceVersion:"2218", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-75547f8f9b to 0
W0814 13:15:50.878] I0814 13:15:50.878216   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788537-25904", Name:"nginx-deployment-75547f8f9b", UID:"3e66b3dc-84be-4d0d-a468-c2474d1199b9", APIVersion:"apps/v1", ResourceVersion:"2222", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-75547f8f9b-pk8bk
W0814 13:15:50.887] I0814 13:15:50.886696   53127 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1565788537-25904", Name:"nginx-deployment", UID:"1e93733e-2aa2-4da7-a1c1-5fc809e032a4", APIVersion:"apps/v1", ResourceVersion:"2221", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-5f49cc9799 to 1
W0814 13:15:50.891] I0814 13:15:50.890618   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788537-25904", Name:"nginx-deployment-5f49cc9799", UID:"4ca8a2bf-ccbe-405e-bd13-21b4a954a9c7", APIVersion:"apps/v1", ResourceVersion:"2228", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-5f49cc9799-dbwc4
W0814 13:15:50.910] E0814 13:15:50.909512   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:51.009] E0814 13:15:51.008469   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0814 13:15:51.109] deployment.apps/nginx-deployment env updated
I0814 13:15:51.110] deployment.apps/nginx-deployment env updated
I0814 13:15:51.110] deployment.apps/nginx-deployment env updated
I0814 13:15:51.149] deployment.apps/nginx-deployment env updated
I0814 13:15:51.227] deployment.apps "nginx-deployment" deleted
I0814 13:15:51.316] configmap "test-set-env-config" deleted
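The env-update run above is driven by kubectl set env: the configmap and secret created at the top of the block are injected into the deployment, and each injection rewrites the pod template, producing the ScalingReplicaSet/SuccessfulCreate event pairs in the surrounding warnings. A minimal sketch of the same sequence (resource names are taken from the log; the literal keys and values are placeholders):

  kubectl create configmap test-set-env-config --from-literal=key1=config1
  kubectl create secret generic test-set-env-secret --from-literal=secret-key=value
  # Each --from import updates the pod template and triggers a new rollout
  kubectl set env deployment/nginx-deployment --from=configmap/test-set-env-config
  kubectl set env deployment/nginx-deployment --from=secret/test-set-env-secret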
... skipping 33 lines ...
I0814 13:15:53.221] Namespace:    namespace-1565788551-20201
I0814 13:15:53.221] Selector:     app=guestbook,tier=frontend
I0814 13:15:53.222] Labels:       app=guestbook
I0814 13:15:53.222]               tier=frontend
I0814 13:15:53.222] Annotations:  <none>
I0814 13:15:53.222] Replicas:     3 current / 3 desired
I0814 13:15:53.223] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0814 13:15:53.223] Pod Template:
I0814 13:15:53.223]   Labels:  app=guestbook
I0814 13:15:53.223]            tier=frontend
I0814 13:15:53.224]   Containers:
I0814 13:15:53.224]    php-redis:
I0814 13:15:53.224]     Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 17 lines ...
I0814 13:15:53.324] Namespace:    namespace-1565788551-20201
I0814 13:15:53.324] Selector:     app=guestbook,tier=frontend
I0814 13:15:53.325] Labels:       app=guestbook
I0814 13:15:53.325]               tier=frontend
I0814 13:15:53.325] Annotations:  <none>
I0814 13:15:53.325] Replicas:     3 current / 3 desired
I0814 13:15:53.325] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0814 13:15:53.325] Pod Template:
I0814 13:15:53.325]   Labels:  app=guestbook
I0814 13:15:53.325]            tier=frontend
I0814 13:15:53.325]   Containers:
I0814 13:15:53.325]    php-redis:
I0814 13:15:53.325]     Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 18 lines ...
I0814 13:15:53.422] Namespace:    namespace-1565788551-20201
I0814 13:15:53.422] Selector:     app=guestbook,tier=frontend
I0814 13:15:53.422] Labels:       app=guestbook
I0814 13:15:53.422]               tier=frontend
I0814 13:15:53.422] Annotations:  <none>
I0814 13:15:53.422] Replicas:     3 current / 3 desired
I0814 13:15:53.422] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0814 13:15:53.422] Pod Template:
I0814 13:15:53.422]   Labels:  app=guestbook
I0814 13:15:53.422]            tier=frontend
I0814 13:15:53.422]   Containers:
I0814 13:15:53.423]    php-redis:
I0814 13:15:53.423]     Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 12 lines ...
I0814 13:15:53.520] Namespace:    namespace-1565788551-20201
I0814 13:15:53.520] Selector:     app=guestbook,tier=frontend
I0814 13:15:53.520] Labels:       app=guestbook
I0814 13:15:53.520]               tier=frontend
I0814 13:15:53.520] Annotations:  <none>
I0814 13:15:53.521] Replicas:     3 current / 3 desired
I0814 13:15:53.521] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0814 13:15:53.521] Pod Template:
I0814 13:15:53.521]   Labels:  app=guestbook
I0814 13:15:53.521]            tier=frontend
I0814 13:15:53.521]   Containers:
I0814 13:15:53.521]    php-redis:
I0814 13:15:53.521]     Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 10 lines ...
I0814 13:15:53.522]   Type    Reason            Age   From                   Message
I0814 13:15:53.522]   ----    ------            ----  ----                   -------
I0814 13:15:53.522]   Normal  SuccessfulCreate  0s    replicaset-controller  Created pod: frontend-lxx6q
I0814 13:15:53.523]   Normal  SuccessfulCreate  0s    replicaset-controller  Created pod: frontend-m24wj
I0814 13:15:53.523]   Normal  SuccessfulCreate  0s    replicaset-controller  Created pod: frontend-27lg4
I0814 13:15:53.523]
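Each "Successful describe rs" block above is a separate invocation of describe against the same replica set, checked by the harness with different output filters. Reproducing one by hand looks like this (name and namespace from the log):

  kubectl describe rs frontend --namespace=namespace-1565788551-20201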
W0814 13:15:53.623] E0814 13:15:51.349795   53127 replica_set.go:450] Sync "namespace-1565788537-25904/nginx-deployment-5f49cc9799" failed with Operation cannot be fulfilled on replicasets.apps "nginx-deployment-5f49cc9799": StorageError: invalid object, Code: 4, Key: /registry/replicasets/namespace-1565788537-25904/nginx-deployment-5f49cc9799, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 4ca8a2bf-ccbe-405e-bd13-21b4a954a9c7, UID in object meta: 
W0814 13:15:53.624] E0814 13:15:51.449622   53127 replica_set.go:450] Sync "namespace-1565788537-25904/nginx-deployment-75547f8f9b" failed with replicasets.apps "nginx-deployment-75547f8f9b" not found
W0814 13:15:53.624] E0814 13:15:51.504309   53127 replica_set.go:450] Sync "namespace-1565788537-25904/nginx-deployment-6578548995" failed with replicasets.apps "nginx-deployment-6578548995" not found
W0814 13:15:53.624] E0814 13:15:51.549091   53127 replica_set.go:450] Sync "namespace-1565788537-25904/nginx-deployment-bb4c6849" failed with replicasets.apps "nginx-deployment-bb4c6849" not found
W0814 13:15:53.625] E0814 13:15:51.723386   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:53.625] E0814 13:15:51.815041   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:53.625] I0814 13:15:51.862136   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788551-20201", Name:"frontend", UID:"74f05d7b-df0c-42dc-8a73-97e771977be8", APIVersion:"apps/v1", ResourceVersion:"2263", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-pvvj4
W0814 13:15:53.625] I0814 13:15:51.866006   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788551-20201", Name:"frontend", UID:"74f05d7b-df0c-42dc-8a73-97e771977be8", APIVersion:"apps/v1", ResourceVersion:"2263", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-89mm9
W0814 13:15:53.626] I0814 13:15:51.866136   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788551-20201", Name:"frontend", UID:"74f05d7b-df0c-42dc-8a73-97e771977be8", APIVersion:"apps/v1", ResourceVersion:"2263", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-j8jb5
W0814 13:15:53.626] E0814 13:15:51.911418   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:53.626] E0814 13:15:51.998731   53127 replica_set.go:450] Sync "namespace-1565788551-20201/frontend" failed with replicasets.apps "frontend" not found
W0814 13:15:53.626] E0814 13:15:52.010067   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:53.626] I0814 13:15:52.260697   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788551-20201", Name:"frontend-no-cascade", UID:"24c22047-0ecf-45fc-92e6-5870aa4349cd", APIVersion:"apps/v1", ResourceVersion:"2279", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-no-cascade-9q5nm
W0814 13:15:53.627] I0814 13:15:52.263710   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788551-20201", Name:"frontend-no-cascade", UID:"24c22047-0ecf-45fc-92e6-5870aa4349cd", APIVersion:"apps/v1", ResourceVersion:"2279", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-no-cascade-mbznx
W0814 13:15:53.627] I0814 13:15:52.264395   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788551-20201", Name:"frontend-no-cascade", UID:"24c22047-0ecf-45fc-92e6-5870aa4349cd", APIVersion:"apps/v1", ResourceVersion:"2279", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-no-cascade-rcl4r
W0814 13:15:53.627] E0814 13:15:52.449083   53127 replica_set.go:450] Sync "namespace-1565788551-20201/frontend-no-cascade" failed with replicasets.apps "frontend-no-cascade" not found
W0814 13:15:53.628] E0814 13:15:52.724652   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:53.628] E0814 13:15:52.816253   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:53.628] E0814 13:15:52.912809   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:53.628] I0814 13:15:53.006356   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788551-20201", Name:"frontend", UID:"6978dd76-a514-490e-bdb3-47701b3a2bb5", APIVersion:"apps/v1", ResourceVersion:"2298", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-lxx6q
W0814 13:15:53.629] I0814 13:15:53.009399   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788551-20201", Name:"frontend", UID:"6978dd76-a514-490e-bdb3-47701b3a2bb5", APIVersion:"apps/v1", ResourceVersion:"2298", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-m24wj
W0814 13:15:53.629] I0814 13:15:53.010102   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788551-20201", Name:"frontend", UID:"6978dd76-a514-490e-bdb3-47701b3a2bb5", APIVersion:"apps/v1", ResourceVersion:"2298", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-27lg4
W0814 13:15:53.629] E0814 13:15:53.010935   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:53.726] E0814 13:15:53.725896   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:53.818] E0814 13:15:53.817625   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:53.914] E0814 13:15:53.914066   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:54.013] E0814 13:15:54.011958   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0814 13:15:54.113] Successful describe rs:
I0814 13:15:54.113] Name:         frontend
I0814 13:15:54.114] Namespace:    namespace-1565788551-20201
I0814 13:15:54.114] Selector:     app=guestbook,tier=frontend
I0814 13:15:54.114] Labels:       app=guestbook
I0814 13:15:54.114]               tier=frontend
I0814 13:15:54.114] Annotations:  <none>
I0814 13:15:54.114] Replicas:     3 current / 3 desired
I0814 13:15:54.114] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0814 13:15:54.114] Pod Template:
I0814 13:15:54.114]   Labels:  app=guestbook
I0814 13:15:54.115]            tier=frontend
I0814 13:15:54.115]   Containers:
I0814 13:15:54.115]    php-redis:
I0814 13:15:54.115]     Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 17 lines ...
I0814 13:15:54.117] Namespace:    namespace-1565788551-20201
I0814 13:15:54.117] Selector:     app=guestbook,tier=frontend
I0814 13:15:54.117] Labels:       app=guestbook
I0814 13:15:54.117]               tier=frontend
I0814 13:15:54.117] Annotations:  <none>
I0814 13:15:54.117] Replicas:     3 current / 3 desired
I0814 13:15:54.117] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0814 13:15:54.117] Pod Template:
I0814 13:15:54.117]   Labels:  app=guestbook
I0814 13:15:54.117]            tier=frontend
I0814 13:15:54.118]   Containers:
I0814 13:15:54.118]    php-redis:
I0814 13:15:54.118]     Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 17 lines ...
I0814 13:15:54.119] Namespace:    namespace-1565788551-20201
I0814 13:15:54.120] Selector:     app=guestbook,tier=frontend
I0814 13:15:54.120] Labels:       app=guestbook
I0814 13:15:54.120]               tier=frontend
I0814 13:15:54.120] Annotations:  <none>
I0814 13:15:54.120] Replicas:     3 current / 3 desired
I0814 13:15:54.120] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0814 13:15:54.120] Pod Template:
I0814 13:15:54.120]   Labels:  app=guestbook
I0814 13:15:54.120]            tier=frontend
I0814 13:15:54.120]   Containers:
I0814 13:15:54.120]    php-redis:
I0814 13:15:54.121]     Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 11 lines ...
I0814 13:15:54.122] Namespace:    namespace-1565788551-20201
I0814 13:15:54.122] Selector:     app=guestbook,tier=frontend
I0814 13:15:54.122] Labels:       app=guestbook
I0814 13:15:54.122]               tier=frontend
I0814 13:15:54.122] Annotations:  <none>
I0814 13:15:54.122] Replicas:     3 current / 3 desired
I0814 13:15:54.122] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0814 13:15:54.122] Pod Template:
I0814 13:15:54.122]   Labels:  app=guestbook
I0814 13:15:54.122]            tier=frontend
I0814 13:15:54.123]   Containers:
I0814 13:15:54.123]    php-redis:
I0814 13:15:54.123]     Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 104 lines ...
I0814 13:15:54.497] deployment.apps/scale-1 created
W0814 13:15:54.598] I0814 13:15:54.262667   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788551-20201", Name:"frontend", UID:"6978dd76-a514-490e-bdb3-47701b3a2bb5", APIVersion:"apps/v1", ResourceVersion:"2308", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: frontend-lxx6q
W0814 13:15:54.599] I0814 13:15:54.502031   53127 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1565788551-20201", Name:"scale-1", UID:"46ecb6b2-87ad-40c5-989f-8053ade8bde0", APIVersion:"apps/v1", ResourceVersion:"2314", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set scale-1-564ccc68b9 to 1
W0814 13:15:54.599] I0814 13:15:54.505676   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788551-20201", Name:"scale-1-564ccc68b9", UID:"15cea090-d0ea-4df2-8353-944e1bc5b219", APIVersion:"apps/v1", ResourceVersion:"2315", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-1-564ccc68b9-4k2df
W0814 13:15:54.659] I0814 13:15:54.658977   53127 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1565788551-20201", Name:"scale-2", UID:"61e173da-5e93-4ff4-a13f-69748350ea86", APIVersion:"apps/v1", ResourceVersion:"2324", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set scale-2-564ccc68b9 to 1
W0814 13:15:54.662] I0814 13:15:54.661954   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788551-20201", Name:"scale-2-564ccc68b9", UID:"08df4138-971a-4ba9-8079-e0727590d723", APIVersion:"apps/v1", ResourceVersion:"2325", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-2-564ccc68b9-hq8kp
W0814 13:15:54.728] E0814 13:15:54.727467   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:54.819] E0814 13:15:54.818874   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:54.825] I0814 13:15:54.825087   53127 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1565788551-20201", Name:"scale-3", UID:"75e017b3-1223-447b-9bde-8b3a6e754e94", APIVersion:"apps/v1", ResourceVersion:"2334", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set scale-3-564ccc68b9 to 1
W0814 13:15:54.828] I0814 13:15:54.828110   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788551-20201", Name:"scale-3-564ccc68b9", UID:"5da20105-1f22-4445-a7bd-3878648fff2f", APIVersion:"apps/v1", ResourceVersion:"2335", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-3-564ccc68b9-66ckz
W0814 13:15:54.915] E0814 13:15:54.915182   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:55.013] E0814 13:15:55.013105   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0814 13:15:55.114] deployment.apps/scale-2 created
I0814 13:15:55.114] deployment.apps/scale-3 created
I0814 13:15:55.114] apps.sh:576: Successful get deploy scale-1 {{.spec.replicas}}: 1
I0814 13:15:55.115] apps.sh:577: Successful get deploy scale-2 {{.spec.replicas}}: 1
I0814 13:15:55.115] apps.sh:578: Successful get deploy scale-3 {{.spec.replicas}}: 1
I0814 13:15:55.165] deployment.apps/scale-1 scaled
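The scale-1/scale-2/scale-3 assertions read .spec.replicas straight off each deployment, then the deployments are scaled up, which the controller acknowledges with the "Scaled up replica set ... to 3" events below. A sketch of both halves (the go-template syntax mirrors the harness checks):

  # Assert the starting replica count
  kubectl get deploy scale-1 -o go-template='{{.spec.replicas}}'
  # Scale several deployments in one call
  kubectl scale deploy scale-1 scale-2 scale-3 --replicas=3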
... skipping 19 lines ...
W0814 13:15:56.047] I0814 13:15:55.530474   53127 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1565788551-20201", Name:"scale-2", UID:"61e173da-5e93-4ff4-a13f-69748350ea86", APIVersion:"apps/v1", ResourceVersion:"2365", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set scale-2-564ccc68b9 to 3
W0814 13:15:56.047] I0814 13:15:55.531459   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788551-20201", Name:"scale-1-564ccc68b9", UID:"15cea090-d0ea-4df2-8353-944e1bc5b219", APIVersion:"apps/v1", ResourceVersion:"2366", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-1-564ccc68b9-ql56f
W0814 13:15:56.048] I0814 13:15:55.533483   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788551-20201", Name:"scale-2-564ccc68b9", UID:"08df4138-971a-4ba9-8079-e0727590d723", APIVersion:"apps/v1", ResourceVersion:"2368", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-2-564ccc68b9-fz6sq
W0814 13:15:56.048] I0814 13:15:55.539598   53127 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1565788551-20201", Name:"scale-3", UID:"75e017b3-1223-447b-9bde-8b3a6e754e94", APIVersion:"apps/v1", ResourceVersion:"2371", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set scale-3-564ccc68b9 to 3
W0814 13:15:56.049] I0814 13:15:55.546558   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788551-20201", Name:"scale-3-564ccc68b9", UID:"5da20105-1f22-4445-a7bd-3878648fff2f", APIVersion:"apps/v1", ResourceVersion:"2379", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-3-564ccc68b9-lks8j
W0814 13:15:56.049] I0814 13:15:55.550939   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788551-20201", Name:"scale-3-564ccc68b9", UID:"5da20105-1f22-4445-a7bd-3878648fff2f", APIVersion:"apps/v1", ResourceVersion:"2379", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-3-564ccc68b9-pp9hm
W0814 13:15:56.050] E0814 13:15:55.729165   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:56.050] E0814 13:15:55.820172   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:56.051] E0814 13:15:55.916543   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:56.051] I0814 13:15:55.965445   53127 horizontal.go:341] Horizontal Pod Autoscaler nginx-deployment has been deleted in namespace-1565788537-25904
W0814 13:15:56.052] E0814 13:15:56.014125   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:56.117] I0814 13:15:56.116310   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788551-20201", Name:"frontend", UID:"7a279d7b-8d18-436c-9d0d-c826e7da3fbd", APIVersion:"apps/v1", ResourceVersion:"2426", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-wvw9k
W0814 13:15:56.119] I0814 13:15:56.119036   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788551-20201", Name:"frontend", UID:"7a279d7b-8d18-436c-9d0d-c826e7da3fbd", APIVersion:"apps/v1", ResourceVersion:"2426", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-t9t5q
W0814 13:15:56.121] I0814 13:15:56.121092   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788551-20201", Name:"frontend", UID:"7a279d7b-8d18-436c-9d0d-c826e7da3fbd", APIVersion:"apps/v1", ResourceVersion:"2426", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-r6hmm
I0814 13:15:56.221] replicaset.apps/frontend created
I0814 13:15:56.222] apps.sh:596: Successful get rs frontend {{.spec.replicas}}: 3
I0814 13:15:56.286] service/frontend exposed
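"service/frontend exposed" is kubectl expose deriving a Service from the replica set's selector; a sketch (the port here is an assumption, it is not shown in the log):

  kubectl expose rs frontend --port=80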
... skipping 11 lines ...
I0814 13:15:57.229] apps.sh:616: Successful get rs frontend {{.metadata.generation}}: 4
I0814 13:15:57.311] apps.sh:620: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: frontend:
I0814 13:15:57.383] replicaset.apps "frontend" deleted
I0814 13:15:57.473] apps.sh:624: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
I0814 13:15:57.554] apps.sh:628: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
I0814 13:15:57.697] replicaset.apps/frontend created
W0814 13:15:57.798] E0814 13:15:56.730607   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:57.798] E0814 13:15:56.821218   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:57.799] E0814 13:15:56.917874   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:57.799] E0814 13:15:57.015659   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:57.799] I0814 13:15:57.701693   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788551-20201", Name:"frontend", UID:"60e4a692-8a53-4d8b-bbbc-406ea9559764", APIVersion:"apps/v1", ResourceVersion:"2460", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-92rr7
W0814 13:15:57.799] I0814 13:15:57.704239   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788551-20201", Name:"frontend", UID:"60e4a692-8a53-4d8b-bbbc-406ea9559764", APIVersion:"apps/v1", ResourceVersion:"2460", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-tgqnh
W0814 13:15:57.800] I0814 13:15:57.706047   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788551-20201", Name:"frontend", UID:"60e4a692-8a53-4d8b-bbbc-406ea9559764", APIVersion:"apps/v1", ResourceVersion:"2460", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-rdgjt
W0814 13:15:57.800] E0814 13:15:57.731622   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:57.823] E0814 13:15:57.822496   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:57.850] I0814 13:15:57.849751   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788551-20201", Name:"redis-slave", UID:"72ac5854-9a0f-44a9-8dd3-7fac5d266eef", APIVersion:"apps/v1", ResourceVersion:"2469", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-slave-22667
W0814 13:15:57.853] I0814 13:15:57.853222   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788551-20201", Name:"redis-slave", UID:"72ac5854-9a0f-44a9-8dd3-7fac5d266eef", APIVersion:"apps/v1", ResourceVersion:"2469", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-slave-4klzz
W0814 13:15:57.919] E0814 13:15:57.919079   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:58.017] E0814 13:15:58.017121   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0814 13:15:58.118] replicaset.apps/redis-slave created
I0814 13:15:58.118] apps.sh:633: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: frontend:redis-slave:
I0814 13:15:58.118] apps.sh:637: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: frontend:redis-slave:
I0814 13:15:58.118] replicaset.apps "frontend" deleted
I0814 13:15:58.119] replicaset.apps "redis-slave" deleted
I0814 13:15:58.184] apps.sh:641: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
... skipping 6 lines ...
I0814 13:15:58.803] horizontalpodautoscaler.autoscaling/frontend autoscaled
I0814 13:15:58.890] apps.sh:656: Successful get hpa frontend {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 2 3 80
I0814 13:15:58.962] horizontalpodautoscaler.autoscaling "frontend" deleted
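The hpa assertion at apps.sh:656 reads back minReplicas, maxReplicas and targetCPUUtilizationPercentage as "2 3 80", so the create step was equivalent to the following (a sketch; flag spellings per kubectl autoscale):

  kubectl autoscale rs frontend --min=2 --max=3 --cpu-percent=80
  kubectl get hpa frontend -o go-template='{{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}'
  kubectl delete hpa frontend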
W0814 13:15:59.063] I0814 13:15:58.410252   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788551-20201", Name:"frontend", UID:"fa6a8576-d82f-4e9e-b621-212438560636", APIVersion:"apps/v1", ResourceVersion:"2489", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-g2cpd
W0814 13:15:59.064] I0814 13:15:58.413167   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788551-20201", Name:"frontend", UID:"fa6a8576-d82f-4e9e-b621-212438560636", APIVersion:"apps/v1", ResourceVersion:"2489", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-dqntk
W0814 13:15:59.064] I0814 13:15:58.413798   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788551-20201", Name:"frontend", UID:"fa6a8576-d82f-4e9e-b621-212438560636", APIVersion:"apps/v1", ResourceVersion:"2489", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-vkq5w
W0814 13:15:59.064] E0814 13:15:58.732695   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:59.064] E0814 13:15:58.823613   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:59.064] E0814 13:15:58.920264   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:59.065] E0814 13:15:59.018359   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:15:59.065] Error: required flag(s) "max" not set
W0814 13:15:59.065] 
W0814 13:15:59.065] 
W0814 13:15:59.065] Examples:
W0814 13:15:59.065]   # Auto scale a deployment "foo", with the number of pods between 2 and 10, no target CPU utilization specified so a default autoscaling policy will be used:
W0814 13:15:59.065]   kubectl autoscale deployment foo --min=2 --max=10
W0814 13:15:59.065]   
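The Error above is expected: the harness invokes autoscale without the required --max flag to assert that kubectl rejects the call client-side, printing the usage and examples shown here instead of touching the API. The failing shape is simply:

  kubectl autoscale rs frontend --min=2
  # Error: required flag(s) "max" not set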
... skipping 87 lines ...
I0814 13:16:01.970] apps.sh:436: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/pause:2.0:
I0814 13:16:02.058] apps.sh:437: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 2
I0814 13:16:02.154] statefulset.apps/nginx rolled back
I0814 13:16:02.240] apps.sh:440: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.7:
I0814 13:16:02.329] apps.sh:441: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
I0814 13:16:02.427] Successful
I0814 13:16:02.427] message:error: unable to find specified revision 1000000 in history
I0814 13:16:02.427] has:unable to find specified revision
I0814 13:16:02.512] apps.sh:445: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.7:
I0814 13:16:02.596] apps.sh:446: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
I0814 13:16:02.689] statefulset.apps/nginx rolled back
I0814 13:16:02.777] apps.sh:449: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.8:
I0814 13:16:02.864] apps.sh:450: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/pause:2.0:
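The statefulset block alternates rollout undo with template assertions: the first undo swaps the pod template back to a single nginx-slim:0.7 container, undoing to a nonexistent revision is rejected with the "unable to find specified revision" error, and the final undo restores the two-container nginx-slim:0.8 plus pause:2.0 template. The three calls look like this (statefulset name from the log):

  kubectl rollout undo statefulset/nginx
  kubectl rollout undo statefulset/nginx --to-revision=1000000   # fails: revision not in history
  kubectl rollout undo statefulset/nginx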
... skipping 8 lines ...
I0814 13:16:03.182] +++ command: run_lists_tests
I0814 13:16:03.196] +++ [0814 13:16:03] Creating namespace namespace-1565788563-4148
I0814 13:16:03.265] namespace/namespace-1565788563-4148 created
I0814 13:16:03.331] Context "test" modified.
I0814 13:16:03.338] +++ [0814 13:16:03] Testing kubectl(v1:lists)
W0814 13:16:03.438] I0814 13:15:59.580866   49644 controller.go:606] quota admission added evaluator for: statefulsets.apps
W0814 13:16:03.439] E0814 13:15:59.734075   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:03.439] E0814 13:15:59.824995   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:03.440] I0814 13:15:59.849723   53127 event.go:255] Event(v1.ObjectReference{Kind:"StatefulSet", Namespace:"namespace-1565788559-25952", Name:"nginx", UID:"c028b27b-f49c-420c-bb9d-6f0bd3579769", APIVersion:"apps/v1", ResourceVersion:"2514", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' create Pod nginx-0 in StatefulSet nginx successful
W0814 13:16:03.440] E0814 13:15:59.921258   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:03.441] E0814 13:16:00.019412   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:03.441] I0814 13:16:00.325185   53127 stateful_set.go:420] StatefulSet has been deleted namespace-1565788559-25952/nginx
W0814 13:16:03.441] E0814 13:16:00.735456   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:03.442] E0814 13:16:00.826761   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:03.442] E0814 13:16:00.922641   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:03.442] E0814 13:16:01.020614   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:03.443] E0814 13:16:01.736951   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:03.443] E0814 13:16:01.828098   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:03.443] E0814 13:16:01.924232   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:03.444] E0814 13:16:02.021979   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:03.444] E0814 13:16:02.738104   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:03.445] E0814 13:16:02.829326   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:03.445] E0814 13:16:02.925928   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:03.445] I0814 13:16:03.018657   53127 stateful_set.go:420] StatefulSet has been deleted namespace-1565788560-12639/nginx
W0814 13:16:03.446] E0814 13:16:03.024117   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:03.506] I0814 13:16:03.505861   53127 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1565788563-4148", Name:"list-deployment-test", UID:"399f8995-6271-4d56-974f-e3693f9faec9", APIVersion:"apps/v1", ResourceVersion:"2550", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set list-deployment-test-7c949b66b4 to 1
W0814 13:16:03.513] I0814 13:16:03.512597   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1565788563-4148", Name:"list-deployment-test-7c949b66b4", UID:"6e8ee2d1-5909-4984-85b7-a78452e5bf6f", APIVersion:"apps/v1", ResourceVersion:"2551", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: list-deployment-test-7c949b66b4-7fbbj
I0814 13:16:03.613] service/list-service-test created
I0814 13:16:03.614] deployment.apps/list-deployment-test created
I0814 13:16:03.614] service "list-service-test" deleted
I0814 13:16:03.614] deployment.apps "list-deployment-test" deleted
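The kubectl(v1:lists) test drives a single v1 List manifest that bundles both objects, so one create and one delete round-trip the Service and the Deployment together, as logged above. A minimal sketch of such a manifest (only the two names appear in the log; every spec field here is an assumption):

  cat <<'EOF' | kubectl create -f -
  apiVersion: v1
  kind: List
  items:
  - apiVersion: v1
    kind: Service
    metadata:
      name: list-service-test
    spec:
      ports:
      - port: 80
  - apiVersion: apps/v1
    kind: Deployment
    metadata:
      name: list-deployment-test
    spec:
      replicas: 1
      selector:
        matchLabels: {app: list-deployment-test}
      template:
        metadata:
          labels: {app: list-deployment-test}
        spec:
          containers:
          - name: pause
            image: k8s.gcr.io/pause:2.0
  EOF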
... skipping 17 lines ...
I0814 13:16:04.298] generic-resources.sh:80: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: mock:
I0814 13:16:04.369] NAME           TYPE        CLUSTER-IP   EXTERNAL-IP   PORT(S)   AGE
I0814 13:16:04.369] service/mock   ClusterIP   10.0.0.55    <none>        99/TCP    0s
I0814 13:16:04.369] 
I0814 13:16:04.369] NAME                         DESIRED   CURRENT   READY   AGE
I0814 13:16:04.370] replicationcontroller/mock   1         1         0       0s
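Here the generic-resources test creates a Service and a ReplicationController that share the name "mock" from one manifest; the two NAME tables above come from listing both kinds in a single call, roughly:

  # One get across two resource types, matched by the shared name
  kubectl get services,replicationcontrollers mock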
W0814 13:16:04.471] E0814 13:16:03.739468   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:04.471] E0814 13:16:03.830557   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:04.472] E0814 13:16:03.927301   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:04.472] E0814 13:16:04.025367   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:04.473] I0814 13:16:04.133566   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1565788563-22309", Name:"mock", UID:"58c9a0e5-89db-4f3c-90cb-6d4fa36aa552", APIVersion:"v1", ResourceVersion:"2573", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-wqmwd
I0814 13:16:04.573] Name:              mock
I0814 13:16:04.574] Namespace:         namespace-1565788563-22309
I0814 13:16:04.574] Labels:            app=mock
I0814 13:16:04.574] Annotations:       <none>
I0814 13:16:04.574] Selector:          app=mock
... skipping 9 lines ...
I0814 13:16:04.575] Name:         mock
I0814 13:16:04.575] Namespace:    namespace-1565788563-22309
I0814 13:16:04.575] Selector:     app=mock
I0814 13:16:04.575] Labels:       app=mock
I0814 13:16:04.575] Annotations:  <none>
I0814 13:16:04.575] Replicas:     1 current / 1 desired
I0814 13:16:04.576] Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
I0814 13:16:04.576] Pod Template:
I0814 13:16:04.576]   Labels:  app=mock
I0814 13:16:04.576]   Containers:
I0814 13:16:04.576]    mock-container:
I0814 13:16:04.576]     Image:        k8s.gcr.io/pause:2.0
I0814 13:16:04.577]     Port:         9949/TCP
... skipping 35 lines ...
I0814 13:16:06.346] NAME           TYPE        CLUSTER-IP   EXTERNAL-IP   PORT(S)   AGE
I0814 13:16:06.346] service/mock   ClusterIP   10.0.0.2     <none>        99/TCP    0s
I0814 13:16:06.346] 
I0814 13:16:06.347] NAME                         DESIRED   CURRENT   READY   AGE
I0814 13:16:06.347] replicationcontroller/mock   1         1         0       0s
W0814 13:16:06.447] I0814 13:16:04.707904   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1565788563-22309", Name:"mock", UID:"28b7a34c-6285-46ae-9e93-abe09ada9fe8", APIVersion:"v1", ResourceVersion:"2587", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-5l25m
W0814 13:16:06.448] E0814 13:16:04.740858   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:06.448] E0814 13:16:04.831909   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:06.448] E0814 13:16:04.928508   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:06.448] E0814 13:16:05.026668   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:06.449] E0814 13:16:05.742274   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:06.449] E0814 13:16:05.833176   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:06.449] E0814 13:16:05.930209   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:06.449] E0814 13:16:06.027715   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:06.449] I0814 13:16:06.107601   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1565788563-22309", Name:"mock", UID:"1b53e1bf-68d5-4d2e-891a-164db4f0a2be", APIVersion:"v1", ResourceVersion:"2612", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-5gg8z
I0814 13:16:06.550] Name:              mock
I0814 13:16:06.550] Namespace:         namespace-1565788563-22309
I0814 13:16:06.551] Labels:            app=mock
I0814 13:16:06.551] Annotations:       <none>
I0814 13:16:06.551] Selector:          app=mock
... skipping 9 lines ...
I0814 13:16:06.554] Name:         mock
I0814 13:16:06.554] Namespace:    namespace-1565788563-22309
I0814 13:16:06.555] Selector:     app=mock
I0814 13:16:06.555] Labels:       app=mock
I0814 13:16:06.555] Annotations:  <none>
I0814 13:16:06.555] Replicas:     1 current / 1 desired
I0814 13:16:06.556] Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
I0814 13:16:06.556] Pod Template:
I0814 13:16:06.556]   Labels:  app=mock
I0814 13:16:06.556]   Containers:
I0814 13:16:06.556]    mock-container:
I0814 13:16:06.557]     Image:        k8s.gcr.io/pause:2.0
I0814 13:16:06.557]     Port:         9949/TCP
... skipping 35 lines ...
I0814 13:16:08.336] NAME           TYPE        CLUSTER-IP   EXTERNAL-IP   PORT(S)   AGE
I0814 13:16:08.337] service/mock   ClusterIP   10.0.0.81    <none>        99/TCP    0s
I0814 13:16:08.337] 
I0814 13:16:08.337] NAME                         DESIRED   CURRENT   READY   AGE
I0814 13:16:08.337] replicationcontroller/mock   1         1         0       0s
W0814 13:16:08.438] I0814 13:16:06.688431   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1565788563-22309", Name:"mock", UID:"98b2932d-f610-4391-933a-3748761ecc08", APIVersion:"v1", ResourceVersion:"2626", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-689zc
W0814 13:16:08.438] E0814 13:16:06.743521   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:08.438] E0814 13:16:06.834180   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:08.439] E0814 13:16:06.931743   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:08.439] E0814 13:16:07.028786   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:08.439] E0814 13:16:07.745265   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:08.439] E0814 13:16:07.835363   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:08.439] E0814 13:16:07.932881   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:08.440] E0814 13:16:08.030093   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:08.440] I0814 13:16:08.105108   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1565788563-22309", Name:"mock", UID:"60711e33-0445-49ab-b4be-e48804368c46", APIVersion:"v1", ResourceVersion:"2649", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-fxspb
I0814 13:16:08.540] Name:              mock
I0814 13:16:08.541] Namespace:         namespace-1565788563-22309
I0814 13:16:08.541] Labels:            app=mock
I0814 13:16:08.541] Annotations:       <none>
I0814 13:16:08.541] Selector:          app=mock
... skipping 9 lines ...
I0814 13:16:08.542] Name:         mock
I0814 13:16:08.542] Namespace:    namespace-1565788563-22309
I0814 13:16:08.542] Selector:     app=mock
I0814 13:16:08.542] Labels:       app=mock
I0814 13:16:08.542] Annotations:  <none>
I0814 13:16:08.542] Replicas:     1 current / 1 desired
I0814 13:16:08.542] Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
I0814 13:16:08.542] Pod Template:
I0814 13:16:08.542]   Labels:  app=mock
I0814 13:16:08.543]   Containers:
I0814 13:16:08.543]    mock-container:
I0814 13:16:08.543]     Image:        k8s.gcr.io/pause:2.0
I0814 13:16:08.543]     Port:         9949/TCP
... skipping 32 lines ...
I0814 13:16:10.080] replicationcontroller/mock2 created
I0814 13:16:10.179] generic-resources.sh:78: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: mock:mock2:
I0814 13:16:10.252] NAME    DESIRED   CURRENT   READY   AGE
I0814 13:16:10.253] mock    1         1         0       0s
I0814 13:16:10.253] mock2   1         1         0       0s
W0814 13:16:10.354] I0814 13:16:08.667025   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1565788563-22309", Name:"mock", UID:"b0520145-ba24-4c33-b8ab-5a368b92cc4e", APIVersion:"v1", ResourceVersion:"2663", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-mbgrp
W0814 13:16:10.355] E0814 13:16:08.746429   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:10.355] E0814 13:16:08.836545   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:10.355] E0814 13:16:08.934013   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:10.356] E0814 13:16:09.030987   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:10.356] E0814 13:16:09.747537   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:10.356] E0814 13:16:09.837862   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:10.356] E0814 13:16:09.934901   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:10.357] E0814 13:16:10.032591   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:10.357] I0814 13:16:10.080644   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1565788563-22309", Name:"mock", UID:"9358175e-7192-40b3-a66e-52e1a5918775", APIVersion:"v1", ResourceVersion:"2683", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-s6pnp
W0814 13:16:10.357] I0814 13:16:10.085313   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1565788563-22309", Name:"mock2", UID:"774f5d46-b1c7-4a08-8aeb-ed32a7950467", APIVersion:"v1", ResourceVersion:"2685", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock2-bldp4
I0814 13:16:10.458] Name:         mock
I0814 13:16:10.458] Namespace:    namespace-1565788563-22309
I0814 13:16:10.458] Selector:     app=mock
I0814 13:16:10.458] Labels:       app=mock
I0814 13:16:10.458]               status=replaced
I0814 13:16:10.458] Annotations:  <none>
I0814 13:16:10.458] Replicas:     1 current / 1 desired
I0814 13:16:10.459] Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
I0814 13:16:10.459] Pod Template:
I0814 13:16:10.459]   Labels:  app=mock
I0814 13:16:10.459]   Containers:
I0814 13:16:10.459]    mock-container:
I0814 13:16:10.459]     Image:        k8s.gcr.io/pause:2.0
I0814 13:16:10.459]     Port:         9949/TCP
... skipping 11 lines ...
I0814 13:16:10.460] Namespace:    namespace-1565788563-22309
I0814 13:16:10.460] Selector:     app=mock2
I0814 13:16:10.460] Labels:       app=mock2
I0814 13:16:10.460]               status=replaced
I0814 13:16:10.460] Annotations:  <none>
I0814 13:16:10.461] Replicas:     1 current / 1 desired
I0814 13:16:10.461] Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
I0814 13:16:10.461] Pod Template:
I0814 13:16:10.461]   Labels:  app=mock2
I0814 13:16:10.461]   Containers:
I0814 13:16:10.461]    mock-container:
I0814 13:16:10.461]     Image:        k8s.gcr.io/pause:2.0
I0814 13:16:10.461]     Port:         9949/TCP
... skipping 33 lines ...
I0814 13:16:12.111] generic-resources.sh:70: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: mock:mock2:
I0814 13:16:12.184] NAME    TYPE        CLUSTER-IP   EXTERNAL-IP   PORT(S)   AGE
I0814 13:16:12.185] mock    ClusterIP   10.0.0.181   <none>        99/TCP    0s
I0814 13:16:12.185] mock2   ClusterIP   10.0.0.36    <none>        99/TCP    0s
W0814 13:16:12.286] I0814 13:16:10.579593   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1565788563-22309", Name:"mock", UID:"05f263a0-2d06-4a97-a40f-cbf05f701203", APIVersion:"v1", ResourceVersion:"2699", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-76zzp
W0814 13:16:12.286] I0814 13:16:10.582869   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1565788563-22309", Name:"mock2", UID:"63d1a055-897f-408a-83d2-1caadaf96cc1", APIVersion:"v1", ResourceVersion:"2700", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock2-nnpkk
W0814 13:16:12.286] E0814 13:16:10.748899   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:12.287] E0814 13:16:10.839015   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:12.287] E0814 13:16:10.936713   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:12.287] E0814 13:16:11.033952   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:12.287] E0814 13:16:11.750119   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:12.287] E0814 13:16:11.840527   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:12.288] E0814 13:16:11.938000   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:12.288] E0814 13:16:12.035390   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
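The repeated `Failed to list *v1.PartialObjectMetadata` errors come from metadata-only informers: reflectors that list and watch just object metadata, whose list calls the server is refusing in this run. A sketch of how such an informer is wired up with client-go's metadatainformer package (the config wiring and the deployments resource are illustrative, not what this job runs):

```go
package example

import (
	"time"

	"k8s.io/apimachinery/pkg/runtime/schema"
	"k8s.io/client-go/metadata"
	"k8s.io/client-go/metadata/metadatainformer"
	"k8s.io/client-go/rest"
)

// newMetadataInformer builds an informer that lists/watches only
// ObjectMeta (*v1.PartialObjectMetadata), the type the reflector
// above is failing to list.
func newMetadataInformer(cfg *rest.Config) (metadatainformer.SharedInformerFactory, error) {
	client, err := metadata.NewForConfig(cfg)
	if err != nil {
		return nil, err
	}
	factory := metadatainformer.NewSharedInformerFactory(client, 10*time.Minute)
	// Illustrative resource; controllers like the garbage collector build
	// one of these per discovered resource and then call factory.Start().
	gvr := schema.GroupVersionResource{Group: "apps", Version: "v1", Resource: "deployments"}
	_ = factory.ForResource(gvr).Informer()
	return factory, nil
}
```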
I0814 13:16:12.388] Name:              mock
I0814 13:16:12.389] Namespace:         namespace-1565788563-22309
I0814 13:16:12.389] Labels:            app=mock
I0814 13:16:12.389] Annotations:       <none>
I0814 13:16:12.389] Selector:          app=mock
I0814 13:16:12.390] Type:              ClusterIP
... skipping 59 lines ...
I0814 13:16:15.023] Context "test" modified.
I0814 13:16:15.030] +++ [0814 13:16:15] Testing persistent volumes
I0814 13:16:15.115] storage.sh:30: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: 
I0814 13:16:15.282] persistentvolume/pv0001 created
I0814 13:16:15.372] storage.sh:33: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: pv0001:
I0814 13:16:15.444] persistentvolume "pv0001" deleted
W0814 13:16:15.544] E0814 13:16:12.751570   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:15.545] E0814 13:16:12.841937   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:15.545] E0814 13:16:12.940828   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:15.545] E0814 13:16:13.036731   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:15.546] I0814 13:16:13.569929   53127 horizontal.go:341] Horizontal Pod Autoscaler frontend has been deleted in namespace-1565788551-20201
W0814 13:16:15.546] E0814 13:16:13.752782   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:15.546] E0814 13:16:13.843405   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:15.546] E0814 13:16:13.942348   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:15.546] E0814 13:16:14.038326   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:15.547] I0814 13:16:14.172313   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1565788563-22309", Name:"mock", UID:"81d131fa-0986-45f1-a042-c34f44668579", APIVersion:"v1", ResourceVersion:"2760", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-54wvj
W0814 13:16:15.547] E0814 13:16:14.754149   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:15.547] E0814 13:16:14.844860   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:15.547] E0814 13:16:14.943775   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:15.547] E0814 13:16:15.039575   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:15.548] E0814 13:16:15.289359   53127 pv_protection_controller.go:117] PV pv0001 failed with : Operation cannot be fulfilled on persistentvolumes "pv0001": the object has been modified; please apply your changes to the latest version and try again
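The pv_protection_controller failure just above is the stock optimistic-concurrency conflict: the controller's Update carried a stale resourceVersion because something else modified pv0001 in between. Controllers normally absorb this with client-go's retry helper; a minimal sketch (recent client-go signatures; the finalizer mutation is illustrative):

```go
package example

import (
	"context"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/util/retry"
)

// updatePV re-fetches the object and re-applies the mutation on every
// conflict, so the Update always carries the latest resourceVersion.
func updatePV(ctx context.Context, cs kubernetes.Interface, name string) error {
	return retry.RetryOnConflict(retry.DefaultRetry, func() error {
		pv, err := cs.CoreV1().PersistentVolumes().Get(ctx, name, metav1.GetOptions{})
		if err != nil {
			return err
		}
		// Illustrative change; any mutation goes inside the retried closure.
		pv.Finalizers = append(pv.Finalizers, "kubernetes.io/pv-protection")
		_, err = cs.CoreV1().PersistentVolumes().Update(ctx, pv, metav1.UpdateOptions{})
		return err
	})
}
```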
I0814 13:16:15.648] persistentvolume/pv0002 created
I0814 13:16:15.682] storage.sh:36: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: pv0002:
I0814 13:16:15.756] persistentvolume "pv0002" deleted
W0814 13:16:15.857] E0814 13:16:15.756024   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:15.857] E0814 13:16:15.845956   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:15.945] E0814 13:16:15.945104   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:16.041] E0814 13:16:16.041071   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0814 13:16:16.142] persistentvolume/pv0003 created
I0814 13:16:16.143] storage.sh:39: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: pv0003:
I0814 13:16:16.143] persistentvolume "pv0003" deleted
I0814 13:16:16.158] storage.sh:42: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: 
I0814 13:16:16.301] persistentvolume/pv0001 created
I0814 13:16:16.397] storage.sh:45: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: pv0001:
... skipping 18 lines ...
I0814 13:16:16.796] Context "test" modified.
I0814 13:16:16.803] +++ [0814 13:16:16] Testing persistent volumes claims
I0814 13:16:16.887] storage.sh:64: Successful get pvc {{range.items}}{{.metadata.name}}:{{end}}: 
I0814 13:16:17.054] persistentvolumeclaim/myclaim-1 created
I0814 13:16:17.151] storage.sh:67: Successful get pvc {{range.items}}{{.metadata.name}}:{{end}}: myclaim-1:
I0814 13:16:17.221] persistentvolumeclaim "myclaim-1" deleted
W0814 13:16:17.322] E0814 13:16:16.757428   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:17.323] E0814 13:16:16.847501   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:17.323] E0814 13:16:16.946444   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:17.323] E0814 13:16:17.053189   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:17.324] I0814 13:16:17.054386   53127 event.go:255] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1565788576-7825", Name:"myclaim-1", UID:"66d4367e-6330-4f2e-81fd-83f1a21e93f7", APIVersion:"v1", ResourceVersion:"2797", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
W0814 13:16:17.324] I0814 13:16:17.071931   53127 event.go:255] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1565788576-7825", Name:"myclaim-1", UID:"66d4367e-6330-4f2e-81fd-83f1a21e93f7", APIVersion:"v1", ResourceVersion:"2799", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
W0814 13:16:17.324] I0814 13:16:17.221128   53127 event.go:255] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1565788576-7825", Name:"myclaim-1", UID:"66d4367e-6330-4f2e-81fd-83f1a21e93f7", APIVersion:"v1", ResourceVersion:"2801", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
W0814 13:16:17.381] I0814 13:16:17.380859   53127 event.go:255] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1565788576-7825", Name:"myclaim-2", UID:"73732091-ba7e-44ca-9f6e-3a3a8293cca1", APIVersion:"v1", ResourceVersion:"2804", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
W0814 13:16:17.385] I0814 13:16:17.384454   53127 event.go:255] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1565788576-7825", Name:"myclaim-2", UID:"73732091-ba7e-44ca-9f6e-3a3a8293cca1", APIVersion:"v1", ResourceVersion:"2806", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
I0814 13:16:17.485] persistentvolumeclaim/myclaim-2 created
I0814 13:16:17.485] storage.sh:71: Successful get pvc {{range.items}}{{.metadata.name}}:{{end}}: myclaim-2:
I0814 13:16:17.554] persistentvolumeclaim "myclaim-2" deleted
W0814 13:16:17.655] I0814 13:16:17.554250   53127 event.go:255] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1565788576-7825", Name:"myclaim-2", UID:"73732091-ba7e-44ca-9f6e-3a3a8293cca1", APIVersion:"v1", ResourceVersion:"2808", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
W0814 13:16:17.709] I0814 13:16:17.708565   53127 event.go:255] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1565788576-7825", Name:"myclaim-3", UID:"2ff503bc-1c6f-4003-9939-28e0aca4bea2", APIVersion:"v1", ResourceVersion:"2811", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
W0814 13:16:17.712] I0814 13:16:17.711323   53127 event.go:255] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1565788576-7825", Name:"myclaim-3", UID:"2ff503bc-1c6f-4003-9939-28e0aca4bea2", APIVersion:"v1", ResourceVersion:"2812", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
W0814 13:16:17.759] E0814 13:16:17.758611   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:17.849] E0814 13:16:17.849083   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:17.870] I0814 13:16:17.869473   53127 event.go:255] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1565788576-7825", Name:"myclaim-3", UID:"2ff503bc-1c6f-4003-9939-28e0aca4bea2", APIVersion:"v1", ResourceVersion:"2815", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
W0814 13:16:17.948] E0814 13:16:17.947588   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0814 13:16:18.048] persistentvolumeclaim/myclaim-3 created
I0814 13:16:18.049] storage.sh:75: Successful get pvc {{range.items}}{{.metadata.name}}:{{end}}: myclaim-3:
I0814 13:16:18.049] persistentvolumeclaim "myclaim-3" deleted
I0814 13:16:18.049] storage.sh:78: Successful get pvc {{range.items}}{{.metadata.name}}:{{end}}: 
I0814 13:16:18.049] +++ exit code: 0
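The FailedBinding events in this block ("no persistent volumes available for this claim and no storage class is set") are what the PV controller emits for a claim that names no storage class while no existing PV matches it. A sketch of building such a claim with client-go (recent signatures; names and sizes are illustrative):

```go
package example

import (
	"context"

	corev1 "k8s.io/api/core/v1"
	"k8s.io/apimachinery/pkg/api/resource"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
)

// createClaim creates a claim like the test's myclaim-*: StorageClassName is
// left nil, so with no matching PV the controller keeps emitting FailedBinding
// events instead of dynamically provisioning a volume.
func createClaim(ctx context.Context, cs kubernetes.Interface, ns, name string) error {
	pvc := &corev1.PersistentVolumeClaim{
		ObjectMeta: metav1.ObjectMeta{Name: name},
		Spec: corev1.PersistentVolumeClaimSpec{
			AccessModes: []corev1.PersistentVolumeAccessMode{corev1.ReadWriteOnce},
		},
	}
	pvc.Spec.Resources.Requests = corev1.ResourceList{
		corev1.ResourceStorage: resource.MustParse("1Gi"), // illustrative size
	}
	_, err := cs.CoreV1().PersistentVolumeClaims(ns).Create(ctx, pvc, metav1.CreateOptions{})
	return err
}
```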
I0814 13:16:18.049] Recording: run_storage_class_tests
... skipping 185 lines ...
I0814 13:16:19.211]   --------           --------  ------
I0814 13:16:19.211]   cpu                0 (0%)    0 (0%)
I0814 13:16:19.211]   memory             0 (0%)    0 (0%)
I0814 13:16:19.211]   ephemeral-storage  0 (0%)    0 (0%)
I0814 13:16:19.211] Events:              <none>
W0814 13:16:19.312] E0814 13:16:18.054977   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:19.313] E0814 13:16:18.759757   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:19.313] E0814 13:16:18.850423   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:19.313] E0814 13:16:18.948996   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:19.314] E0814 13:16:19.056433   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0814 13:16:19.414] Successful describe nodes:
I0814 13:16:19.414] Name:               127.0.0.1
I0814 13:16:19.414] Roles:              <none>
I0814 13:16:19.415] Labels:             <none>
I0814 13:16:19.415] Annotations:        node.alpha.kubernetes.io/ttl: 0
I0814 13:16:19.415] CreationTimestamp:  Wed, 14 Aug 2019 13:12:09 +0000
... skipping 237 lines ...
I0814 13:16:20.488]   "status": {
I0814 13:16:20.488]     "allowed": true,
I0814 13:16:20.488]     "reason": "RBAC: allowed by ClusterRoleBinding \"super-group\" of ClusterRole \"admin\" to Group \"the-group\""
I0814 13:16:20.489]   }
I0814 13:16:20.489] }
I0814 13:16:20.493] +++ exit code: 0
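The JSON above is the status block of an access review, with the RBAC authorizer's reason attached. The same question can be asked programmatically; a sketch using client-go's SelfSubjectAccessReview (recent signatures; the verb and resource parameters are illustrative):

```go
package example

import (
	"context"

	authv1 "k8s.io/api/authorization/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
)

// canI asks the apiserver whether the current identity may perform verb on
// resource; Status.Allowed and Status.Reason mirror the "allowed" and
// "reason" fields in the JSON printed by the test above.
func canI(ctx context.Context, cs kubernetes.Interface, verb, resource string) (bool, string, error) {
	sar := &authv1.SelfSubjectAccessReview{
		Spec: authv1.SelfSubjectAccessReviewSpec{
			ResourceAttributes: &authv1.ResourceAttributes{Verb: verb, Resource: resource},
		},
	}
	resp, err := cs.AuthorizationV1().SelfSubjectAccessReviews().Create(ctx, sar, metav1.CreateOptions{})
	if err != nil {
		return false, "", err
	}
	return resp.Status.Allowed, resp.Status.Reason, nil
}
```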
W0814 13:16:20.593] E0814 13:16:19.761149   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:20.594] E0814 13:16:19.851821   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:20.594] E0814 13:16:19.950048   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:20.594] E0814 13:16:20.058056   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:20.595] [curl progress-meter output elided: two transfers completed, 868 and 860 bytes received]
... skipping 8 lines ...
I0814 13:16:20.755] yes
I0814 13:16:20.755] has:the server doesn't have a resource type
I0814 13:16:20.821] Successful
I0814 13:16:20.821] message:yes
I0814 13:16:20.821] has:yes
I0814 13:16:20.888] Successful
I0814 13:16:20.888] message:error: --subresource can not be used with NonResourceURL
I0814 13:16:20.888] has:subresource can not be used with NonResourceURL
I0814 13:16:20.962] Successful
I0814 13:16:21.038] Successful
I0814 13:16:21.038] message:yes
I0814 13:16:21.039] 0
I0814 13:16:21.039] has:0
... skipping 27 lines ...
I0814 13:16:21.616] role.rbac.authorization.k8s.io/testing-R reconciled
I0814 13:16:21.700] legacy-script.sh:797: Successful get rolebindings -n some-other-random -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}: testing-RB:
I0814 13:16:21.786] legacy-script.sh:798: Successful get roles -n some-other-random -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}: testing-R:
I0814 13:16:21.874] legacy-script.sh:799: Successful get clusterrolebindings -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}: testing-CRB:
I0814 13:16:21.959] legacy-script.sh:800: Successful get clusterroles -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}: testing-CR:
I0814 13:16:22.035] Successful
I0814 13:16:22.036] message:error: only rbac.authorization.k8s.io/v1 is supported: not *v1beta1.ClusterRole
I0814 13:16:22.036] has:only rbac.authorization.k8s.io/v1 is supported
I0814 13:16:22.114] rolebinding.rbac.authorization.k8s.io "testing-RB" deleted
I0814 13:16:22.120] role.rbac.authorization.k8s.io "testing-R" deleted
I0814 13:16:22.127] clusterrole.rbac.authorization.k8s.io "testing-CR" deleted
I0814 13:16:22.134] clusterrolebinding.rbac.authorization.k8s.io "testing-CRB" deleted
I0814 13:16:22.143] Recording: run_retrieve_multiple_tests
... skipping 13 lines ...
I0814 13:16:22.404] +++ working dir: /go/src/k8s.io/kubernetes
I0814 13:16:22.406] +++ command: run_resource_aliasing_tests
I0814 13:16:22.419] +++ [0814 13:16:22] Creating namespace namespace-1565788582-17130
I0814 13:16:22.490] namespace/namespace-1565788582-17130 created
I0814 13:16:22.557] Context "test" modified.
I0814 13:16:22.564] +++ [0814 13:16:22] Testing resource aliasing
W0814 13:16:22.664] E0814 13:16:20.762413   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:22.665] E0814 13:16:20.853287   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:22.665] E0814 13:16:20.951215   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:22.666] E0814 13:16:21.059578   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:22.666] 	reconciliation required create
W0814 13:16:22.666] 	missing rules added:
W0814 13:16:22.666] 		{Verbs:[create delete deletecollection get list patch update watch] APIGroups:[] Resources:[pods] ResourceNames:[] NonResourceURLs:[]}
W0814 13:16:22.666] 	reconciliation required create
W0814 13:16:22.666] 	missing subjects added:
W0814 13:16:22.666] 		{Kind:Group APIGroup:rbac.authorization.k8s.io Name:system:masters Namespace:}
W0814 13:16:22.666] 	reconciliation required create
W0814 13:16:22.666] 	missing subjects added:
W0814 13:16:22.666] 		{Kind:Group APIGroup:rbac.authorization.k8s.io Name:system:masters Namespace:}
W0814 13:16:22.667] 	reconciliation required create
W0814 13:16:22.667] 	missing rules added:
W0814 13:16:22.667] 		{Verbs:[get list watch] APIGroups:[] Resources:[configmaps] ResourceNames:[] NonResourceURLs:[]}
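The "missing rules added" lines above are Go-value dumps of rbac/v1 PolicyRule structs from the auth reconcile step. Reconstructed as source, the first rule would look roughly like the sketch below; note the core API group is the empty string, which is consistent with the `APIGroups:[]` rendering, though the dump alone cannot distinguish an empty slice from one empty element:

```go
package example

import rbacv1 "k8s.io/api/rbac/v1"

// Reconstruction of the first "missing rules added" entry; the empty
// string in APIGroups denotes the core ("") API group.
var podRule = rbacv1.PolicyRule{
	Verbs:     []string{"create", "delete", "deletecollection", "get", "list", "patch", "update", "watch"},
	APIGroups: []string{""},
	Resources: []string{"pods"},
}
```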
W0814 13:16:22.667] E0814 13:16:21.763792   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:22.667] E0814 13:16:21.854404   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:22.667] E0814 13:16:21.953475   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:22.668] E0814 13:16:22.060911   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0814 13:16:22.668] warning: deleting cluster-scoped resources, not scoped to the provided namespace
W0814 13:16:22.714] I0814 13:16:22.713440   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1565788582-17130", Name:"cassandra", UID:"e3f344b8-564a-4ce2-8d86-cd164b40bafc", APIVersion:"v1", ResourceVersion:"2839", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: cassandra-jwhcg
W0814 13:16:22.718] I0814 13:16:22.717300   53127 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1565788582-17130", Name:"cassandra", UID:"e3f344b8-564a-4ce2-8d86-cd164b40bafc", APIVersion:"v1", ResourceVersion:"2839", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: cassandra-dh2bb
W0814 13:16:22.765] E0814 13:16:22.765042   53127 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: <