Result: FAILURE
Tests: 1 failed / 578 succeeded
Started: 2018-12-05 22:47
Elapsed: 25m31s
Version: v1.14.0-alpha.0.854+809eaa70251197
Builder: gke-prow-default-pool-3c8994a8-r858
pod: 8c515806-f8df-11e8-b720-0a580a6c02d1
infra-commit: 92da5df3d
repo: k8s.io/kubernetes
repo-commit: 809eaa7025119712ca82c6f4dfa73a4a544ad7ec
repos: {u'k8s.io/kubernetes': u'master'}

Test Failures


k8s.io/kubernetes/test/integration/apiserver Test202StatusCode 3.59s

go test -v k8s.io/kubernetes/test/integration/apiserver -run Test202StatusCode$
I1205 23:00:55.442035  115374 services.go:33] Network range for service cluster IPs is unspecified. Defaulting to {10.0.0.0 ffffff00}.
I1205 23:00:55.442080  115374 master.go:272] Node port range unspecified. Defaulting to 30000-32767.
I1205 23:00:55.442093  115374 master.go:228] Using reconciler: 
I1205 23:00:55.446609  115374 clientconn.go:551] parsed scheme: ""
I1205 23:00:55.446635  115374 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1205 23:00:55.446696  115374 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1205 23:00:55.446824  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.448049  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.454473  115374 clientconn.go:551] parsed scheme: ""
I1205 23:00:55.454512  115374 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1205 23:00:55.454589  115374 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1205 23:00:55.454661  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.455286  115374 clientconn.go:551] parsed scheme: ""
I1205 23:00:55.455316  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.455313  115374 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1205 23:00:55.455489  115374 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1205 23:00:55.455580  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.456197  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.456769  115374 clientconn.go:551] parsed scheme: ""
I1205 23:00:55.456873  115374 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1205 23:00:55.463511  115374 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1205 23:00:55.463617  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.464722  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.464814  115374 clientconn.go:551] parsed scheme: ""
I1205 23:00:55.464838  115374 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1205 23:00:55.464875  115374 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1205 23:00:55.464916  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.465268  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.465759  115374 clientconn.go:551] parsed scheme: ""
I1205 23:00:55.465776  115374 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1205 23:00:55.465811  115374 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1205 23:00:55.465955  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.466252  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.467207  115374 clientconn.go:551] parsed scheme: ""
I1205 23:00:55.467229  115374 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1205 23:00:55.467310  115374 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1205 23:00:55.467370  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.469164  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.469404  115374 clientconn.go:551] parsed scheme: ""
I1205 23:00:55.469428  115374 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1205 23:00:55.469466  115374 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1205 23:00:55.469513  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.470919  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.471180  115374 clientconn.go:551] parsed scheme: ""
I1205 23:00:55.471195  115374 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1205 23:00:55.471274  115374 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1205 23:00:55.471314  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.471861  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.471965  115374 clientconn.go:551] parsed scheme: ""
I1205 23:00:55.471981  115374 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1205 23:00:55.472013  115374 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1205 23:00:55.472166  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.473084  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.473438  115374 clientconn.go:551] parsed scheme: ""
I1205 23:00:55.473459  115374 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1205 23:00:55.473491  115374 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1205 23:00:55.473556  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.475496  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.477946  115374 clientconn.go:551] parsed scheme: ""
I1205 23:00:55.477964  115374 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1205 23:00:55.477994  115374 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1205 23:00:55.478045  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.479223  115374 clientconn.go:551] parsed scheme: ""
I1205 23:00:55.479243  115374 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1205 23:00:55.479842  115374 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1205 23:00:55.480693  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.481179  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.484893  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.485388  115374 clientconn.go:551] parsed scheme: ""
I1205 23:00:55.485436  115374 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1205 23:00:55.485530  115374 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1205 23:00:55.485623  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.493321  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.494063  115374 clientconn.go:551] parsed scheme: ""
I1205 23:00:55.494087  115374 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1205 23:00:55.494139  115374 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1205 23:00:55.494249  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.495639  115374 clientconn.go:551] parsed scheme: ""
I1205 23:00:55.495678  115374 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1205 23:00:55.495755  115374 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1205 23:00:55.495827  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.495835  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.496170  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.496370  115374 clientconn.go:551] parsed scheme: ""
I1205 23:00:55.496390  115374 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1205 23:00:55.496424  115374 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1205 23:00:55.496462  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.496974  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.515370  115374 clientconn.go:551] parsed scheme: ""
I1205 23:00:55.515396  115374 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1205 23:00:55.515497  115374 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1205 23:00:55.515628  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.516802  115374 clientconn.go:551] parsed scheme: ""
I1205 23:00:55.516825  115374 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1205 23:00:55.516860  115374 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1205 23:00:55.517140  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.517731  115374 clientconn.go:551] parsed scheme: ""
I1205 23:00:55.517749  115374 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1205 23:00:55.517782  115374 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1205 23:00:55.517870  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.518142  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.518676  115374 clientconn.go:551] parsed scheme: ""
I1205 23:00:55.518690  115374 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1205 23:00:55.518727  115374 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1205 23:00:55.518789  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.519034  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.519741  115374 clientconn.go:551] parsed scheme: ""
I1205 23:00:55.519756  115374 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1205 23:00:55.519784  115374 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1205 23:00:55.519839  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.520057  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.520679  115374 clientconn.go:551] parsed scheme: ""
I1205 23:00:55.520694  115374 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1205 23:00:55.520721  115374 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1205 23:00:55.520795  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.521004  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.526865  115374 clientconn.go:551] parsed scheme: ""
I1205 23:00:55.528023  115374 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1205 23:00:55.526956  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.528164  115374 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1205 23:00:55.528392  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.530332  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.541990  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.543117  115374 clientconn.go:551] parsed scheme: ""
I1205 23:00:55.543142  115374 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1205 23:00:55.543233  115374 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1205 23:00:55.543442  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.544505  115374 clientconn.go:551] parsed scheme: ""
I1205 23:00:55.544525  115374 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1205 23:00:55.544583  115374 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1205 23:00:55.544708  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.544997  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.547537  115374 clientconn.go:551] parsed scheme: ""
I1205 23:00:55.547564  115374 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1205 23:00:55.547612  115374 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1205 23:00:55.547710  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.548071  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.548426  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.550337  115374 clientconn.go:551] parsed scheme: ""
I1205 23:00:55.550369  115374 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1205 23:00:55.550434  115374 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1205 23:00:55.564441  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.565061  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.566189  115374 clientconn.go:551] parsed scheme: ""
I1205 23:00:55.566218  115374 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1205 23:00:55.566260  115374 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1205 23:00:55.566513  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.567611  115374 clientconn.go:551] parsed scheme: ""
I1205 23:00:55.567657  115374 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1205 23:00:55.567705  115374 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1205 23:00:55.567761  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.567881  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.568723  115374 clientconn.go:551] parsed scheme: ""
I1205 23:00:55.568784  115374 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1205 23:00:55.568838  115374 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1205 23:00:55.568947  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.569235  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.569985  115374 clientconn.go:551] parsed scheme: ""
I1205 23:00:55.570007  115374 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1205 23:00:55.570082  115374 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1205 23:00:55.570125  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.570180  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.570457  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.570728  115374 clientconn.go:551] parsed scheme: ""
I1205 23:00:55.570744  115374 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1205 23:00:55.570778  115374 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1205 23:00:55.570816  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.571048  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.571569  115374 clientconn.go:551] parsed scheme: ""
I1205 23:00:55.571585  115374 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1205 23:00:55.571614  115374 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1205 23:00:55.571691  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.571990  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.572265  115374 clientconn.go:551] parsed scheme: ""
I1205 23:00:55.572281  115374 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1205 23:00:55.572311  115374 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1205 23:00:55.572361  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.572632  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.573365  115374 clientconn.go:551] parsed scheme: ""
I1205 23:00:55.573378  115374 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1205 23:00:55.573463  115374 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1205 23:00:55.573531  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.574079  115374 clientconn.go:551] parsed scheme: ""
I1205 23:00:55.574094  115374 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1205 23:00:55.574343  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.574517  115374 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1205 23:00:55.574569  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.574752  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.583717  115374 clientconn.go:551] parsed scheme: ""
I1205 23:00:55.583925  115374 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1205 23:00:55.584031  115374 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1205 23:00:55.584173  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.584577  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.584831  115374 clientconn.go:551] parsed scheme: ""
I1205 23:00:55.584878  115374 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1205 23:00:55.584949  115374 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1205 23:00:55.585014  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.585274  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.585560  115374 clientconn.go:551] parsed scheme: ""
I1205 23:00:55.585585  115374 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1205 23:00:55.585615  115374 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1205 23:00:55.585725  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.586398  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.586608  115374 clientconn.go:551] parsed scheme: ""
I1205 23:00:55.586631  115374 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1205 23:00:55.586659  115374 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1205 23:00:55.586705  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.587559  115374 clientconn.go:551] parsed scheme: ""
I1205 23:00:55.587607  115374 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1205 23:00:55.587640  115374 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1205 23:00:55.587810  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.588027  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.588220  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.590573  115374 clientconn.go:551] parsed scheme: ""
I1205 23:00:55.590589  115374 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1205 23:00:55.590620  115374 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1205 23:00:55.590704  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.591078  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.591527  115374 clientconn.go:551] parsed scheme: ""
I1205 23:00:55.591547  115374 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1205 23:00:55.591580  115374 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1205 23:00:55.591804  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.592475  115374 clientconn.go:551] parsed scheme: ""
I1205 23:00:55.592502  115374 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1205 23:00:55.592533  115374 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1205 23:00:55.592667  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.592883  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.594260  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.594431  115374 clientconn.go:551] parsed scheme: ""
I1205 23:00:55.594452  115374 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1205 23:00:55.594483  115374 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1205 23:00:55.594540  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.595040  115374 clientconn.go:551] parsed scheme: ""
I1205 23:00:55.595055  115374 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1205 23:00:55.595083  115374 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1205 23:00:55.595185  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.595357  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.596019  115374 clientconn.go:551] parsed scheme: ""
I1205 23:00:55.596044  115374 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1205 23:00:55.596071  115374 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1205 23:00:55.596166  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.596339  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.597150  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.598761  115374 clientconn.go:551] parsed scheme: ""
I1205 23:00:55.598779  115374 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1205 23:00:55.598806  115374 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1205 23:00:55.598852  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.599331  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.599748  115374 clientconn.go:551] parsed scheme: ""
I1205 23:00:55.599767  115374 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1205 23:00:55.599795  115374 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1205 23:00:55.599853  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.600871  115374 clientconn.go:551] parsed scheme: ""
I1205 23:00:55.600895  115374 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1205 23:00:55.600927  115374 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1205 23:00:55.601054  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.601286  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.601864  115374 clientconn.go:551] parsed scheme: ""
I1205 23:00:55.601879  115374 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1205 23:00:55.601911  115374 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1205 23:00:55.601983  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.602223  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.602778  115374 clientconn.go:551] parsed scheme: ""
I1205 23:00:55.602793  115374 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1205 23:00:55.602822  115374 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1205 23:00:55.602898  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.603148  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.603688  115374 clientconn.go:551] parsed scheme: ""
I1205 23:00:55.603703  115374 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1205 23:00:55.603733  115374 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1205 23:00:55.603796  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.604058  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.607593  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.607799  115374 clientconn.go:551] parsed scheme: ""
I1205 23:00:55.607814  115374 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1205 23:00:55.607859  115374 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1205 23:00:55.607903  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.608198  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.608432  115374 clientconn.go:551] parsed scheme: ""
I1205 23:00:55.608450  115374 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1205 23:00:55.608477  115374 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1205 23:00:55.608743  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.609249  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.609504  115374 clientconn.go:551] parsed scheme: ""
I1205 23:00:55.609520  115374 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1205 23:00:55.609620  115374 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1205 23:00:55.609881  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.610128  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.618137  115374 clientconn.go:551] parsed scheme: ""
I1205 23:00:55.618168  115374 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1205 23:00:55.618203  115374 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1205 23:00:55.618508  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.623607  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.624433  115374 clientconn.go:551] parsed scheme: ""
I1205 23:00:55.624456  115374 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1205 23:00:55.624500  115374 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1205 23:00:55.624804  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.626205  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.626226  115374 clientconn.go:551] parsed scheme: ""
I1205 23:00:55.626252  115374 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1205 23:00:55.626315  115374 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1205 23:00:55.626874  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.634963  115374 clientconn.go:551] parsed scheme: ""
I1205 23:00:55.634995  115374 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1205 23:00:55.635034  115374 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1205 23:00:55.635155  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.635349  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.637051  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.637684  115374 clientconn.go:551] parsed scheme: ""
I1205 23:00:55.637708  115374 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1205 23:00:55.637742  115374 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1205 23:00:55.637810  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.638181  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.638576  115374 clientconn.go:551] parsed scheme: ""
I1205 23:00:55.638601  115374 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1205 23:00:55.638635  115374 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1205 23:00:55.638689  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:55.639486  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W1205 23:00:55.646179  115374 genericapiserver.go:334] Skipping API batch/v2alpha1 because it has no resources.
W1205 23:00:55.661901  115374 genericapiserver.go:334] Skipping API rbac.authorization.k8s.io/v1alpha1 because it has no resources.
W1205 23:00:55.662668  115374 genericapiserver.go:334] Skipping API scheduling.k8s.io/v1alpha1 because it has no resources.
W1205 23:00:55.665407  115374 genericapiserver.go:334] Skipping API storage.k8s.io/v1alpha1 because it has no resources.
W1205 23:00:55.682402  115374 genericapiserver.go:334] Skipping API admissionregistration.k8s.io/v1alpha1 because it has no resources.
I1205 23:00:56.441857  115374 clientconn.go:551] parsed scheme: ""
I1205 23:00:56.441888  115374 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1205 23:00:56.441954  115374 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1205 23:00:56.442015  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:56.442621  115374 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1205 23:00:56.691267  115374 storage_scheduling.go:91] created PriorityClass system-node-critical with value 2000001000
I1205 23:00:56.695815  115374 storage_scheduling.go:91] created PriorityClass system-cluster-critical with value 2000000000
I1205 23:00:56.695842  115374 storage_scheduling.go:100] all system priority classes are created successfully or already exist.
I1205 23:00:56.708398  115374 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/cluster-admin
I1205 23:00:56.711005  115374 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:discovery
I1205 23:00:56.713715  115374 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:basic-user
I1205 23:00:56.716471  115374 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/admin
I1205 23:00:56.718919  115374 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/edit
I1205 23:00:56.721818  115374 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/view
I1205 23:00:56.724567  115374 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:aggregate-to-admin
I1205 23:00:56.727626  115374 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:aggregate-to-edit
I1205 23:00:56.730953  115374 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:aggregate-to-view
I1205 23:00:56.733690  115374 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:heapster
I1205 23:00:56.745450  115374 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:node
I1205 23:00:56.752806  115374 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:node-problem-detector
I1205 23:00:56.755640  115374 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:node-proxier
I1205 23:00:56.758171  115374 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:kubelet-api-admin
I1205 23:00:56.761066  115374 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:node-bootstrapper
I1205 23:00:56.766235  115374 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:auth-delegator
I1205 23:00:56.775371  115374 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:kube-aggregator
I1205 23:00:56.779078  115374 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:kube-controller-manager
I1205 23:00:56.782750  115374 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:kube-scheduler
I1205 23:00:56.785741  115374 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:kube-dns
I1205 23:00:56.789437  115374 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:persistent-volume-provisioner
I1205 23:00:56.792445  115374 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:csi-external-attacher
I1205 23:00:56.795297  115374 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:aws-cloud-provider
I1205 23:00:56.797738  115374 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:certificates.k8s.io:certificatesigningrequests:nodeclient
I1205 23:00:56.806773  115374 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:certificates.k8s.io:certificatesigningrequests:selfnodeclient
I1205 23:00:56.809438  115374 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:volume-scheduler
I1205 23:00:56.812389  115374 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:csi-external-provisioner
I1205 23:00:56.817957  115374 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:controller:attachdetach-controller
I1205 23:00:56.820995  115374 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:controller:clusterrole-aggregation-controller
I1205 23:00:56.823969  115374 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:controller:cronjob-controller
I1205 23:00:56.826342  115374 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:controller:daemon-set-controller
I1205 23:00:56.828762  115374 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:controller:deployment-controller
I1205 23:00:56.831466  115374 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:controller:disruption-controller
I1205 23:00:56.833993  115374 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:controller:endpoint-controller
I1205 23:00:56.836959  115374 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:controller:expand-controller
I1205 23:00:56.839443  115374 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:controller:generic-garbage-collector
I1205 23:00:56.842206  115374 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:controller:horizontal-pod-autoscaler
I1205 23:00:56.845142  115374 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:controller:job-controller
I1205 23:00:56.847847  115374 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:controller:namespace-controller
I1205 23:00:56.852723  115374 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:controller:node-controller
I1205 23:00:56.855793  115374 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:controller:persistent-volume-binder
I1205 23:00:56.858293  115374 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:controller:pod-garbage-collector
I1205 23:00:56.860741  115374 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:controller:replicaset-controller
I1205 23:00:56.863293  115374 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:controller:replication-controller
I1205 23:00:56.865799  115374 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:controller:resourcequota-controller
I1205 23:00:56.868003  115374 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:controller:route-controller
I1205 23:00:56.870486  115374 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:controller:service-account-controller
I1205 23:00:56.873539  115374 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:controller:service-controller
I1205 23:00:56.876383  115374 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:controller:statefulset-controller
I1205 23:00:56.879198  115374 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:controller:ttl-controller
I1205 23:00:56.887094  115374 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:controller:certificate-controller
I1205 23:00:56.928332  115374 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:controller:pvc-protection-controller
I1205 23:00:56.968272  115374 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:controller:pv-protection-controller
I1205 23:00:57.009098  115374 storage_rbac.go:215] created clusterrolebinding.rbac.authorization.k8s.io/cluster-admin
I1205 23:00:57.048075  115374 storage_rbac.go:215] created clusterrolebinding.rbac.authorization.k8s.io/system:discovery
I1205 23:00:57.087820  115374 storage_rbac.go:215] created clusterrolebinding.rbac.authorization.k8s.io/system:basic-user
I1205 23:00:57.127892  115374 storage_rbac.go:215] created clusterrolebinding.rbac.authorization.k8s.io/system:node-proxier
I1205 23:00:57.167999  115374 storage_rbac.go:215] created clusterrolebinding.rbac.authorization.k8s.io/system:kube-controller-manager
I1205 23:00:57.207750  115374 storage_rbac.go:215] created clusterrolebinding.rbac.authorization.k8s.io/system:kube-dns
I1205 23:00:57.247951  115374 storage_rbac.go:215] created clusterrolebinding.rbac.authorization.k8s.io/system:kube-scheduler
I1205 23:00:57.287750  115374 storage_rbac.go:215] created clusterrolebinding.rbac.authorization.k8s.io/system:aws-cloud-provider
I1205 23:00:57.327848  115374 storage_rbac.go:215] created clusterrolebinding.rbac.authorization.k8s.io/system:node
I1205 23:00:57.367927  115374 storage_rbac.go:215] created clusterrolebinding.rbac.authorization.k8s.io/system:volume-scheduler
I1205 23:00:57.407780  115374 storage_rbac.go:215] created clusterrolebinding.rbac.authorization.k8s.io/system:controller:attachdetach-controller
I1205 23:00:57.448008  115374 storage_rbac.go:215] created clusterrolebinding.rbac.authorization.k8s.io/system:controller:clusterrole-aggregation-controller
I1205 23:00:57.487962  115374 storage_rbac.go:215] created clusterrolebinding.rbac.authorization.k8s.io/system:controller:cronjob-controller
I1205 23:00:57.527642  115374 storage_rbac.go:215] created clusterrolebinding.rbac.authorization.k8s.io/system:controller:daemon-set-controller
I1205 23:00:57.568197  115374 storage_rbac.go:215] created clusterrolebinding.rbac.authorization.k8s.io/system:controller:deployment-controller
I1205 23:00:57.607526  115374 storage_rbac.go:215] created clusterrolebinding.rbac.authorization.k8s.io/system:controller:disruption-controller
I1205 23:00:57.647800  115374 storage_rbac.go:215] created clusterrolebinding.rbac.authorization.k8s.io/system:controller:endpoint-controller
I1205 23:00:57.701960  115374 storage_rbac.go:215] created clusterrolebinding.rbac.authorization.k8s.io/system:controller:expand-controller
I1205 23:00:57.727787  115374 storage_rbac.go:215] created clusterrolebinding.rbac.authorization.k8s.io/system:controller:generic-garbage-collector
I1205 23:00:57.767848  115374 storage_rbac.go:215] created clusterrolebinding.rbac.authorization.k8s.io/system:controller:horizontal-pod-autoscaler
I1205 23:00:57.809197  115374 storage_rbac.go:215] created clusterrolebinding.rbac.authorization.k8s.io/system:controller:job-controller
I1205 23:00:57.847709  115374 storage_rbac.go:215] created clusterrolebinding.rbac.authorization.k8s.io/system:controller:namespace-controller
I1205 23:00:57.888113  115374 storage_rbac.go:215] created clusterrolebinding.rbac.authorization.k8s.io/system:controller:node-controller
I1205 23:00:57.927847  115374 storage_rbac.go:215] created clusterrolebinding.rbac.authorization.k8s.io/system:controller:persistent-volume-binder
I1205 23:00:57.967721  115374 storage_rbac.go:215] created clusterrolebinding.rbac.authorization.k8s.io/system:controller:pod-garbage-collector
I1205 23:00:58.007899  115374 storage_rbac.go:215] created clusterrolebinding.rbac.authorization.k8s.io/system:controller:replicaset-controller
I1205 23:00:58.047557  115374 storage_rbac.go:215] created clusterrolebinding.rbac.authorization.k8s.io/system:controller:replication-controller
I1205 23:00:58.087792  115374 storage_rbac.go:215] created clusterrolebinding.rbac.authorization.k8s.io/system:controller:resourcequota-controller
I1205 23:00:58.127473  115374 storage_rbac.go:215] created clusterrolebinding.rbac.authorization.k8s.io/system:controller:route-controller
I1205 23:00:58.167684  115374 storage_rbac.go:215] created clusterrolebinding.rbac.authorization.k8s.io/system:controller:service-account-controller
I1205 23:00:58.207520  115374 storage_rbac.go:215] created clusterrolebinding.rbac.authorization.k8s.io/system:controller:service-controller
I1205 23:00:58.247655  115374 storage_rbac.go:215] created clusterrolebinding.rbac.authorization.k8s.io/system:controller:statefulset-controller
I1205 23:00:58.287468  115374 storage_rbac.go:215] created clusterrolebinding.rbac.authorization.k8s.io/system:controller:ttl-controller
I1205 23:00:58.327830  115374 storage_rbac.go:215] created clusterrolebinding.rbac.authorization.k8s.io/system:controller:certificate-controller
I1205 23:00:58.371779  115374 storage_rbac.go:215] created clusterrolebinding.rbac.authorization.k8s.io/system:controller:pvc-protection-controller
I1205 23:00:58.407767  115374 storage_rbac.go:215] created clusterrolebinding.rbac.authorization.k8s.io/system:controller:pv-protection-controller
I1205 23:00:58.457218  115374 storage_rbac.go:246] created role.rbac.authorization.k8s.io/extension-apiserver-authentication-reader in kube-system
I1205 23:00:58.487331  115374 storage_rbac.go:246] created role.rbac.authorization.k8s.io/system:controller:bootstrap-signer in kube-system
I1205 23:00:58.536866  115374 storage_rbac.go:246] created role.rbac.authorization.k8s.io/system:controller:cloud-provider in kube-system
I1205 23:00:58.567594  115374 storage_rbac.go:246] created role.rbac.authorization.k8s.io/system:controller:token-cleaner in kube-system
I1205 23:00:58.612647  115374 storage_rbac.go:246] created role.rbac.authorization.k8s.io/system::leader-locking-kube-controller-manager in kube-system
I1205 23:00:58.653536  115374 storage_rbac.go:246] created role.rbac.authorization.k8s.io/system::leader-locking-kube-scheduler in kube-system
I1205 23:00:58.687729  115374 storage_rbac.go:246] created role.rbac.authorization.k8s.io/system:controller:bootstrap-signer in kube-public
I1205 23:00:58.729173  115374 storage_rbac.go:276] created rolebinding.rbac.authorization.k8s.io/system::leader-locking-kube-controller-manager in kube-system
I1205 23:00:58.772835  115374 storage_rbac.go:276] created rolebinding.rbac.authorization.k8s.io/system::leader-locking-kube-scheduler in kube-system
I1205 23:00:58.807853  115374 storage_rbac.go:276] created rolebinding.rbac.authorization.k8s.io/system:controller:bootstrap-signer in kube-system
I1205 23:00:58.847776  115374 storage_rbac.go:276] created rolebinding.rbac.authorization.k8s.io/system:controller:cloud-provider in kube-system
I1205 23:00:58.890541  115374 storage_rbac.go:276] created rolebinding.rbac.authorization.k8s.io/system:controller:token-cleaner in kube-system
I1205 23:00:58.927410  115374 storage_rbac.go:276] created rolebinding.rbac.authorization.k8s.io/system:controller:bootstrap-signer in kube-public
I1205 23:00:59.021240  115374 controller.go:170] Shutting down kubernetes service endpoint reconciler
				from junit_f5a444384056ebac4f2929ce7b7920ea9733ca19_20181205-230003.xml
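The failure can be re-run locally with the go test command shown at the top of this section. As a rough illustration only (this is not the actual Test202StatusCode source; the endpoint path and handler below are invented for the sketch), an integration test of this shape issues a request against a running apiserver and asserts that the response status is 202 Accepted:

package apiserver_test

import (
	"net/http"
	"net/http/httptest"
	"testing"
)

// Test202StatusCodeSketch is a hypothetical stand-in for the real
// Test202StatusCode: it checks that an endpoint answers with
// HTTP 202 Accepted.
func Test202StatusCodeSketch(t *testing.T) {
	// Stand-in handler; the real test talks to a full kube-apiserver
	// wired to the local etcd seen on 127.0.0.1:2379 in the log above.
	srv := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		w.WriteHeader(http.StatusAccepted) // 202
	}))
	defer srv.Close()

	resp, err := http.Get(srv.URL + "/apis/apps/v1/namespaces/default/replicasets/example")
	if err != nil {
		t.Fatalf("request failed: %v", err)
	}
	defer resp.Body.Close()

	if resp.StatusCode != http.StatusAccepted {
		t.Errorf("expected 202 Accepted, got %d", resp.StatusCode)
	}
}

In the real run the server under test is backed by the local etcd at 127.0.0.1:2379 that the clientconn/balancerWrapper log lines above show being dialed repeatedly during apiserver startup.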



578 tests passed; 4 tests skipped.

Error lines from build-log.txt

... skipping 10 lines ...
I1205 22:47:10.765] process 238 exited with code 0 after 0.1m
I1205 22:47:10.766] Call:  gcloud config get-value account
I1205 22:47:11.077] process 251 exited with code 0 after 0.0m
I1205 22:47:11.077] Will upload results to gs://kubernetes-jenkins/logs using pr-kubekins@kubernetes-jenkins-pull.iam.gserviceaccount.com
I1205 22:47:11.077] Call:  kubectl get -oyaml pods/8c515806-f8df-11e8-b720-0a580a6c02d1
W1205 22:47:13.132] The connection to the server localhost:8080 was refused - did you specify the right host or port?
E1205 22:47:13.134] Command failed
I1205 22:47:13.134] process 264 exited with code 1 after 0.0m
E1205 22:47:13.134] unable to upload podspecs: Command '['kubectl', 'get', '-oyaml', 'pods/8c515806-f8df-11e8-b720-0a580a6c02d1']' returned non-zero exit status 1
I1205 22:47:13.135] Root: /workspace
I1205 22:47:13.135] cd to /workspace
I1205 22:47:13.135] Checkout: /workspace/k8s.io/kubernetes master to /workspace/k8s.io/kubernetes
I1205 22:47:13.135] Call:  git init k8s.io/kubernetes
... skipping 805 lines ...
W1205 22:55:19.548] W1205 22:55:19.547885   55672 controllermanager.go:508] Skipping "ttl-after-finished"
W1205 22:55:19.548] I1205 22:55:19.547522   55672 endpoints_controller.go:149] Starting endpoint controller
W1205 22:55:19.549] I1205 22:55:19.548728   55672 controller_utils.go:1027] Waiting for caches to sync for endpoint controller
W1205 22:55:19.549] I1205 22:55:19.549682   55672 controllermanager.go:516] Started "job"
W1205 22:55:19.550] I1205 22:55:19.549829   55672 job_controller.go:143] Starting job controller
W1205 22:55:19.550] I1205 22:55:19.549851   55672 controller_utils.go:1027] Waiting for caches to sync for job controller
W1205 22:55:19.550] W1205 22:55:19.549963   55672 garbagecollector.go:649] failed to discover preferred resources: the cache has not been filled yet
W1205 22:55:19.550] I1205 22:55:19.550493   55672 controllermanager.go:516] Started "garbagecollector"
W1205 22:55:19.550] I1205 22:55:19.550529   55672 garbagecollector.go:133] Starting garbage collector controller
W1205 22:55:19.550] I1205 22:55:19.550544   55672 controller_utils.go:1027] Waiting for caches to sync for garbage collector controller
W1205 22:55:19.551] I1205 22:55:19.550574   55672 graph_builder.go:308] GraphBuilder running
W1205 22:55:19.551] I1205 22:55:19.551230   55672 controllermanager.go:516] Started "disruption"
W1205 22:55:19.551] I1205 22:55:19.551336   55672 disruption.go:288] Starting disruption controller
W1205 22:55:19.551] I1205 22:55:19.551364   55672 controller_utils.go:1027] Waiting for caches to sync for disruption controller
W1205 22:55:19.551] I1205 22:55:19.551537   55672 controllermanager.go:516] Started "podgc"
W1205 22:55:19.551] I1205 22:55:19.551702   55672 gc_controller.go:76] Starting GC controller
W1205 22:55:19.552] I1205 22:55:19.551733   55672 controller_utils.go:1027] Waiting for caches to sync for GC controller
W1205 22:55:19.552] E1205 22:55:19.552206   55672 core.go:76] Failed to start service controller: WARNING: no cloud provider provided, services of type LoadBalancer will fail
W1205 22:55:19.552] W1205 22:55:19.552226   55672 controllermanager.go:508] Skipping "service"
W1205 22:55:19.553] I1205 22:55:19.552542   55672 node_lifecycle_controller.go:272] Sending events to api server.
W1205 22:55:19.553] I1205 22:55:19.552637   55672 node_lifecycle_controller.go:312] Controller is using taint based evictions.
W1205 22:55:19.553] I1205 22:55:19.552696   55672 taint_manager.go:175] Sending events to api server.
W1205 22:55:19.553] I1205 22:55:19.552813   55672 node_lifecycle_controller.go:378] Controller will taint node by condition.
W1205 22:55:19.553] I1205 22:55:19.552863   55672 controllermanager.go:516] Started "nodelifecycle"
... skipping 51 lines ...
W1205 22:55:19.579] I1205 22:55:19.577974   55672 resource_quota_monitor.go:228] QuotaMonitor created object count evaluator for daemonsets.apps
W1205 22:55:19.580] I1205 22:55:19.578006   55672 resource_quota_monitor.go:228] QuotaMonitor created object count evaluator for jobs.batch
W1205 22:55:19.580] I1205 22:55:19.578052   55672 resource_quota_monitor.go:228] QuotaMonitor created object count evaluator for cronjobs.batch
W1205 22:55:19.580] I1205 22:55:19.578149   55672 resource_quota_monitor.go:228] QuotaMonitor created object count evaluator for daemonsets.extensions
W1205 22:55:19.580] I1205 22:55:19.578266   55672 resource_quota_monitor.go:228] QuotaMonitor created object count evaluator for deployments.extensions
W1205 22:55:19.580] I1205 22:55:19.578340   55672 resource_quota_monitor.go:228] QuotaMonitor created object count evaluator for networkpolicies.networking.k8s.io
W1205 22:55:19.580] E1205 22:55:19.578375   55672 resource_quota_controller.go:171] initial monitor sync has error: couldn't start monitor for resource "extensions/v1beta1, Resource=networkpolicies": unable to monitor quota for resource "extensions/v1beta1, Resource=networkpolicies"
W1205 22:55:19.580] I1205 22:55:19.578397   55672 controllermanager.go:516] Started "resourcequota"
W1205 22:55:19.581] I1205 22:55:19.578958   55672 controllermanager.go:516] Started "pv-protection"
W1205 22:55:19.581] W1205 22:55:19.578973   55672 controllermanager.go:495] "tokencleaner" is disabled
W1205 22:55:19.581] I1205 22:55:19.579383   55672 controllermanager.go:516] Started "cronjob"
W1205 22:55:19.581] I1205 22:55:19.579865   55672 controllermanager.go:516] Started "clusterrole-aggregation"
W1205 22:55:19.581] I1205 22:55:19.580519   55672 controllermanager.go:516] Started "replicationcontroller"
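Earlier in this block "tokencleaner" is reported as disabled: it is an off-by-default controller, toggled through the kube-controller-manager --controllers flag. A sketch of enabling it alongside the defaults ('*' keeps every on-by-default controller; a leading '-' disables one):
    kube-controller-manager --controllers=*,tokencleaner   # '*,-tokencleaner' would disable it explicitly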
... skipping 19 lines ...
W1205 22:55:19.657] I1205 22:55:19.657179   55672 controller_utils.go:1034] Caches are synced for ReplicaSet controller
W1205 22:55:19.659] I1205 22:55:19.659284   52321 controller.go:608] quota admission added evaluator for: serviceaccounts
W1205 22:55:19.669] I1205 22:55:19.668775   55672 controller_utils.go:1034] Caches are synced for namespace controller
W1205 22:55:19.671] I1205 22:55:19.671192   55672 controller_utils.go:1034] Caches are synced for PVC protection controller
W1205 22:55:19.684] I1205 22:55:19.684455   55672 controller_utils.go:1034] Caches are synced for ClusterRoleAggregator controller
W1205 22:55:19.685] I1205 22:55:19.684495   55672 controller_utils.go:1034] Caches are synced for ReplicationController controller
W1205 22:55:19.692] E1205 22:55:19.691947   55672 clusterroleaggregation_controller.go:180] edit failed with : Operation cannot be fulfilled on clusterroles.rbac.authorization.k8s.io "edit": the object has been modified; please apply your changes to the latest version and try again
W1205 22:55:19.694] E1205 22:55:19.694426   55672 clusterroleaggregation_controller.go:180] admin failed with : Operation cannot be fulfilled on clusterroles.rbac.authorization.k8s.io "admin": the object has been modified; please apply your changes to the latest version and try again
W1205 22:55:19.746] I1205 22:55:19.745740   55672 controller_utils.go:1034] Caches are synced for certificate controller
W1205 22:55:19.746] I1205 22:55:19.746583   55672 controller_utils.go:1034] Caches are synced for expand controller
W1205 22:55:19.783] I1205 22:55:19.783404   55672 controller_utils.go:1034] Caches are synced for PV protection controller
W1205 22:55:19.882] I1205 22:55:19.881852   55672 controller_utils.go:1034] Caches are synced for resource quota controller
W1205 22:55:19.885] W1205 22:55:19.884953   55672 actual_state_of_world.go:491] Failed to update statusUpdateNeeded field in actual state of world: Failed to set statusUpdateNeeded to needed true, because nodeName="127.0.0.1" does not exist
W1205 22:55:19.957] I1205 22:55:19.957199   55672 controller_utils.go:1034] Caches are synced for attach detach controller
W1205 22:55:19.958] I1205 22:55:19.957218   55672 controller_utils.go:1034] Caches are synced for taint controller
W1205 22:55:19.958] I1205 22:55:19.957512   55672 node_lifecycle_controller.go:1222] Initializing eviction metric for zone: 
W1205 22:55:19.958] I1205 22:55:19.957554   55672 taint_manager.go:198] Starting NoExecuteTaintManager
W1205 22:55:19.958] I1205 22:55:19.957621   55672 node_lifecycle_controller.go:1072] Controller detected that all Nodes are not-Ready. Entering master disruption mode.
W1205 22:55:19.959] I1205 22:55:19.957897   55672 event.go:221] Event(v1.ObjectReference{Kind:"Node", Namespace:"", Name:"127.0.0.1", UID:"d6f471f5-f8e0-11e8-8d22-0242ac110002", APIVersion:"", ResourceVersion:"", FieldPath:""}): type: 'Normal' reason: 'RegisteredNode' Node 127.0.0.1 event: Registered Node 127.0.0.1 in Controller
... skipping 38 lines ...
I1205 22:55:21.030] Successful: --short --output client json info is equal to non short result
I1205 22:55:21.037] Successful: --short --output server json info is equal to non short result
I1205 22:55:21.040] +++ [1205 22:55:21] Testing kubectl version: compare json output with yaml output
W1205 22:55:21.141] I1205 22:55:21.041970   55672 controller_utils.go:1027] Waiting for caches to sync for garbage collector controller
W1205 22:55:21.141] I1205 22:55:21.050716   55672 controller_utils.go:1034] Caches are synced for garbage collector controller
W1205 22:55:21.141] I1205 22:55:21.050742   55672 garbagecollector.go:142] Garbage collector: all resource monitors have synced. Proceeding to collect garbage
W1205 22:55:21.142] E1205 22:55:21.072566   55672 resource_quota_controller.go:437] failed to sync resource monitors: couldn't start monitor for resource "extensions/v1beta1, Resource=networkpolicies": unable to monitor quota for resource "extensions/v1beta1, Resource=networkpolicies"
W1205 22:55:21.143] I1205 22:55:21.142809   55672 controller_utils.go:1034] Caches are synced for garbage collector controller
I1205 22:55:21.243] Successful: --output json/yaml has identical information
I1205 22:55:21.244] +++ exit code: 0
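The version checks above assert that the JSON, YAML, and --short renderings carry the same information. They can be reproduced by hand against any reachable cluster (a sketch):
    kubectl version --output=json
    kubectl version --output=yaml
    kubectl version --short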
I1205 22:55:21.244] Recording: run_kubectl_config_set_tests
I1205 22:55:21.244] Running command: run_kubectl_config_set_tests
I1205 22:55:21.244] 
... skipping 40 lines ...
I1205 22:55:23.684] +++ working dir: /go/src/k8s.io/kubernetes
I1205 22:55:23.686] +++ command: run_RESTMapper_evaluation_tests
I1205 22:55:23.695] +++ [1205 22:55:23] Creating namespace namespace-1544050523-496
I1205 22:55:23.780] namespace/namespace-1544050523-496 created
I1205 22:55:23.857] Context "test" modified.
I1205 22:55:23.862] +++ [1205 22:55:23] Testing RESTMapper
I1205 22:55:23.986] +++ [1205 22:55:23] "kubectl get unknownresourcetype" returns error as expected: error: the server doesn't have a resource type "unknownresourcetype"
I1205 22:55:23.998] +++ exit code: 0
I1205 22:55:24.098] NAME                              SHORTNAMES   APIGROUP                       NAMESPACED   KIND
I1205 22:55:24.098] bindings                                                                      true         Binding
I1205 22:55:24.098] componentstatuses                 cs                                          false        ComponentStatus
I1205 22:55:24.098] configmaps                        cm                                          true         ConfigMap
I1205 22:55:24.098] endpoints                         ep                                          true         Endpoints
... skipping 606 lines ...
I1205 22:55:41.961] poddisruptionbudget.policy/test-pdb-3 created
I1205 22:55:42.047] core.sh:251: Successful get pdb/test-pdb-3 --namespace=test-kubectl-describe-pod {{.spec.maxUnavailable}}: 2
I1205 22:55:42.115] poddisruptionbudget.policy/test-pdb-4 created
I1205 22:55:42.197] core.sh:255: Successful get pdb/test-pdb-4 --namespace=test-kubectl-describe-pod {{.spec.maxUnavailable}}: 50%
I1205 22:55:42.342] core.sh:261: Successful get pods --namespace=test-kubectl-describe-pod {{range.items}}{{.metadata.name}}:{{end}}: 
I1205 22:55:42.506] pod/env-test-pod created
W1205 22:55:42.607] error: resource(s) were provided, but no name, label selector, or --all flag specified
W1205 22:55:42.607] error: setting 'all' parameter but found a non empty selector. 
W1205 22:55:42.607] warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
W1205 22:55:42.607] I1205 22:55:41.655224   52321 controller.go:608] quota admission added evaluator for: poddisruptionbudgets.policy
W1205 22:55:42.607] error: min-available and max-unavailable cannot be both specified
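The last error above is kubectl rejecting a PodDisruptionBudget that sets both bounds; --min-available and --max-unavailable are mutually exclusive. A sketch (names and selector are illustrative):
    kubectl create pdb test-pdb --selector=app=x --min-available=1 --max-unavailable=1   # rejected: both bounds set
    kubectl create pdb test-pdb-4 --selector=app=x --max-unavailable=50%                 # accepted, matches the 50% read back above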
I1205 22:55:42.708] core.sh:264: Successful describe pods --namespace=test-kubectl-describe-pod env-test-pod:
I1205 22:55:42.708] Name:               env-test-pod
I1205 22:55:42.708] Namespace:          test-kubectl-describe-pod
I1205 22:55:42.708] Priority:           0
I1205 22:55:42.708] PriorityClassName:  <none>
I1205 22:55:42.708] Node:               <none>
... skipping 145 lines ...
W1205 22:55:53.944] I1205 22:55:53.334555   55672 namespace_controller.go:171] Namespace has been deleted test-kubectl-describe-pod
W1205 22:55:53.945] I1205 22:55:53.531825   55672 event.go:221] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1544050549-5610", Name:"modified", UID:"eb165798-f8e0-11e8-8d22-0242ac110002", APIVersion:"v1", ResourceVersion:"367", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: modified-rnm5h
I1205 22:55:54.066] core.sh:434: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I1205 22:55:54.204] pod/valid-pod created
I1205 22:55:54.290] core.sh:438: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
I1205 22:55:54.431] Successful
I1205 22:55:54.431] message:Error from server: cannot restore map from string
I1205 22:55:54.431] has:cannot restore map from string
I1205 22:55:54.511] Successful
I1205 22:55:54.512] message:pod/valid-pod patched (no change)
I1205 22:55:54.512] has:patched (no change)
I1205 22:55:54.585] pod/valid-pod patched
I1205 22:55:54.670] core.sh:455: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: nginx:
... skipping 5 lines ...
I1205 22:55:55.132] pod/valid-pod patched
I1205 22:55:55.220] core.sh:470: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: changed-with-yaml:
I1205 22:55:55.287] pod/valid-pod patched
I1205 22:55:55.369] core.sh:475: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:3.1:
I1205 22:55:55.512] pod/valid-pod patched
I1205 22:55:55.598] core.sh:491: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: nginx:
I1205 22:55:55.752] +++ [1205 22:55:55] "kubectl patch with resourceVersion 486" returns error as expected: Error from server (Conflict): Operation cannot be fulfilled on pods "valid-pod": the object has been modified; please apply your changes to the latest version and try again
W1205 22:55:55.852] E1205 22:55:54.424192   52321 status.go:64] apiserver received an error that is not an metav1.Status: &errors.errorString{s:"cannot restore map from string"}
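The Conflict above is provoked by embedding a stale resourceVersion (486) in the patch body, so the API server's optimistic-concurrency check fails. A sketch of such a patch (the image value is illustrative; the container name matches the pod dumps later in this log):
    kubectl patch pod valid-pod -p '{"metadata":{"resourceVersion":"486"},"spec":{"containers":[{"name":"kubernetes-serve-hostname","image":"nginx"}]}}'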
I1205 22:55:55.966] pod "valid-pod" deleted
I1205 22:55:55.976] pod/valid-pod replaced
I1205 22:55:56.062] core.sh:515: Successful get pod valid-pod {{(index .spec.containers 0).name}}: replaced-k8s-serve-hostname
I1205 22:55:56.201] Successful
I1205 22:55:56.202] message:error: --grace-period must have --force specified
I1205 22:55:56.202] has:\-\-grace-period must have \-\-force specified
I1205 22:55:56.344] Successful
I1205 22:55:56.344] message:error: --timeout must have --force specified
I1205 22:55:56.344] has:\-\-timeout must have \-\-force specified
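Both checks above assert that these deletion knobs are rejected unless --force is also given. Sketches of invocations that pass this validation (pod name as in the log):
    kubectl delete pod valid-pod --grace-period=0 --force
    kubectl delete pod valid-pod --timeout=1m --force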
W1205 22:55:56.489] W1205 22:55:56.488715   55672 actual_state_of_world.go:491] Failed to update statusUpdateNeeded field in actual state of world: Failed to set statusUpdateNeeded to needed true, because nodeName="node-v1-test" does not exist
I1205 22:55:56.590] node/node-v1-test created
I1205 22:55:56.640] node/node-v1-test replaced
I1205 22:55:56.729] core.sh:552: Successful get node node-v1-test {{.metadata.annotations.a}}: b
I1205 22:55:56.801] node "node-v1-test" deleted
I1205 22:55:56.890] core.sh:559: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: nginx:
I1205 22:55:57.135] core.sh:562: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: k8s.gcr.io/serve_hostname:
... skipping 58 lines ...
I1205 22:56:01.599] save-config.sh:31: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I1205 22:56:01.735] pod/test-pod created
W1205 22:56:01.835] Edit cancelled, no changes made.
W1205 22:56:01.836] Edit cancelled, no changes made.
W1205 22:56:01.836] Edit cancelled, no changes made.
W1205 22:56:01.836] Edit cancelled, no changes made.
W1205 22:56:01.836] error: 'name' already has a value (valid-pod), and --overwrite is false
W1205 22:56:01.836] warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
W1205 22:56:01.836] Warning: kubectl apply should be used on resource created by either kubectl create --save-config or kubectl apply
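The apply warning above appears when an object was created without a last-applied-configuration annotation. Recording it at creation time avoids the warning (a sketch; pod.yaml is illustrative):
    kubectl create -f pod.yaml --save-config
    kubectl apply -f pod.yaml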
I1205 22:56:01.937] pod "test-pod" deleted
I1205 22:56:01.937] +++ [1205 22:56:01] Creating namespace namespace-1544050561-1180
I1205 22:56:01.960] namespace/namespace-1544050561-1180 created
I1205 22:56:02.020] Context "test" modified.
... skipping 41 lines ...
I1205 22:56:04.873] +++ Running case: test-cmd.run_kubectl_create_error_tests 
I1205 22:56:04.875] +++ working dir: /go/src/k8s.io/kubernetes
I1205 22:56:04.877] +++ command: run_kubectl_create_error_tests
I1205 22:56:04.889] +++ [1205 22:56:04] Creating namespace namespace-1544050564-2500
I1205 22:56:04.955] namespace/namespace-1544050564-2500 created
I1205 22:56:05.017] Context "test" modified.
I1205 22:56:05.022] +++ [1205 22:56:05] Testing kubectl create with error
W1205 22:56:05.123] Error: required flag(s) "filename" not set
W1205 22:56:05.123] 
W1205 22:56:05.123] 
W1205 22:56:05.123] Examples:
W1205 22:56:05.123]   # Create a pod using the data in pod.json.
W1205 22:56:05.123]   kubectl create -f ./pod.json
W1205 22:56:05.123]   
... skipping 38 lines ...
W1205 22:56:05.127]   kubectl create -f FILENAME [options]
W1205 22:56:05.127] 
W1205 22:56:05.127] Use "kubectl <command> --help" for more information about a given command.
W1205 22:56:05.127] Use "kubectl options" for a list of global command-line options (applies to all commands).
W1205 22:56:05.128] 
W1205 22:56:05.128] required flag(s) "filename" not set
I1205 22:56:05.228] +++ [1205 22:56:05] "kubectl create with empty string list" returns error as expected: error: error validating "hack/testdata/invalid-rc-with-empty-args.yaml": error validating data: ValidationError(ReplicationController.spec.template.spec.containers[0].args): unknown object type "nil" in ReplicationController.spec.template.spec.containers[0].args[0]; if you choose to ignore these errors, turn validation off with --validate=false
W1205 22:56:05.329] kubectl convert is DEPRECATED and will be removed in a future version.
W1205 22:56:05.329] In order to convert, kubectl apply the object to the cluster, then kubectl get at the desired version.
I1205 22:56:05.429] +++ exit code: 0
I1205 22:56:05.430] Recording: run_kubectl_apply_tests
I1205 22:56:05.430] Running command: run_kubectl_apply_tests
I1205 22:56:05.430] 
... skipping 13 lines ...
I1205 22:56:06.390] apply.sh:47: Successful get deployments {{range.items}}{{.metadata.name}}{{end}}: test-deployment-retainkeys
I1205 22:56:07.223] deployment.extensions "test-deployment-retainkeys" deleted
I1205 22:56:07.310] apply.sh:67: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I1205 22:56:07.452] pod/selector-test-pod created
I1205 22:56:07.544] apply.sh:71: Successful get pods selector-test-pod {{.metadata.labels.name}}: selector-test-pod
I1205 22:56:07.624] Successful
I1205 22:56:07.624] message:Error from server (NotFound): pods "selector-test-pod-dont-apply" not found
I1205 22:56:07.624] has:pods "selector-test-pod-dont-apply" not found
I1205 22:56:07.696] pod "selector-test-pod" deleted
I1205 22:56:07.786] apply.sh:80: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I1205 22:56:07.994] pod/test-pod created (server dry run)
I1205 22:56:08.082] apply.sh:85: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I1205 22:56:08.225] pod/test-pod created
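The "(server dry run)" results above come from asking the API server to run admission and validation without persisting the object. On this vintage of kubectl the flag was --server-dry-run (later folded into --dry-run=server); a sketch:
    kubectl apply -f pod.yaml --server-dry-run   # pod.yaml is illustrative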
... skipping 6 lines ...
W1205 22:56:08.327] I1205 22:56:06.851143   55672 event.go:221] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1544050565-896", Name:"test-deployment-retainkeys", UID:"f2b0ab03-f8e0-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"496", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set test-deployment-retainkeys-7495cff5f to 1
W1205 22:56:08.328] I1205 22:56:06.855360   55672 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544050565-896", Name:"test-deployment-retainkeys-7495cff5f", UID:"f307285d-f8e0-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"498", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test-deployment-retainkeys-7495cff5f-z7mbk
I1205 22:56:08.428] pod/test-pod configured (server dry run)
I1205 22:56:08.475] apply.sh:91: Successful get pods test-pod {{.metadata.labels.name}}: test-pod-label
I1205 22:56:08.549] pod "test-pod" deleted
I1205 22:56:08.764] customresourcedefinition.apiextensions.k8s.io/resources.mygroup.example.com created
W1205 22:56:08.865] E1205 22:56:08.768218   52321 autoregister_controller.go:190] v1alpha1.mygroup.example.com failed with : apiservices.apiregistration.k8s.io "v1alpha1.mygroup.example.com" already exists
W1205 22:56:08.944] I1205 22:56:08.943978   52321 clientconn.go:551] parsed scheme: ""
W1205 22:56:08.945] I1205 22:56:08.944011   52321 clientconn.go:557] scheme "" not registered, fallback to default scheme
W1205 22:56:08.945] I1205 22:56:08.944059   52321 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
W1205 22:56:08.945] I1205 22:56:08.944131   52321 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W1205 22:56:08.945] I1205 22:56:08.944760   52321 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W1205 22:56:09.018] I1205 22:56:09.018011   52321 controller.go:608] quota admission added evaluator for: resources.mygroup.example.com
W1205 22:56:09.103] Error from server (NotFound): resources.mygroup.example.com "myobj" not found
I1205 22:56:09.204] kind.mygroup.example.com/myobj created (server dry run)
I1205 22:56:09.204] customresourcedefinition.apiextensions.k8s.io "resources.mygroup.example.com" deleted
I1205 22:56:09.286] apply.sh:129: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I1205 22:56:09.445] pod/a created
I1205 22:56:10.946] apply.sh:134: Successful get pods a {{.metadata.name}}: a
I1205 22:56:11.031] Successful
I1205 22:56:11.032] message:Error from server (NotFound): pods "b" not found
I1205 22:56:11.032] has:pods "b" not found
I1205 22:56:11.175] pod/b created
I1205 22:56:11.189] pod/a pruned
I1205 22:56:12.871] apply.sh:142: Successful get pods b {{.metadata.name}}: b
I1205 22:56:12.951] Successful
I1205 22:56:12.951] message:Error from server (NotFound): pods "a" not found
I1205 22:56:12.951] has:pods "a" not found
I1205 22:56:13.024] pod "b" deleted
I1205 22:56:13.111] apply.sh:152: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I1205 22:56:13.267] pod/a created
I1205 22:56:13.359] apply.sh:157: Successful get pods a {{.metadata.name}}: a
I1205 22:56:13.439] Successful
I1205 22:56:13.439] message:Error from server (NotFound): pods "b" not found
I1205 22:56:13.439] has:pods "b" not found
I1205 22:56:13.583] pod/b created
I1205 22:56:13.672] apply.sh:165: Successful get pods a {{.metadata.name}}: a
I1205 22:56:13.755] apply.sh:166: Successful get pods b {{.metadata.name}}: b
I1205 22:56:13.827] (Bpod "a" deleted
I1205 22:56:13.831] pod "b" deleted
I1205 22:56:13.982] Successful
I1205 22:56:13.983] message:error: all resources selected for prune without explicitly passing --all. To prune all resources, pass the --all flag. If you did not mean to prune all resources, specify a label selector
I1205 22:56:13.983] has:all resources selected for prune without explicitly passing --all
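As asserted above, kubectl refuses to prune without an explicit scope: pruning must be narrowed by a label selector or opted into wholesale with --all. Sketches (directory and label are illustrative):
    kubectl apply -f manifests/ --prune -l name=example   # prune only objects carrying the label
    kubectl apply -f manifests/ --prune --all             # explicitly prune everything previously applied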
I1205 22:56:14.126] pod/a created
I1205 22:56:14.132] pod/b created
I1205 22:56:14.139] service/prune-svc created
I1205 22:56:15.635] apply.sh:178: Successful get pods a {{.metadata.name}}: a
I1205 22:56:15.730] apply.sh:179: Successful get pods b {{.metadata.name}}: b
... skipping 138 lines ...
W1205 22:56:27.977] kubectl run --generator=cronjob/v1beta1 is DEPRECATED and will be removed in a future version. Use kubectl run --generator=run-pod/v1 or kubectl create instead.
W1205 22:56:27.977] I1205 22:56:27.200236   52321 controller.go:608] quota admission added evaluator for: cronjobs.batch
I1205 22:56:28.077] create.sh:30: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I1205 22:56:28.187] pod/selector-test-pod created
I1205 22:56:28.311] create.sh:34: Successful get pods selector-test-pod {{.metadata.labels.name}}: selector-test-pod
I1205 22:56:28.421] Successful
I1205 22:56:28.421] message:Error from server (NotFound): pods "selector-test-pod-dont-apply" not found
I1205 22:56:28.421] has:pods "selector-test-pod-dont-apply" not found
I1205 22:56:28.516] pod "selector-test-pod" deleted
I1205 22:56:28.540] +++ exit code: 0
I1205 22:56:28.575] Recording: run_kubectl_apply_deployments_tests
I1205 22:56:28.576] Running command: run_kubectl_apply_deployments_tests
I1205 22:56:28.596] 
... skipping 26 lines ...
I1205 22:56:30.791] apps.sh:131: Successful get deployments my-depl {{.metadata.labels.l2}}: l2
I1205 22:56:30.905] deployment.extensions "my-depl" deleted
I1205 22:56:30.912] replicaset.extensions "my-depl-559b7bc95d" deleted
I1205 22:56:30.916] replicaset.extensions "my-depl-6676598dcb" deleted
I1205 22:56:30.925] pod "my-depl-559b7bc95d-xpmhm" deleted
I1205 22:56:30.930] pod "my-depl-6676598dcb-9jx9x" deleted
W1205 22:56:31.031] E1205 22:56:30.932419   55672 replica_set.go:450] Sync "namespace-1544050588-14520/my-depl-6676598dcb" failed with replicasets.apps "my-depl-6676598dcb" not found
I1205 22:56:31.131] apps.sh:137: Successful get deployments {{range.items}}{{.metadata.name}}:{{end}}: 
I1205 22:56:31.188] apps.sh:138: Successful get replicasets {{range.items}}{{.metadata.name}}:{{end}}: 
I1205 22:56:31.302] apps.sh:139: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I1205 22:56:31.418] apps.sh:143: Successful get deployments {{range.items}}{{.metadata.name}}:{{end}}: 
I1205 22:56:31.620] deployment.extensions/nginx created
W1205 22:56:31.721] I1205 22:56:31.624733   55672 event.go:221] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1544050588-14520", Name:"nginx", UID:"01ca7ce6-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"691", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-5d56d6b95f to 3
W1205 22:56:31.722] I1205 22:56:31.628952   55672 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544050588-14520", Name:"nginx-5d56d6b95f", UID:"01cb1d9d-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"692", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-5d56d6b95f-hw667
W1205 22:56:31.723] I1205 22:56:31.631983   55672 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544050588-14520", Name:"nginx-5d56d6b95f", UID:"01cb1d9d-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"692", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-5d56d6b95f-8b7pp
W1205 22:56:31.723] I1205 22:56:31.632043   55672 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544050588-14520", Name:"nginx-5d56d6b95f", UID:"01cb1d9d-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"692", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-5d56d6b95f-7rnk7
I1205 22:56:31.824] apps.sh:147: Successful get deployment nginx {{.metadata.name}}: nginx
I1205 22:56:35.991] Successful
I1205 22:56:35.991] message:Error from server (Conflict): error when applying patch:
I1205 22:56:35.992] {"metadata":{"annotations":{"kubectl.kubernetes.io/last-applied-configuration":"{\"apiVersion\":\"extensions/v1beta1\",\"kind\":\"Deployment\",\"metadata\":{\"annotations\":{},\"labels\":{\"name\":\"nginx\"},\"name\":\"nginx\",\"namespace\":\"namespace-1544050588-14520\",\"resourceVersion\":\"99\"},\"spec\":{\"replicas\":3,\"selector\":{\"matchLabels\":{\"name\":\"nginx2\"}},\"template\":{\"metadata\":{\"labels\":{\"name\":\"nginx2\"}},\"spec\":{\"containers\":[{\"image\":\"k8s.gcr.io/nginx:test-cmd\",\"name\":\"nginx\",\"ports\":[{\"containerPort\":80}]}]}}}}\n"},"resourceVersion":"99"},"spec":{"selector":{"matchLabels":{"name":"nginx2"}},"template":{"metadata":{"labels":{"name":"nginx2"}}}}}
I1205 22:56:35.992] to:
I1205 22:56:35.992] Resource: "extensions/v1beta1, Resource=deployments", GroupVersionKind: "extensions/v1beta1, Kind=Deployment"
I1205 22:56:35.992] Name: "nginx", Namespace: "namespace-1544050588-14520"
I1205 22:56:35.993] Object: &{map["metadata":map["annotations":map["deployment.kubernetes.io/revision":"1" "kubectl.kubernetes.io/last-applied-configuration":"{\"apiVersion\":\"extensions/v1beta1\",\"kind\":\"Deployment\",\"metadata\":{\"annotations\":{},\"labels\":{\"name\":\"nginx\"},\"name\":\"nginx\",\"namespace\":\"namespace-1544050588-14520\"},\"spec\":{\"replicas\":3,\"template\":{\"metadata\":{\"labels\":{\"name\":\"nginx1\"}},\"spec\":{\"containers\":[{\"image\":\"k8s.gcr.io/nginx:test-cmd\",\"name\":\"nginx\",\"ports\":[{\"containerPort\":80}]}]}}}}\n"] "name":"nginx" "namespace":"namespace-1544050588-14520" "selfLink":"/apis/extensions/v1beta1/namespaces/namespace-1544050588-14520/deployments/nginx" "resourceVersion":"704" "uid":"01ca7ce6-f8e1-11e8-8d22-0242ac110002" "generation":'\x01' "creationTimestamp":"2018-12-05T22:56:31Z" "labels":map["name":"nginx"]] "spec":map["replicas":'\x03' "selector":map["matchLabels":map["name":"nginx1"]] "template":map["metadata":map["labels":map["name":"nginx1"] "creationTimestamp":<nil>] "spec":map["containers":[map["image":"k8s.gcr.io/nginx:test-cmd" "ports":[map["containerPort":'P' "protocol":"TCP"]] "resources":map[] "terminationMessagePath":"/dev/termination-log" "terminationMessagePolicy":"File" "imagePullPolicy":"IfNotPresent" "name":"nginx"]] "restartPolicy":"Always" "terminationGracePeriodSeconds":'\x1e' "dnsPolicy":"ClusterFirst" "securityContext":map[] "schedulerName":"default-scheduler"]] "strategy":map["type":"RollingUpdate" "rollingUpdate":map["maxUnavailable":'\x01' "maxSurge":'\x01']] "revisionHistoryLimit":%!q(int64=+2147483647) "progressDeadlineSeconds":%!q(int64=+2147483647)] "status":map["updatedReplicas":'\x03' "unavailableReplicas":'\x03' "conditions":[map["reason":"MinimumReplicasUnavailable" "message":"Deployment does not have minimum availability." "type":"Available" "status":"False" "lastUpdateTime":"2018-12-05T22:56:31Z" "lastTransitionTime":"2018-12-05T22:56:31Z"]] "observedGeneration":'\x01' "replicas":'\x03'] "kind":"Deployment" "apiVersion":"extensions/v1beta1"]}
I1205 22:56:35.993] for: "hack/testdata/deployment-label-change2.yaml": Operation cannot be fulfilled on deployments.extensions "nginx": the object has been modified; please apply your changes to the latest version and try again
I1205 22:56:35.994] has:Error from server (Conflict)
W1205 22:56:40.200] E1205 22:56:40.199327   55672 replica_set.go:450] Sync "namespace-1544050588-14520/nginx-5d56d6b95f" failed with Operation cannot be fulfilled on replicasets.apps "nginx-5d56d6b95f": StorageError: invalid object, Code: 4, Key: /registry/replicasets/namespace-1544050588-14520/nginx-5d56d6b95f, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 01cb1d9d-f8e1-11e8-8d22-0242ac110002, UID in object meta: 
I1205 22:56:41.183] deployment.extensions/nginx configured
I1205 22:56:41.266] Successful
I1205 22:56:41.267] message:        "name": "nginx2"
I1205 22:56:41.267]           "name": "nginx2"
I1205 22:56:41.267] has:"name": "nginx2"
W1205 22:56:41.367] I1205 22:56:41.186264   55672 event.go:221] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1544050588-14520", Name:"nginx", UID:"077dc4e0-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"728", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-7777658b9d to 3
... skipping 82 lines ...
I1205 22:56:47.561] +++ [1205 22:56:47] Creating namespace namespace-1544050607-2320
I1205 22:56:47.624] namespace/namespace-1544050607-2320 created
I1205 22:56:47.686] Context "test" modified.
I1205 22:56:47.692] +++ [1205 22:56:47] Testing kubectl get
I1205 22:56:47.773] get.sh:29: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I1205 22:56:47.849] Successful
I1205 22:56:47.849] message:Error from server (NotFound): pods "abc" not found
I1205 22:56:47.849] has:pods "abc" not found
I1205 22:56:47.928] get.sh:37: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I1205 22:56:48.003] Successful
I1205 22:56:48.003] message:Error from server (NotFound): pods "abc" not found
I1205 22:56:48.003] has:pods "abc" not found
I1205 22:56:48.082] get.sh:45: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I1205 22:56:48.157] Successful
I1205 22:56:48.157] message:{
I1205 22:56:48.158]     "apiVersion": "v1",
I1205 22:56:48.158]     "items": [],
... skipping 23 lines ...
I1205 22:56:48.459] has not:No resources found
I1205 22:56:48.533] Successful
I1205 22:56:48.534] message:NAME
I1205 22:56:48.534] has not:No resources found
I1205 22:56:48.611] get.sh:73: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I1205 22:56:48.713] Successful
I1205 22:56:48.713] message:error: the server doesn't have a resource type "foobar"
I1205 22:56:48.713] has not:No resources found
I1205 22:56:48.786] Successful
I1205 22:56:48.786] message:No resources found.
I1205 22:56:48.787] has:No resources found
I1205 22:56:48.858] Successful
I1205 22:56:48.859] message:
I1205 22:56:48.859] has not:No resources found
I1205 22:56:48.934] Successful
I1205 22:56:48.935] message:No resources found.
I1205 22:56:48.935] has:No resources found
I1205 22:56:49.017] get.sh:93: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I1205 22:56:49.093] Successful
I1205 22:56:49.093] message:Error from server (NotFound): pods "abc" not found
I1205 22:56:49.094] has:pods "abc" not found
I1205 22:56:49.095] FAIL!
I1205 22:56:49.095] message:Error from server (NotFound): pods "abc" not found
I1205 22:56:49.095] has not:List
I1205 22:56:49.095] 99 /go/src/k8s.io/kubernetes/test/cmd/../../test/cmd/get.sh
I1205 22:56:49.197] Successful
I1205 22:56:49.197] message:I1205 22:56:49.150226   67901 loader.go:359] Config loaded from file /tmp/tmp.vhGew0eJn9/.kube/config
I1205 22:56:49.197] I1205 22:56:49.150706   67901 loader.go:359] Config loaded from file /tmp/tmp.vhGew0eJn9/.kube/config
I1205 22:56:49.198] I1205 22:56:49.151954   67901 round_trippers.go:438] GET http://127.0.0.1:8080/version?timeout=32s 200 OK in 0 milliseconds
... skipping 995 lines ...
I1205 22:56:52.532] }
I1205 22:56:52.621] get.sh:155: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
I1205 22:56:52.842] <no value>Successful
I1205 22:56:52.842] message:valid-pod:
I1205 22:56:52.842] has:valid-pod:
I1205 22:56:52.918] Successful
I1205 22:56:52.918] message:error: error executing jsonpath "{.missing}": Error executing template: missing is not found. Printing more information for debugging the template:
I1205 22:56:52.919] 	template was:
I1205 22:56:52.919] 		{.missing}
I1205 22:56:52.919] 	object given to jsonpath engine was:
I1205 22:56:52.920] 		map[string]interface {}{"metadata":map[string]interface {}{"name":"valid-pod", "namespace":"namespace-1544050612-14575", "selfLink":"/api/v1/namespaces/namespace-1544050612-14575/pods/valid-pod", "uid":"0e35344d-f8e1-11e8-8d22-0242ac110002", "resourceVersion":"799", "creationTimestamp":"2018-12-05T22:56:52Z", "labels":map[string]interface {}{"name":"valid-pod"}}, "spec":map[string]interface {}{"enableServiceLinks":true, "containers":[]interface {}{map[string]interface {}{"terminationMessagePath":"/dev/termination-log", "terminationMessagePolicy":"File", "imagePullPolicy":"Always", "name":"kubernetes-serve-hostname", "image":"k8s.gcr.io/serve_hostname", "resources":map[string]interface {}{"limits":map[string]interface {}{"cpu":"1", "memory":"512Mi"}, "requests":map[string]interface {}{"cpu":"1", "memory":"512Mi"}}}}, "restartPolicy":"Always", "terminationGracePeriodSeconds":30, "dnsPolicy":"ClusterFirst", "securityContext":map[string]interface {}{}, "schedulerName":"default-scheduler", "priority":0}, "status":map[string]interface {}{"qosClass":"Guaranteed", "phase":"Pending"}, "kind":"Pod", "apiVersion":"v1"}
I1205 22:56:52.920] has:missing is not found
I1205 22:56:52.993] Successful
I1205 22:56:52.993] message:Error executing template: template: output:1:2: executing "output" at <.missing>: map has no entry for key "missing". Printing more information for debugging the template:
I1205 22:56:52.993] 	template was:
I1205 22:56:52.993] 		{{.missing}}
I1205 22:56:52.993] 	raw data was:
I1205 22:56:52.994] 		{"apiVersion":"v1","kind":"Pod","metadata":{"creationTimestamp":"2018-12-05T22:56:52Z","labels":{"name":"valid-pod"},"name":"valid-pod","namespace":"namespace-1544050612-14575","resourceVersion":"799","selfLink":"/api/v1/namespaces/namespace-1544050612-14575/pods/valid-pod","uid":"0e35344d-f8e1-11e8-8d22-0242ac110002"},"spec":{"containers":[{"image":"k8s.gcr.io/serve_hostname","imagePullPolicy":"Always","name":"kubernetes-serve-hostname","resources":{"limits":{"cpu":"1","memory":"512Mi"},"requests":{"cpu":"1","memory":"512Mi"}},"terminationMessagePath":"/dev/termination-log","terminationMessagePolicy":"File"}],"dnsPolicy":"ClusterFirst","enableServiceLinks":true,"priority":0,"restartPolicy":"Always","schedulerName":"default-scheduler","securityContext":{},"terminationGracePeriodSeconds":30},"status":{"phase":"Pending","qosClass":"Guaranteed"}}
I1205 22:56:52.994] 	object given to template engine was:
I1205 22:56:52.995] 		map[spec:map[terminationGracePeriodSeconds:30 containers:[map[image:k8s.gcr.io/serve_hostname imagePullPolicy:Always name:kubernetes-serve-hostname resources:map[limits:map[cpu:1 memory:512Mi] requests:map[cpu:1 memory:512Mi]] terminationMessagePath:/dev/termination-log terminationMessagePolicy:File]] dnsPolicy:ClusterFirst enableServiceLinks:true priority:0 restartPolicy:Always schedulerName:default-scheduler securityContext:map[]] status:map[phase:Pending qosClass:Guaranteed] apiVersion:v1 kind:Pod metadata:map[selfLink:/api/v1/namespaces/namespace-1544050612-14575/pods/valid-pod uid:0e35344d-f8e1-11e8-8d22-0242ac110002 creationTimestamp:2018-12-05T22:56:52Z labels:map[name:valid-pod] name:valid-pod namespace:namespace-1544050612-14575 resourceVersion:799]]
I1205 22:56:52.995] has:map has no entry for key "missing"
W1205 22:56:53.095] error: error executing template "{{.missing}}": template: output:1:2: executing "output" at <.missing>: map has no entry for key "missing"
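Both template failures above are reproducible directly with the corresponding output flags (a sketch; valid-pod as in the log):
    kubectl get pod valid-pod -o jsonpath='{.missing}'
    kubectl get pod valid-pod -o go-template='{{.missing}}'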
W1205 22:56:54.064] E1205 22:56:54.063175   68290 streamwatcher.go:109] Unable to decode an event from the watch stream: net/http: request canceled (Client.Timeout exceeded while reading body)
I1205 22:56:54.164] Successful
I1205 22:56:54.164] message:NAME        READY   STATUS    RESTARTS   AGE
I1205 22:56:54.164] valid-pod   0/1     Pending   0          1s
I1205 22:56:54.165] has:STATUS
I1205 22:56:54.165] Successful
... skipping 80 lines ...
I1205 22:56:56.326]   terminationGracePeriodSeconds: 30
I1205 22:56:56.326] status:
I1205 22:56:56.326]   phase: Pending
I1205 22:56:56.326]   qosClass: Guaranteed
I1205 22:56:56.326] has:name: valid-pod
I1205 22:56:56.326] Successful
I1205 22:56:56.327] message:Error from server (NotFound): pods "invalid-pod" not found
I1205 22:56:56.327] has:"invalid-pod" not found
I1205 22:56:56.375] pod "valid-pod" deleted
I1205 22:56:56.458] get.sh:193: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I1205 22:56:56.591] pod/redis-master created
I1205 22:56:56.595] pod/valid-pod created
I1205 22:56:56.674] Successful
... skipping 305 lines ...
I1205 22:57:00.377] Running command: run_create_secret_tests
I1205 22:57:00.393] 
I1205 22:57:00.395] +++ Running case: test-cmd.run_create_secret_tests 
I1205 22:57:00.398] +++ working dir: /go/src/k8s.io/kubernetes
I1205 22:57:00.399] +++ command: run_create_secret_tests
I1205 22:57:00.478] Successful
I1205 22:57:00.479] message:Error from server (NotFound): secrets "mysecret" not found
I1205 22:57:00.479] has:secrets "mysecret" not found
W1205 22:57:00.579] I1205 22:56:59.637754   52321 clientconn.go:551] parsed scheme: ""
W1205 22:57:00.579] I1205 22:56:59.637785   52321 clientconn.go:557] scheme "" not registered, fallback to default scheme
W1205 22:57:00.579] I1205 22:56:59.637825   52321 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
W1205 22:57:00.580] I1205 22:56:59.637863   52321 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W1205 22:57:00.580] I1205 22:56:59.638241   52321 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W1205 22:57:00.580] No resources found.
W1205 22:57:00.580] No resources found.
I1205 22:57:00.680] Successful
I1205 22:57:00.681] message:Error from server (NotFound): secrets "mysecret" not found
I1205 22:57:00.681] has:secrets "mysecret" not found
I1205 22:57:00.681] Successful
I1205 22:57:00.681] message:user-specified
I1205 22:57:00.681] has:user-specified
I1205 22:57:00.683] Successful
I1205 22:57:00.748] {"kind":"ConfigMap","apiVersion":"v1","metadata":{"name":"tester-create-cm","namespace":"default","selfLink":"/api/v1/namespaces/default/configmaps/tester-create-cm","uid":"1326ce13-f8e1-11e8-8d22-0242ac110002","resourceVersion":"873","creationTimestamp":"2018-12-05T22:57:00Z"}}
... skipping 80 lines ...
I1205 22:57:02.559] has:Timeout exceeded while reading body
I1205 22:57:02.635] Successful
I1205 22:57:02.635] message:NAME        READY   STATUS    RESTARTS   AGE
I1205 22:57:02.636] valid-pod   0/1     Pending   0          1s
I1205 22:57:02.636] has:valid-pod
I1205 22:57:02.699] Successful
I1205 22:57:02.699] message:error: Invalid timeout value. Timeout must be a single integer in seconds, or an integer followed by a corresponding time unit (e.g. 1s | 2m | 3h)
I1205 22:57:02.699] has:Invalid timeout value
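Per the error text above, the timeout parser accepts a bare integer (seconds) or an integer with a time unit. A sketch of an accepted and a rejected value (assuming the harness exercises the --request-timeout flag, as the earlier read-timeout checks suggest):
    kubectl get pod valid-pod --request-timeout=1m       # accepted
    kubectl get pod valid-pod --request-timeout=invalid  # rejected with the error above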
I1205 22:57:02.772] pod "valid-pod" deleted
I1205 22:57:02.790] +++ exit code: 0
I1205 22:57:02.821] Recording: run_crd_tests
I1205 22:57:02.821] Running command: run_crd_tests
I1205 22:57:02.839] 
... skipping 166 lines ...
I1205 22:57:06.739] foo.company.com/test patched
I1205 22:57:06.825] crd.sh:237: Successful get foos/test {{.patched}}: value1
I1205 22:57:06.899] foo.company.com/test patched
I1205 22:57:06.983] crd.sh:239: Successful get foos/test {{.patched}}: value2
I1205 22:57:07.058] foo.company.com/test patched
I1205 22:57:07.141] crd.sh:241: Successful get foos/test {{.patched}}: <no value>
I1205 22:57:07.280] +++ [1205 22:57:07] "kubectl patch --local" returns error as expected for CustomResource: error: cannot apply strategic merge patch for company.com/v1, Kind=Foo locally, try --type merge
I1205 22:57:07.336] {
I1205 22:57:07.336]     "apiVersion": "company.com/v1",
I1205 22:57:07.336]     "kind": "Foo",
I1205 22:57:07.336]     "metadata": {
I1205 22:57:07.336]         "annotations": {
I1205 22:57:07.336]             "kubernetes.io/change-cause": "kubectl patch foos/test --server=http://127.0.0.1:8080 --match-server-version=true --patch={\"patched\":null} --type=merge --record=true"
... skipping 113 lines ...
W1205 22:57:08.772] I1205 22:57:05.210846   52321 controller.go:608] quota admission added evaluator for: foos.company.com
W1205 22:57:08.773] I1205 22:57:08.422732   52321 controller.go:608] quota admission added evaluator for: bars.company.com
W1205 22:57:08.773] /go/src/k8s.io/kubernetes/hack/lib/test.sh: line 264: 70842 Killed                  while [ ${tries} -lt 10 ]; do
W1205 22:57:08.773]     tries=$((tries+1)); kubectl "${kube_flags[@]}" patch bars/test -p "{\"patched\":\"${tries}\"}" --type=merge; sleep 1;
W1205 22:57:08.773] done
W1205 22:57:08.773] /go/src/k8s.io/kubernetes/test/cmd/../../test/cmd/crd.sh: line 295: 70841 Killed                  kubectl "${kube_flags[@]}" get bars --request-timeout=1m --watch-only -o name
W1205 22:57:21.193] E1205 22:57:21.192660   55672 resource_quota_controller.go:437] failed to sync resource monitors: [couldn't start monitor for resource "company.com/v1, Resource=validfoos": unable to monitor quota for resource "company.com/v1, Resource=validfoos", couldn't start monitor for resource "company.com/v1, Resource=foos": unable to monitor quota for resource "company.com/v1, Resource=foos", couldn't start monitor for resource "extensions/v1beta1, Resource=networkpolicies": unable to monitor quota for resource "extensions/v1beta1, Resource=networkpolicies", couldn't start monitor for resource "company.com/v1, Resource=bars": unable to monitor quota for resource "company.com/v1, Resource=bars", couldn't start monitor for resource "mygroup.example.com/v1alpha1, Resource=resources": unable to monitor quota for resource "mygroup.example.com/v1alpha1, Resource=resources"]
W1205 22:57:21.366] I1205 22:57:21.365396   55672 controller_utils.go:1027] Waiting for caches to sync for garbage collector controller
W1205 22:57:21.367] I1205 22:57:21.366800   52321 clientconn.go:551] parsed scheme: ""
W1205 22:57:21.367] I1205 22:57:21.366831   52321 clientconn.go:557] scheme "" not registered, fallback to default scheme
W1205 22:57:21.367] I1205 22:57:21.366880   52321 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
W1205 22:57:21.367] I1205 22:57:21.366952   52321 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W1205 22:57:21.368] I1205 22:57:21.367452   52321 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
... skipping 81 lines ...
I1205 22:57:32.893] +++ [1205 22:57:32] Testing cmd with image
I1205 22:57:32.978] Successful
I1205 22:57:32.979] message:deployment.apps/test1 created
I1205 22:57:32.979] has:deployment.apps/test1 created
I1205 22:57:33.050] deployment.extensions "test1" deleted
I1205 22:57:33.120] Successful
I1205 22:57:33.120] message:error: Invalid image name "InvalidImageName": invalid reference format
I1205 22:57:33.121] has:error: Invalid image name "InvalidImageName": invalid reference format
I1205 22:57:33.134] +++ exit code: 0
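The failure above shows that image references are validated on the client before anything reaches the API server. A sketch of a rejected invocation (mirroring the run commands this suite uses; the name test2 is illustrative):
    kubectl run test2 --image=InvalidImageName   # rejected: invalid reference format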
I1205 22:57:33.164] Recording: run_recursive_resources_tests
I1205 22:57:33.164] Running command: run_recursive_resources_tests
I1205 22:57:33.182] 
I1205 22:57:33.183] +++ Running case: test-cmd.run_recursive_resources_tests 
I1205 22:57:33.185] +++ working dir: /go/src/k8s.io/kubernetes
... skipping 4 lines ...
I1205 22:57:33.331] Context "test" modified.
I1205 22:57:33.415] generic-resources.sh:202: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I1205 22:57:33.653] generic-resources.sh:206: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I1205 22:57:33.655] Successful
I1205 22:57:33.655] message:pod/busybox0 created
I1205 22:57:33.655] pod/busybox1 created
I1205 22:57:33.656] error: error validating "hack/testdata/recursive/pod/pod/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
I1205 22:57:33.656] has:error validating data: kind not set
I1205 22:57:33.738] generic-resources.sh:211: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I1205 22:57:33.902] generic-resources.sh:219: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: busybox:busybox:
I1205 22:57:33.904] Successful
I1205 22:57:33.904] message:error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I1205 22:57:33.904] has:Object 'Kind' is missing
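These recursive tests point kubectl at a directory tree in which one manifest is deliberately broken (its kind key is misspelled "ind"), so every verb reports per-file successes plus exactly one decode error, as seen throughout this block. A sketch using the path the log itself references:
    kubectl create -f hack/testdata/recursive/pod --recursive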
I1205 22:57:33.988] generic-resources.sh:226: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I1205 22:57:34.223] generic-resources.sh:230: Successful get pods {{range.items}}{{.metadata.labels.status}}:{{end}}: replaced:replaced:
I1205 22:57:34.225] Successful
I1205 22:57:34.225] message:pod/busybox0 replaced
I1205 22:57:34.225] pod/busybox1 replaced
I1205 22:57:34.226] error: error validating "hack/testdata/recursive/pod-modify/pod/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
I1205 22:57:34.226] has:error validating data: kind not set
I1205 22:57:34.312] generic-resources.sh:235: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I1205 22:57:34.400] Successful
I1205 22:57:34.400] message:Name:               busybox0
I1205 22:57:34.400] Namespace:          namespace-1544050653-13082
I1205 22:57:34.400] Priority:           0
I1205 22:57:34.401] PriorityClassName:  <none>
... skipping 159 lines ...
I1205 22:57:34.417] has:Object 'Kind' is missing
I1205 22:57:34.492] generic-resources.sh:245: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I1205 22:57:34.663] generic-resources.sh:249: Successful get pods {{range.items}}{{.metadata.annotations.annotatekey}}:{{end}}: annotatevalue:annotatevalue:
I1205 22:57:34.665] Successful
I1205 22:57:34.665] message:pod/busybox0 annotated
I1205 22:57:34.665] pod/busybox1 annotated
I1205 22:57:34.665] error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I1205 22:57:34.665] has:Object 'Kind' is missing
I1205 22:57:34.749] generic-resources.sh:254: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I1205 22:57:34.997] generic-resources.sh:258: Successful get pods {{range.items}}{{.metadata.labels.status}}:{{end}}: replaced:replaced:
I1205 22:57:34.998] Successful
I1205 22:57:34.999] message:Warning: kubectl apply should be used on resource created by either kubectl create --save-config or kubectl apply
I1205 22:57:34.999] pod/busybox0 configured
I1205 22:57:34.999] Warning: kubectl apply should be used on resource created by either kubectl create --save-config or kubectl apply
I1205 22:57:34.999] pod/busybox1 configured
I1205 22:57:34.999] error: error validating "hack/testdata/recursive/pod-modify/pod/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
I1205 22:57:34.999] has:error validating data: kind not set
I1205 22:57:35.079] generic-resources.sh:264: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
I1205 22:57:35.217] deployment.extensions/nginx created
I1205 22:57:35.309] generic-resources.sh:268: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx:
I1205 22:57:35.393] generic-resources.sh:269: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
I1205 22:57:35.549] generic-resources.sh:273: Successful get deployment nginx {{ .apiVersion }}: extensions/v1beta1
I1205 22:57:35.551] Successful
... skipping 42 lines ...
I1205 22:57:35.624] deployment.extensions "nginx" deleted
I1205 22:57:35.714] generic-resources.sh:280: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I1205 22:57:35.867] generic-resources.sh:284: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I1205 22:57:35.869] Successful
I1205 22:57:35.869] message:kubectl convert is DEPRECATED and will be removed in a future version.
I1205 22:57:35.869] In order to convert, kubectl apply the object to the cluster, then kubectl get at the desired version.
I1205 22:57:35.869] error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I1205 22:57:35.869] has:Object 'Kind' is missing
I1205 22:57:35.955] generic-resources.sh:289: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I1205 22:57:36.030] Successful
I1205 22:57:36.030] message:busybox0:busybox1:error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I1205 22:57:36.031] has:busybox0:busybox1:
I1205 22:57:36.032] Successful
I1205 22:57:36.032] message:busybox0:busybox1:error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I1205 22:57:36.032] has:Object 'Kind' is missing
I1205 22:57:36.117] generic-resources.sh:298: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I1205 22:57:36.199] pod/busybox0 labeled pod/busybox1 labeled error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I1205 22:57:36.281] generic-resources.sh:303: Successful get pods {{range.items}}{{.metadata.labels.mylabel}}:{{end}}: myvalue:myvalue:
I1205 22:57:36.283] Successful
I1205 22:57:36.284] message:pod/busybox0 labeled
I1205 22:57:36.284] pod/busybox1 labeled
I1205 22:57:36.284] error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I1205 22:57:36.284] has:Object 'Kind' is missing
I1205 22:57:36.370] generic-resources.sh:308: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I1205 22:57:36.448] pod/busybox0 patched pod/busybox1 patched error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I1205 22:57:36.531] generic-resources.sh:313: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: prom/busybox:prom/busybox:
I1205 22:57:36.533] Successful
I1205 22:57:36.533] message:pod/busybox0 patched
I1205 22:57:36.533] pod/busybox1 patched
I1205 22:57:36.534] error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I1205 22:57:36.534] has:Object 'Kind' is missing
I1205 22:57:36.619] generic-resources.sh:318: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I1205 22:57:36.785] generic-resources.sh:322: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I1205 22:57:36.787] Successful
I1205 22:57:36.787] message:warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
I1205 22:57:36.788] pod "busybox0" force deleted
I1205 22:57:36.788] pod "busybox1" force deleted
I1205 22:57:36.788] error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I1205 22:57:36.788] has:Object 'Kind' is missing
I1205 22:57:36.869] generic-resources.sh:327: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
I1205 22:57:37.008] replicationcontroller/busybox0 created
I1205 22:57:37.011] replicationcontroller/busybox1 created
I1205 22:57:37.103] generic-resources.sh:331: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I1205 22:57:37.189] generic-resources.sh:336: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I1205 22:57:37.271] generic-resources.sh:337: Successful get rc busybox0 {{.spec.replicas}}: 1
I1205 22:57:37.353] generic-resources.sh:338: Successful get rc busybox1 {{.spec.replicas}}: 1
I1205 22:57:37.519] generic-resources.sh:343: Successful get hpa busybox0 {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 1 2 80
I1205 22:57:37.599] generic-resources.sh:344: Successful get hpa busybox1 {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 1 2 80
I1205 22:57:37.601] Successful
I1205 22:57:37.601] message:horizontalpodautoscaler.autoscaling/busybox0 autoscaled
I1205 22:57:37.601] horizontalpodautoscaler.autoscaling/busybox1 autoscaled
I1205 22:57:37.601] error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I1205 22:57:37.601] has:Object 'Kind' is missing
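The HPA bounds verified above (min 1, max 2, target 80%) match an autoscale invocation of this shape (a sketch):
    kubectl autoscale rc busybox0 --min=1 --max=2 --cpu-percent=80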
I1205 22:57:37.674] horizontalpodautoscaler.autoscaling "busybox0" deleted
I1205 22:57:37.751] horizontalpodautoscaler.autoscaling "busybox1" deleted
I1205 22:57:37.842] generic-resources.sh:352: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I1205 22:57:37.925] generic-resources.sh:353: Successful get rc busybox0 {{.spec.replicas}}: 1
I1205 22:57:38.007] generic-resources.sh:354: Successful get rc busybox1 {{.spec.replicas}}: 1
I1205 22:57:38.178] generic-resources.sh:358: Successful get service busybox0 {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: <no value> 80
I1205 22:57:38.261] generic-resources.sh:359: Successful get service busybox1 {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: <no value> 80
I1205 22:57:38.264] Successful
I1205 22:57:38.264] message:service/busybox0 exposed
I1205 22:57:38.264] service/busybox1 exposed
I1205 22:57:38.264] error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I1205 22:57:38.265] has:Object 'Kind' is missing
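The service checks above (an unnamed port 80 on each service) correspond to an expose of this shape (a sketch):
    kubectl expose rc busybox0 --port=80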
I1205 22:57:38.350] generic-resources.sh:365: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I1205 22:57:38.432] generic-resources.sh:366: Successful get rc busybox0 {{.spec.replicas}}: 1
I1205 22:57:38.514] generic-resources.sh:367: Successful get rc busybox1 {{.spec.replicas}}: 1
I1205 22:57:38.693] generic-resources.sh:371: Successful get rc busybox0 {{.spec.replicas}}: 2
I1205 22:57:38.775] generic-resources.sh:372: Successful get rc busybox1 {{.spec.replicas}}: 2
I1205 22:57:38.777] Successful
I1205 22:57:38.777] message:replicationcontroller/busybox0 scaled
I1205 22:57:38.777] replicationcontroller/busybox1 scaled
I1205 22:57:38.778] error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I1205 22:57:38.778] has:Object 'Kind' is missing
I1205 22:57:38.860] generic-resources.sh:377: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I1205 22:57:39.024] generic-resources.sh:381: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I1205 22:57:39.026] Successful
I1205 22:57:39.026] message:warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
I1205 22:57:39.027] replicationcontroller "busybox0" force deleted
I1205 22:57:39.027] replicationcontroller "busybox1" force deleted
I1205 22:57:39.027] error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I1205 22:57:39.027] has:Object 'Kind' is missing
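Note: the "Immediate deletion" warning above is kubectl's standard output for a forced delete, which skips graceful termination. A sketch of the kind of invocation that produces it (exact flags used by the harness are an assumption, not shown in the log):

    # Force-delete everything under the fixture directory, recursing into
    # subdirectories; --grace-period=0 --force triggers the warning above.
    kubectl delete -f hack/testdata/recursive/rc --recursive --grace-period=0 --force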
I1205 22:57:39.108] generic-resources.sh:386: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
I1205 22:57:39.249] deployment.extensions/nginx1-deployment created
I1205 22:57:39.252] deployment.extensions/nginx0-deployment created
I1205 22:57:39.348] generic-resources.sh:390: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx0-deployment:nginx1-deployment:
I1205 22:57:39.430] generic-resources.sh:391: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:k8s.gcr.io/nginx:1.7.9:
I1205 22:57:39.619] generic-resources.sh:395: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:k8s.gcr.io/nginx:1.7.9:
I1205 22:57:39.621] Successful
I1205 22:57:39.621] message:deployment.extensions/nginx1-deployment skipped rollback (current template already matches revision 1)
I1205 22:57:39.621] deployment.extensions/nginx0-deployment skipped rollback (current template already matches revision 1)
I1205 22:57:39.621] error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"extensions/v1beta1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
I1205 22:57:39.622] has:Object 'Kind' is missing
I1205 22:57:39.701] deployment.extensions/nginx1-deployment paused
I1205 22:57:39.704] deployment.extensions/nginx0-deployment paused
I1205 22:57:39.796] generic-resources.sh:402: Successful get deployment {{range.items}}{{.spec.paused}}:{{end}}: true:true:
I1205 22:57:39.798] Successful
I1205 22:57:39.799] message:unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"extensions/v1beta1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
I1205 22:57:39.799] has:Object 'Kind' is missing
I1205 22:57:39.880] deployment.extensions/nginx1-deployment resumed
I1205 22:57:39.882] deployment.extensions/nginx0-deployment resumed
I1205 22:57:39.977] generic-resources.sh:408: Successful get deployment {{range.items}}{{.spec.paused}}:{{end}}: <no value>:<no value>:
I1205 22:57:39.979] Successful
I1205 22:57:39.979] message:unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"extensions/v1beta1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
I1205 22:57:39.980] has:Object 'Kind' is missing
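Note: after the resume, generic-resources.sh:408 reads .spec.paused as <no value> rather than false, because resuming clears the field from the spec instead of setting it. A sketch of the pause/resume round-trip being exercised (directory path taken from the errors above; use of -f/--recursive here is an assumption):

    kubectl rollout pause -f hack/testdata/recursive/deployment --recursive
    kubectl get deployment -o go-template='{{range.items}}{{.spec.paused}}:{{end}}'    # true:true:
    kubectl rollout resume -f hack/testdata/recursive/deployment --recursive
    kubectl get deployment -o go-template='{{range.items}}{{.spec.paused}}:{{end}}'    # <no value>:<no value>: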
W1205 22:57:40.080] Error from server (NotFound): namespaces "non-native-resources" not found
W1205 22:57:40.080] kubectl run --generator=deployment/apps.v1 is DEPRECATED and will be removed in a future version. Use kubectl run --generator=run-pod/v1 or kubectl create instead.
W1205 22:57:40.081] I1205 22:57:32.968359   55672 event.go:221] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1544050652-14266", Name:"test1", UID:"265b0467-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"984", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set test1-fb488bd5d to 1
W1205 22:57:40.081] I1205 22:57:32.972907   55672 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544050652-14266", Name:"test1-fb488bd5d", UID:"265b880e-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"985", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test1-fb488bd5d-n89wp
W1205 22:57:40.081] I1205 22:57:35.220841   55672 event.go:221] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1544050653-13082", Name:"nginx", UID:"27b2a6a3-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1009", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-6f6bb85d9c to 3
W1205 22:57:40.081] I1205 22:57:35.223290   55672 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544050653-13082", Name:"nginx-6f6bb85d9c", UID:"27b334ac-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1010", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-6f6bb85d9c-2fkcb
W1205 22:57:40.082] I1205 22:57:35.225039   55672 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544050653-13082", Name:"nginx-6f6bb85d9c", UID:"27b334ac-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1010", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-6f6bb85d9c-n5b6w
W1205 22:57:40.082] I1205 22:57:35.225914   55672 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544050653-13082", Name:"nginx-6f6bb85d9c", UID:"27b334ac-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1010", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-6f6bb85d9c-kgwsk
W1205 22:57:40.082] kubectl convert is DEPRECATED and will be removed in a future version.
W1205 22:57:40.082] In order to convert, kubectl apply the object to the cluster, then kubectl get at the desired version.
W1205 22:57:40.082] I1205 22:57:37.010430   55672 event.go:221] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1544050653-13082", Name:"busybox0", UID:"28c3c604-f8e1-11e8-8d22-0242ac110002", APIVersion:"v1", ResourceVersion:"1040", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox0-wpncv
W1205 22:57:40.082] error: error validating "hack/testdata/recursive/rc/rc/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
W1205 22:57:40.083] I1205 22:57:37.014195   55672 event.go:221] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1544050653-13082", Name:"busybox1", UID:"28c48860-f8e1-11e8-8d22-0242ac110002", APIVersion:"v1", ResourceVersion:"1042", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox1-lzxcj
W1205 22:57:40.083] I1205 22:57:37.134742   55672 namespace_controller.go:171] Namespace has been deleted non-native-resources
W1205 22:57:40.083] I1205 22:57:38.601678   55672 event.go:221] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1544050653-13082", Name:"busybox0", UID:"28c3c604-f8e1-11e8-8d22-0242ac110002", APIVersion:"v1", ResourceVersion:"1062", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox0-8j46z
W1205 22:57:40.083] I1205 22:57:38.608888   55672 event.go:221] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1544050653-13082", Name:"busybox1", UID:"28c48860-f8e1-11e8-8d22-0242ac110002", APIVersion:"v1", ResourceVersion:"1066", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox1-pr9q6
W1205 22:57:40.084] I1205 22:57:39.251986   55672 event.go:221] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1544050653-13082", Name:"nginx1-deployment", UID:"2a19d1e8-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1082", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx1-deployment-75f6fc6747 to 2
W1205 22:57:40.084] error: error validating "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
W1205 22:57:40.084] I1205 22:57:39.254466   55672 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544050653-13082", Name:"nginx1-deployment-75f6fc6747", UID:"2a1a5760-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1083", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx1-deployment-75f6fc6747-nr6ql
W1205 22:57:40.084] I1205 22:57:39.256884   55672 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544050653-13082", Name:"nginx1-deployment-75f6fc6747", UID:"2a1a5760-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1083", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx1-deployment-75f6fc6747-9wwkm
W1205 22:57:40.085] I1205 22:57:39.256963   55672 event.go:221] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1544050653-13082", Name:"nginx0-deployment", UID:"2a1a7913-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1084", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx0-deployment-b6bb4ccbb to 2
W1205 22:57:40.085] I1205 22:57:39.260653   55672 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544050653-13082", Name:"nginx0-deployment-b6bb4ccbb", UID:"2a1b0555-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1088", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx0-deployment-b6bb4ccbb-crw6x
W1205 22:57:40.085] I1205 22:57:39.262916   55672 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544050653-13082", Name:"nginx0-deployment-b6bb4ccbb", UID:"2a1b0555-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1088", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx0-deployment-b6bb4ccbb-xxjds
W1205 22:57:40.153] warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
W1205 22:57:40.166] error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"extensions/v1beta1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
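Note: two deprecations surface in the warning stream above, and each message names its own replacement. A sketch of the suggested non-deprecated equivalents (image name and object choice are illustrative):

    # Instead of `kubectl run --generator=deployment/apps.v1 ...`:
    kubectl create deployment test1 --image=k8s.gcr.io/nginx:1.7.9
    # Instead of `kubectl convert`: apply the object, then read it back
    # at the desired version via a fully-qualified resource name.
    kubectl get deployments.v1.apps test1 -o yaml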
I1205 22:57:40.266] Successful
I1205 22:57:40.267] message:deployment.extensions/nginx1-deployment 
I1205 22:57:40.267] REVISION  CHANGE-CAUSE
I1205 22:57:40.267] 1         <none>
I1205 22:57:40.267] 
I1205 22:57:40.267] deployment.extensions/nginx0-deployment 
I1205 22:57:40.267] REVISION  CHANGE-CAUSE
I1205 22:57:40.267] 1         <none>
I1205 22:57:40.267] 
I1205 22:57:40.268] error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"extensions/v1beta1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
I1205 22:57:40.268] has:nginx0-deployment
I1205 22:57:40.268] Successful
I1205 22:57:40.268] message:deployment.extensions/nginx1-deployment 
I1205 22:57:40.268] REVISION  CHANGE-CAUSE
I1205 22:57:40.268] 1         <none>
I1205 22:57:40.268] 
I1205 22:57:40.268] deployment.extensions/nginx0-deployment 
I1205 22:57:40.268] REVISION  CHANGE-CAUSE
I1205 22:57:40.268] 1         <none>
I1205 22:57:40.268] 
I1205 22:57:40.269] error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"extensions/v1beta1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
I1205 22:57:40.269] has:nginx1-deployment
I1205 22:57:40.269] Successful
I1205 22:57:40.269] message:deployment.extensions/nginx1-deployment 
I1205 22:57:40.269] REVISION  CHANGE-CAUSE
I1205 22:57:40.269] 1         <none>
I1205 22:57:40.269] 
I1205 22:57:40.269] deployment.extensions/nginx0-deployment 
I1205 22:57:40.269] REVISION  CHANGE-CAUSE
I1205 22:57:40.269] 1         <none>
I1205 22:57:40.269] 
I1205 22:57:40.270] error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"extensions/v1beta1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
I1205 22:57:40.270] has:Object 'Kind' is missing
I1205 22:57:40.270] deployment.extensions "nginx1-deployment" force deleted
I1205 22:57:40.270] deployment.extensions "nginx0-deployment" force deleted
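Note: the CHANGE-CAUSE column in the rollout history above reads <none> because nothing recorded a cause for revision 1; kubectl fills it from the kubernetes.io/change-cause annotation. A sketch (annotation value is illustrative):

    kubectl annotate deployment/nginx1-deployment kubernetes.io/change-cause="initial rollout"
    kubectl rollout history deployment/nginx1-deployment   # CHANGE-CAUSE now shows "initial rollout"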
I1205 22:57:41.255] generic-resources.sh:424: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
I1205 22:57:41.394] replicationcontroller/busybox0 created
I1205 22:57:41.397] replicationcontroller/busybox1 created
... skipping 7 lines ...
I1205 22:57:41.579] message:no rollbacker has been implemented for "ReplicationController"
I1205 22:57:41.580] no rollbacker has been implemented for "ReplicationController"
I1205 22:57:41.580] unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I1205 22:57:41.580] has:Object 'Kind' is missing
I1205 22:57:41.663] Successful
I1205 22:57:41.663] message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I1205 22:57:41.663] error: replicationcontrollers "busybox0" pausing is not supported
I1205 22:57:41.664] error: replicationcontrollers "busybox1" pausing is not supported
I1205 22:57:41.664] has:Object 'Kind' is missing
I1205 22:57:41.665] Successful
I1205 22:57:41.665] message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I1205 22:57:41.665] error: replicationcontrollers "busybox0" pausing is not supported
I1205 22:57:41.665] error: replicationcontrollers "busybox1" pausing is not supported
I1205 22:57:41.665] has:replicationcontrollers "busybox0" pausing is not supported
I1205 22:57:41.667] Successful
I1205 22:57:41.667] message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I1205 22:57:41.667] error: replicationcontrollers "busybox0" pausing is not supported
I1205 22:57:41.668] error: replicationcontrollers "busybox1" pausing is not supported
I1205 22:57:41.668] has:replicationcontrollers "busybox1" pausing is not supported
I1205 22:57:41.753] Successful
I1205 22:57:41.753] message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I1205 22:57:41.754] error: replicationcontrollers "busybox0" resuming is not supported
I1205 22:57:41.754] error: replicationcontrollers "busybox1" resuming is not supported
I1205 22:57:41.754] has:Object 'Kind' is missing
I1205 22:57:41.754] Successful
I1205 22:57:41.755] message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I1205 22:57:41.755] error: replicationcontrollers "busybox0" resuming is not supported
I1205 22:57:41.755] error: replicationcontrollers "busybox1" resuming is not supported
I1205 22:57:41.755] has:replicationcontrollers "busybox0" resuming is not supported
I1205 22:57:41.756] Successful
I1205 22:57:41.756] message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I1205 22:57:41.757] error: replicationcontrollers "busybox0" resuming is not supported
I1205 22:57:41.757] error: replicationcontrollers "busybox1" resuming is not supported
I1205 22:57:41.757] has:replicationcontrollers "busybox0" resuming is not supported
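Note: the "pausing/resuming is not supported" errors above reflect that rollout pause and resume are only implemented for kinds with rollout semantics (Deployments here), not ReplicationControllers; the same verb succeeds or fails depending on the kind. A sketch of the contrast:

    kubectl rollout pause deployment/nginx1-deployment   # succeeds: "... paused"
    kubectl rollout pause rc/busybox0                    # error: replicationcontrollers "busybox0" pausing is not supported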
I1205 22:57:41.828] replicationcontroller "busybox0" force deleted
I1205 22:57:41.832] replicationcontroller "busybox1" force deleted
W1205 22:57:41.932] error: error validating "hack/testdata/recursive/rc/rc/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
W1205 22:57:41.933] I1205 22:57:41.397309   55672 event.go:221] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1544050653-13082", Name:"busybox0", UID:"2b611e07-f8e1-11e8-8d22-0242ac110002", APIVersion:"v1", ResourceVersion:"1127", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox0-b88wp
W1205 22:57:41.933] I1205 22:57:41.399692   55672 event.go:221] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1544050653-13082", Name:"busybox1", UID:"2b61c22d-f8e1-11e8-8d22-0242ac110002", APIVersion:"v1", ResourceVersion:"1129", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox1-dx77g
W1205 22:57:41.933] warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
W1205 22:57:41.934] error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I1205 22:57:42.854] +++ exit code: 0
I1205 22:57:42.884] Recording: run_namespace_tests
I1205 22:57:42.885] Running command: run_namespace_tests
I1205 22:57:42.904] 
I1205 22:57:42.906] +++ Running case: test-cmd.run_namespace_tests 
I1205 22:57:42.908] +++ working dir: /go/src/k8s.io/kubernetes
I1205 22:57:42.910] +++ command: run_namespace_tests
I1205 22:57:42.919] +++ [1205 22:57:42] Testing kubectl(v1:namespaces)
I1205 22:57:42.984] namespace/my-namespace created
I1205 22:57:43.070] core.sh:1295: Successful get namespaces/my-namespace {{.metadata.name}}: my-namespace
I1205 22:57:43.142] namespace "my-namespace" deleted
I1205 22:57:48.252] namespace/my-namespace condition met
I1205 22:57:48.330] Successful
I1205 22:57:48.330] message:Error from server (NotFound): namespaces "my-namespace" not found
I1205 22:57:48.330] has: not found
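Note: the five-second gap between "deleted" and "condition met" above is the namespace finalizer running; the harness appears to block on deletion before asserting NotFound. A sketch of that wait (timeout value assumed):

    kubectl delete namespace my-namespace
    kubectl wait --for=delete namespace/my-namespace --timeout=60s   # prints "namespace/my-namespace condition met"
    kubectl get namespaces/my-namespace                              # Error from server (NotFound)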
I1205 22:57:48.435] core.sh:1310: Successful get namespaces {{range.items}}{{ if eq $id_field \"other\" }}found{{end}}{{end}}:: :
I1205 22:57:48.500] namespace/other created
I1205 22:57:48.583] core.sh:1314: Successful get namespaces/other {{.metadata.name}}: other
I1205 22:57:48.663] core.sh:1318: Successful get pods --namespace=other {{range.items}}{{.metadata.name}}:{{end}}: 
I1205 22:57:48.798] pod/valid-pod created
I1205 22:57:48.887] core.sh:1322: Successful get pods --namespace=other {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
I1205 22:57:48.968] core.sh:1324: Successful get pods -n other {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
I1205 22:57:49.043] Successful
I1205 22:57:49.043] message:error: a resource cannot be retrieved by name across all namespaces
I1205 22:57:49.043] has:a resource cannot be retrieved by name across all namespaces
I1205 22:57:49.122] core.sh:1331: Successful get pods --namespace=other {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
I1205 22:57:49.195] pod "valid-pod" force deleted
I1205 22:57:49.279] core.sh:1335: Successful get pods --namespace=other {{range.items}}{{.metadata.name}}:{{end}}: 
I1205 22:57:49.348] namespace "other" deleted
W1205 22:57:49.448] warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
W1205 22:57:51.199] E1205 22:57:51.198326   55672 resource_quota_controller.go:437] failed to sync resource monitors: couldn't start monitor for resource "extensions/v1beta1, Resource=networkpolicies": unable to monitor quota for resource "extensions/v1beta1, Resource=networkpolicies"
W1205 22:57:51.489] I1205 22:57:51.488377   55672 controller_utils.go:1027] Waiting for caches to sync for garbage collector controller
W1205 22:57:51.589] I1205 22:57:51.588720   55672 controller_utils.go:1034] Caches are synced for garbage collector controller
W1205 22:57:52.430] I1205 22:57:52.429713   55672 horizontal.go:309] Horizontal Pod Autoscaler busybox0 has been deleted in namespace-1544050653-13082
W1205 22:57:52.434] I1205 22:57:52.433783   55672 horizontal.go:309] Horizontal Pod Autoscaler busybox1 has been deleted in namespace-1544050653-13082
W1205 22:57:53.249] I1205 22:57:53.248854   55672 namespace_controller.go:171] Namespace has been deleted my-namespace
I1205 22:57:54.467] +++ exit code: 0
... skipping 113 lines ...
I1205 22:58:09.178] +++ command: run_client_config_tests
I1205 22:58:09.188] +++ [1205 22:58:09] Creating namespace namespace-1544050689-20606
I1205 22:58:09.252] namespace/namespace-1544050689-20606 created
I1205 22:58:09.318] Context "test" modified.
I1205 22:58:09.324] +++ [1205 22:58:09] Testing client config
I1205 22:58:09.385] Successful
I1205 22:58:09.386] message:error: stat missing: no such file or directory
I1205 22:58:09.386] has:missing: no such file or directory
I1205 22:58:09.446] Successful
I1205 22:58:09.446] message:error: stat missing: no such file or directory
I1205 22:58:09.446] has:missing: no such file or directory
I1205 22:58:09.506] Successful
I1205 22:58:09.507] message:error: stat missing: no such file or directory
I1205 22:58:09.507] has:missing: no such file or directory
I1205 22:58:09.567] Successful
I1205 22:58:09.568] message:Error in configuration: context was not found for specified context: missing-context
I1205 22:58:09.568] has:context was not found for specified context: missing-context
I1205 22:58:09.628] Successful
I1205 22:58:09.628] message:error: no server found for cluster "missing-cluster"
I1205 22:58:09.629] has:no server found for cluster "missing-cluster"
I1205 22:58:09.689] Successful
I1205 22:58:09.689] message:error: auth info "missing-user" does not exist
I1205 22:58:09.689] has:auth info "missing-user" does not exist
I1205 22:58:09.811] Successful
I1205 22:58:09.811] message:error: Error loading config file "/tmp/newconfig.yaml": no kind "Config" is registered for version "v-1" in scheme "k8s.io/client-go/tools/clientcmd/api/latest/latest.go:50"
I1205 22:58:09.812] has:Error loading config file
I1205 22:58:09.872] Successful
I1205 22:58:09.873] message:error: stat missing-config: no such file or directory
I1205 22:58:09.873] has:no such file or directory
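Note: each client-config failure above maps to one misconfigured flag. A sketch of the invocations behind these checks (the resource queried is arbitrary):

    kubectl get pods --kubeconfig=missing        # error: stat missing: no such file or directory
    kubectl get pods --context=missing-context   # Error in configuration: context was not found for specified context
    kubectl get pods --cluster=missing-cluster   # error: no server found for cluster "missing-cluster"
    kubectl get pods --user=missing-user         # error: auth info "missing-user" does not exist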
I1205 22:58:09.884] +++ exit code: 0
I1205 22:58:09.913] Recording: run_service_accounts_tests
I1205 22:58:09.913] Running command: run_service_accounts_tests
I1205 22:58:09.932] 
I1205 22:58:09.933] +++ Running case: test-cmd.run_service_accounts_tests 
... skipping 76 lines ...
I1205 22:58:17.028]                 job-name=test-job
I1205 22:58:17.028]                 run=pi
I1205 22:58:17.028] Annotations:    cronjob.kubernetes.io/instantiate: manual
I1205 22:58:17.028] Parallelism:    1
I1205 22:58:17.028] Completions:    1
I1205 22:58:17.029] Start Time:     Wed, 05 Dec 2018 22:58:16 +0000
I1205 22:58:17.029] Pods Statuses:  1 Running / 0 Succeeded / 0 Failed
I1205 22:58:17.029] Pod Template:
I1205 22:58:17.029]   Labels:  controller-uid=407943d6-f8e1-11e8-8d22-0242ac110002
I1205 22:58:17.029]            job-name=test-job
I1205 22:58:17.029]            run=pi
I1205 22:58:17.029]   Containers:
I1205 22:58:17.029]    pi:
... skipping 329 lines ...
I1205 22:58:26.266]   selector:
I1205 22:58:26.266]     role: padawan
I1205 22:58:26.266]   sessionAffinity: None
I1205 22:58:26.266]   type: ClusterIP
I1205 22:58:26.266] status:
I1205 22:58:26.266]   loadBalancer: {}
W1205 22:58:26.366] error: you must specify resources by --filename when --local is set.
W1205 22:58:26.366] Example resource specifications include:
W1205 22:58:26.367]    '-f rsrc.yaml'
W1205 22:58:26.367]    '--filename=rsrc.json'
I1205 22:58:26.467] core.sh:886: Successful get services redis-master {{range.spec.selector}}{{.}}:{{end}}: redis:master:backend:
I1205 22:58:26.563] core.sh:893: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:
I1205 22:58:26.638] service "redis-master" deleted
... skipping 93 lines ...
I1205 22:58:32.042] apps.sh:80: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
I1205 22:58:32.126] apps.sh:81: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 2
I1205 22:58:32.227] daemonset.extensions/bind rolled back
I1205 22:58:32.316] apps.sh:84: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:2.0:
I1205 22:58:32.401] apps.sh:85: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
I1205 22:58:32.495] Successful
I1205 22:58:32.496] message:error: unable to find specified revision 1000000 in history
I1205 22:58:32.496] has:unable to find specified revision
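Note: the "unable to find specified revision" error above comes from asking for a rollback target that is not in the DaemonSet's revision history. A sketch:

    kubectl rollout undo daemonset/bind --to-revision=1000000   # error: unable to find specified revision 1000000 in history
    kubectl rollout undo daemonset/bind --to-revision=1         # succeeds when revision 1 exists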
I1205 22:58:32.582] apps.sh:89: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:2.0:
I1205 22:58:32.665] apps.sh:90: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
I1205 22:58:32.762] daemonset.extensions/bind rolled back
I1205 22:58:32.854] apps.sh:93: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:latest:
I1205 22:58:32.934] apps.sh:94: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
... skipping 22 lines ...
I1205 22:58:34.132] Namespace:    namespace-1544050713-13687
I1205 22:58:34.132] Selector:     app=guestbook,tier=frontend
I1205 22:58:34.132] Labels:       app=guestbook
I1205 22:58:34.132]               tier=frontend
I1205 22:58:34.132] Annotations:  <none>
I1205 22:58:34.133] Replicas:     3 current / 3 desired
I1205 22:58:34.133] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I1205 22:58:34.133] Pod Template:
I1205 22:58:34.133]   Labels:  app=guestbook
I1205 22:58:34.133]            tier=frontend
I1205 22:58:34.133]   Containers:
I1205 22:58:34.133]    php-redis:
I1205 22:58:34.133]     Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 17 lines ...
I1205 22:58:34.239] Namespace:    namespace-1544050713-13687
I1205 22:58:34.239] Selector:     app=guestbook,tier=frontend
I1205 22:58:34.239] Labels:       app=guestbook
I1205 22:58:34.239]               tier=frontend
I1205 22:58:34.239] Annotations:  <none>
I1205 22:58:34.240] Replicas:     3 current / 3 desired
I1205 22:58:34.240] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I1205 22:58:34.240] Pod Template:
I1205 22:58:34.240]   Labels:  app=guestbook
I1205 22:58:34.240]            tier=frontend
I1205 22:58:34.240]   Containers:
I1205 22:58:34.240]    php-redis:
I1205 22:58:34.240]     Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 18 lines ...
I1205 22:58:34.331] Namespace:    namespace-1544050713-13687
I1205 22:58:34.331] Selector:     app=guestbook,tier=frontend
I1205 22:58:34.332] Labels:       app=guestbook
I1205 22:58:34.332]               tier=frontend
I1205 22:58:34.332] Annotations:  <none>
I1205 22:58:34.332] Replicas:     3 current / 3 desired
I1205 22:58:34.332] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I1205 22:58:34.332] Pod Template:
I1205 22:58:34.332]   Labels:  app=guestbook
I1205 22:58:34.332]            tier=frontend
I1205 22:58:34.332]   Containers:
I1205 22:58:34.332]    php-redis:
I1205 22:58:34.333]     Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 4 lines ...
I1205 22:58:34.333]       memory:  100Mi
I1205 22:58:34.333]     Environment:
I1205 22:58:34.333]       GET_HOSTS_FROM:  dns
I1205 22:58:34.333]     Mounts:            <none>
I1205 22:58:34.333]   Volumes:             <none>
I1205 22:58:34.333]
W1205 22:58:34.434] E1205 22:58:32.233396   55672 daemon_controller.go:303] namespace-1544050710-10719/bind failed with : failed to construct revisions of DaemonSet: Operation cannot be fulfilled on controllerrevisions.apps "bind-7c544bbdd7": the object has been modified; please apply your changes to the latest version and try again
W1205 22:58:34.434] I1205 22:58:33.532745   55672 event.go:221] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1544050713-13687", Name:"frontend", UID:"4a740e8d-f8e1-11e8-8d22-0242ac110002", APIVersion:"v1", ResourceVersion:"1354", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-n7tsj
W1205 22:58:34.435] I1205 22:58:33.535520   55672 event.go:221] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1544050713-13687", Name:"frontend", UID:"4a740e8d-f8e1-11e8-8d22-0242ac110002", APIVersion:"v1", ResourceVersion:"1354", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-l6k8f
W1205 22:58:34.435] I1205 22:58:33.535768   55672 event.go:221] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1544050713-13687", Name:"frontend", UID:"4a740e8d-f8e1-11e8-8d22-0242ac110002", APIVersion:"v1", ResourceVersion:"1354", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-bnxv5
W1205 22:58:34.435] I1205 22:58:33.915742   55672 event.go:221] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1544050713-13687", Name:"frontend", UID:"4aae601a-f8e1-11e8-8d22-0242ac110002", APIVersion:"v1", ResourceVersion:"1370", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-wbpkm
W1205 22:58:34.436] I1205 22:58:33.918787   55672 event.go:221] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1544050713-13687", Name:"frontend", UID:"4aae601a-f8e1-11e8-8d22-0242ac110002", APIVersion:"v1", ResourceVersion:"1370", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-hs8k7
W1205 22:58:34.436] I1205 22:58:33.918833   55672 event.go:221] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1544050713-13687", Name:"frontend", UID:"4aae601a-f8e1-11e8-8d22-0242ac110002", APIVersion:"v1", ResourceVersion:"1370", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-ncl5z
... skipping 2 lines ...
I1205 22:58:34.537] Namespace:    namespace-1544050713-13687
I1205 22:58:34.537] Selector:     app=guestbook,tier=frontend
I1205 22:58:34.537] Labels:       app=guestbook
I1205 22:58:34.537]               tier=frontend
I1205 22:58:34.537] Annotations:  <none>
I1205 22:58:34.537] Replicas:     3 current / 3 desired
I1205 22:58:34.538] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I1205 22:58:34.538] Pod Template:
I1205 22:58:34.538]   Labels:  app=guestbook
I1205 22:58:34.538]            tier=frontend
I1205 22:58:34.538]   Containers:
I1205 22:58:34.538]    php-redis:
I1205 22:58:34.538]     Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 18 lines ...
I1205 22:58:34.571] Namespace:    namespace-1544050713-13687
I1205 22:58:34.571] Selector:     app=guestbook,tier=frontend
I1205 22:58:34.572] Labels:       app=guestbook
I1205 22:58:34.572]               tier=frontend
I1205 22:58:34.572] Annotations:  <none>
I1205 22:58:34.572] Replicas:     3 current / 3 desired
I1205 22:58:34.572] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I1205 22:58:34.572] Pod Template:
I1205 22:58:34.572]   Labels:  app=guestbook
I1205 22:58:34.572]            tier=frontend
I1205 22:58:34.572]   Containers:
I1205 22:58:34.572]    php-redis:
I1205 22:58:34.573]     Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 17 lines ...
I1205 22:58:34.672] Namespace:    namespace-1544050713-13687
I1205 22:58:34.672] Selector:     app=guestbook,tier=frontend
I1205 22:58:34.672] Labels:       app=guestbook
I1205 22:58:34.673]               tier=frontend
I1205 22:58:34.673] Annotations:  <none>
I1205 22:58:34.673] Replicas:     3 current / 3 desired
I1205 22:58:34.673] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I1205 22:58:34.673] Pod Template:
I1205 22:58:34.673]   Labels:  app=guestbook
I1205 22:58:34.673]            tier=frontend
I1205 22:58:34.673]   Containers:
I1205 22:58:34.673]    php-redis:
I1205 22:58:34.673]     Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 17 lines ...
I1205 22:58:34.768] Namespace:    namespace-1544050713-13687
I1205 22:58:34.768] Selector:     app=guestbook,tier=frontend
I1205 22:58:34.769] Labels:       app=guestbook
I1205 22:58:34.769]               tier=frontend
I1205 22:58:34.769] Annotations:  <none>
I1205 22:58:34.769] Replicas:     3 current / 3 desired
I1205 22:58:34.769] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I1205 22:58:34.769] Pod Template:
I1205 22:58:34.769]   Labels:  app=guestbook
I1205 22:58:34.769]            tier=frontend
I1205 22:58:34.769]   Containers:
I1205 22:58:34.769]    php-redis:
I1205 22:58:34.770]     Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 11 lines ...
I1205 22:58:34.866] Namespace:    namespace-1544050713-13687
I1205 22:58:34.866] Selector:     app=guestbook,tier=frontend
I1205 22:58:34.866] Labels:       app=guestbook
I1205 22:58:34.866]               tier=frontend
I1205 22:58:34.867] Annotations:  <none>
I1205 22:58:34.867] Replicas:     3 current / 3 desired
I1205 22:58:34.867] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I1205 22:58:34.867] Pod Template:
I1205 22:58:34.867]   Labels:  app=guestbook
I1205 22:58:34.867]            tier=frontend
I1205 22:58:34.867]   Containers:
I1205 22:58:34.867]    php-redis:
I1205 22:58:34.867]     Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 22 lines ...
I1205 22:58:35.626] core.sh:1061: Successful get rc frontend {{.spec.replicas}}: 3
I1205 22:58:35.710] core.sh:1065: Successful get rc frontend {{.spec.replicas}}: 3
I1205 22:58:35.788] replicationcontroller/frontend scaled
I1205 22:58:35.876] core.sh:1069: Successful get rc frontend {{.spec.replicas}}: 2
I1205 22:58:35.952] replicationcontroller "frontend" deleted
W1205 22:58:36.053] I1205 22:58:35.037267   55672 event.go:221] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1544050713-13687", Name:"frontend", UID:"4aae601a-f8e1-11e8-8d22-0242ac110002", APIVersion:"v1", ResourceVersion:"1380", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: frontend-wbpkm
W1205 22:58:36.053] error: Expected replicas to be 3, was 2
W1205 22:58:36.054] I1205 22:58:35.541574   55672 event.go:221] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1544050713-13687", Name:"frontend", UID:"4aae601a-f8e1-11e8-8d22-0242ac110002", APIVersion:"v1", ResourceVersion:"1386", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-rnwtw
W1205 22:58:36.054] I1205 22:58:35.793960   55672 event.go:221] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1544050713-13687", Name:"frontend", UID:"4aae601a-f8e1-11e8-8d22-0242ac110002", APIVersion:"v1", ResourceVersion:"1391", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: frontend-rnwtw
W1205 22:58:36.099] I1205 22:58:36.098836   55672 event.go:221] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1544050713-13687", Name:"redis-master", UID:"4bfbf53c-f8e1-11e8-8d22-0242ac110002", APIVersion:"v1", ResourceVersion:"1403", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-master-h7wct
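Note: the "Expected replicas to be 3, was 2" error in the warning stream above is a scale precondition failure: kubectl scale can assert the current size before resizing. A sketch:

    kubectl scale rc frontend --current-replicas=3 --replicas=2   # fails unless frontend currently has 3 replicas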
I1205 22:58:36.200] replicationcontroller/redis-master created
I1205 22:58:36.241] replicationcontroller/redis-slave created
I1205 22:58:36.334] replicationcontroller/redis-master scaled
... skipping 29 lines ...
I1205 22:58:37.719] service "expose-test-deployment" deleted
I1205 22:58:37.810] Successful
I1205 22:58:37.810] message:service/expose-test-deployment exposed
I1205 22:58:37.810] has:service/expose-test-deployment exposed
I1205 22:58:37.884] service "expose-test-deployment" deleted
I1205 22:58:37.968] Successful
I1205 22:58:37.968] message:error: couldn't retrieve selectors via --selector flag or introspection: invalid deployment: no selectors, therefore cannot be exposed
I1205 22:58:37.969] See 'kubectl expose -h' for help and examples
I1205 22:58:37.969] has:invalid deployment: no selectors
I1205 22:58:38.046] Successful
I1205 22:58:38.047] message:error: couldn't retrieve selectors via --selector flag or introspection: invalid deployment: no selectors, therefore cannot be exposed
I1205 22:58:38.047] See 'kubectl expose -h' for help and examples
I1205 22:58:38.047] has:invalid deployment: no selectors
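Note: kubectl expose needs a label selector for the Service, taken from the workload's .spec.selector or, failing that, from the --selector flag; the deployment under test has none, hence the error above. A sketch of the flag-based workaround (names and labels assumed):

    kubectl expose deployment nginx-deployment --port=80 --selector=app=nginx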
W1205 22:58:38.148] I1205 22:58:37.130465   55672 event.go:221] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1544050713-13687", Name:"nginx-deployment", UID:"4c994d7b-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1457", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-659fc6fb to 3
W1205 22:58:38.148] I1205 22:58:37.133179   55672 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544050713-13687", Name:"nginx-deployment-659fc6fb", UID:"4c99e0f2-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1458", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-659fc6fb-fp6rq
W1205 22:58:38.148] I1205 22:58:37.135506   55672 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544050713-13687", Name:"nginx-deployment-659fc6fb", UID:"4c99e0f2-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1458", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-659fc6fb-8hsc9
W1205 22:58:38.149] I1205 22:58:37.135542   55672 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544050713-13687", Name:"nginx-deployment-659fc6fb", UID:"4c99e0f2-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1458", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-659fc6fb-wjz8v
... skipping 27 lines ...
I1205 22:58:39.917] service "frontend" deleted
I1205 22:58:39.924] service "frontend-2" deleted
I1205 22:58:39.930] service "frontend-3" deleted
I1205 22:58:39.936] service "frontend-4" deleted
I1205 22:58:39.942] service "frontend-5" deleted
I1205 22:58:40.030] Successful
I1205 22:58:40.031] message:error: cannot expose a Node
I1205 22:58:40.031] has:cannot expose
I1205 22:58:40.112] Successful
I1205 22:58:40.112] message:The Service "invalid-large-service-name-that-has-more-than-sixty-three-characters" is invalid: metadata.name: Invalid value: "invalid-large-service-name-that-has-more-than-sixty-three-characters": must be no more than 63 characters
I1205 22:58:40.112] has:metadata.name: Invalid value
I1205 22:58:40.197] Successful
I1205 22:58:40.197] message:service/kubernetes-serve-hostname-testing-sixty-three-characters-in-len exposed
... skipping 33 lines ...
I1205 22:58:42.361] horizontalpodautoscaler.autoscaling/frontend autoscaled
I1205 22:58:42.461] core.sh:1237: Successful get hpa frontend {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 2 3 80
I1205 22:58:42.544] horizontalpodautoscaler.autoscaling "frontend" deleted
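Note: the "2 3 80" assertion at core.sh:1237 above corresponds to minReplicas, maxReplicas, and targetCPUUtilizationPercentage on the created HPA; --max is mandatory, which is what the 'required flag(s) "max" not set' error further down exercises. A sketch:

    kubectl autoscale rc frontend --min=2 --max=3 --cpu-percent=80   # succeeds, matching the assertion above
    kubectl autoscale rc frontend --min=2                            # Error: required flag(s) "max" not set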
W1205 22:58:42.645] I1205 22:58:41.890895   55672 event.go:221] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1544050713-13687", Name:"frontend", UID:"4f6f8d05-f8e1-11e8-8d22-0242ac110002", APIVersion:"v1", ResourceVersion:"1627", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-jpgm8
W1205 22:58:42.645] I1205 22:58:41.893973   55672 event.go:221] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1544050713-13687", Name:"frontend", UID:"4f6f8d05-f8e1-11e8-8d22-0242ac110002", APIVersion:"v1", ResourceVersion:"1627", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-l76gj
W1205 22:58:42.646] I1205 22:58:41.894031   55672 event.go:221] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1544050713-13687", Name:"frontend", UID:"4f6f8d05-f8e1-11e8-8d22-0242ac110002", APIVersion:"v1", ResourceVersion:"1627", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-sk8xk
W1205 22:58:42.646] Error: required flag(s) "max" not set
W1205 22:58:42.646] 
W1205 22:58:42.646] 
W1205 22:58:42.646] Examples:
W1205 22:58:42.646]   # Auto scale a deployment "foo", with the number of pods between 2 and 10, no target CPU utilization specified so a default autoscaling policy will be used:
W1205 22:58:42.646]   kubectl autoscale deployment foo --min=2 --max=10
W1205 22:58:42.647]   
... skipping 54 lines ...
I1205 22:58:42.881]           limits:
I1205 22:58:42.881]             cpu: 300m
I1205 22:58:42.881]           requests:
I1205 22:58:42.881]             cpu: 300m
I1205 22:58:42.881]       terminationGracePeriodSeconds: 0
I1205 22:58:42.882] status: {}
W1205 22:58:42.982] Error from server (NotFound): deployments.extensions "nginx-deployment-resources" not found
I1205 22:58:43.118] deployment.extensions/nginx-deployment-resources created
I1205 22:58:43.220] core.sh:1252: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx-deployment-resources:
I1205 22:58:43.313] core.sh:1253: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
I1205 22:58:43.404] core.sh:1254: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
I1205 22:58:43.498] deployment.extensions/nginx-deployment-resources resource requirements updated
I1205 22:58:43.602] core.sh:1257: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).resources.limits.cpu}}:{{end}}: 100m:
... skipping 5 lines ...
W1205 22:58:43.987] I1205 22:58:43.127997   55672 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544050713-13687", Name:"nginx-deployment-resources-69c96fd869", UID:"502c0800-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1649", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-69c96fd869-jnjg7
W1205 22:58:43.987] I1205 22:58:43.502958   55672 event.go:221] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1544050713-13687", Name:"nginx-deployment-resources", UID:"502b5219-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1662", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-resources-6c5996c457 to 1
W1205 22:58:43.988] I1205 22:58:43.505695   55672 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544050713-13687", Name:"nginx-deployment-resources-6c5996c457", UID:"50661388-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1663", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-6c5996c457-njwst
W1205 22:58:43.988] I1205 22:58:43.509089   55672 event.go:221] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1544050713-13687", Name:"nginx-deployment-resources", UID:"502b5219-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1662", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-resources-69c96fd869 to 2
W1205 22:58:43.988] I1205 22:58:43.514361   55672 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544050713-13687", Name:"nginx-deployment-resources-69c96fd869", UID:"502c0800-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1669", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-resources-69c96fd869-vc7m7
W1205 22:58:43.989] I1205 22:58:43.515717   55672 event.go:221] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1544050713-13687", Name:"nginx-deployment-resources", UID:"502b5219-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1665", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-resources-6c5996c457 to 2
W1205 22:58:43.989] E1205 22:58:43.516327   55672 replica_set.go:450] Sync "namespace-1544050713-13687/nginx-deployment-resources-6c5996c457" failed with Operation cannot be fulfilled on replicasets.apps "nginx-deployment-resources-6c5996c457": the object has been modified; please apply your changes to the latest version and try again
W1205 22:58:43.989] I1205 22:58:43.519678   55672 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544050713-13687", Name:"nginx-deployment-resources-6c5996c457", UID:"50661388-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1673", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-6c5996c457-52757
W1205 22:58:43.989] error: unable to find container named redis
W1205 22:58:43.990] I1205 22:58:43.896684   55672 event.go:221] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1544050713-13687", Name:"nginx-deployment-resources", UID:"502b5219-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1686", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-resources-69c96fd869 to 0
W1205 22:58:43.990] I1205 22:58:43.902861   55672 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544050713-13687", Name:"nginx-deployment-resources-69c96fd869", UID:"502c0800-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1690", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-resources-69c96fd869-dh4q5
W1205 22:58:43.990] I1205 22:58:43.903750   55672 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544050713-13687", Name:"nginx-deployment-resources-69c96fd869", UID:"502c0800-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1690", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-resources-69c96fd869-jnjg7
W1205 22:58:43.991] I1205 22:58:43.904153   55672 event.go:221] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1544050713-13687", Name:"nginx-deployment-resources", UID:"502b5219-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1689", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-resources-5f4579485f to 2
W1205 22:58:43.991] I1205 22:58:43.907683   55672 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544050713-13687", Name:"nginx-deployment-resources-5f4579485f", UID:"50a11cbe-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1695", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-5f4579485f-ss27h
W1205 22:58:43.991] I1205 22:58:43.910739   55672 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544050713-13687", Name:"nginx-deployment-resources-5f4579485f", UID:"50a11cbe-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1695", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-5f4579485f-knrnz
... skipping 77 lines ...
I1205 22:58:44.600]     status: "False"
I1205 22:58:44.600]     type: Available
I1205 22:58:44.600]   observedGeneration: 4
I1205 22:58:44.600]   replicas: 4
I1205 22:58:44.600]   unavailableReplicas: 4
I1205 22:58:44.600]   updatedReplicas: 2
W1205 22:58:44.701] error: you must specify resources by --filename when --local is set.
W1205 22:58:44.701] Example resource specifications include:
W1205 22:58:44.701]    '-f rsrc.yaml'
W1205 22:58:44.701]    '--filename=rsrc.json'
I1205 22:58:44.802] core.sh:1273: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).resources.limits.cpu}}:{{end}}: 200m:
I1205 22:58:44.867] core.sh:1274: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.limits.cpu}}:{{end}}: 300m:
I1205 22:58:44.965] core.sh:1275: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.requests.cpu}}:{{end}}: 300m:
... skipping 44 lines ...
I1205 22:58:46.542]                 pod-template-hash=55c9b846cc
I1205 22:58:46.542] Annotations:    deployment.kubernetes.io/desired-replicas: 1
I1205 22:58:46.542]                 deployment.kubernetes.io/max-replicas: 2
I1205 22:58:46.542]                 deployment.kubernetes.io/revision: 1
I1205 22:58:46.542] Controlled By:  Deployment/test-nginx-apps
I1205 22:58:46.542] Replicas:       1 current / 1 desired
I1205 22:58:46.542] Pods Status:    0 Running / 1 Waiting / 0 Succeeded / 0 Failed
I1205 22:58:46.542] Pod Template:
I1205 22:58:46.542]   Labels:  app=test-nginx-apps
I1205 22:58:46.542]            pod-template-hash=55c9b846cc
I1205 22:58:46.542]   Containers:
I1205 22:58:46.543]    nginx:
I1205 22:58:46.543]     Image:        k8s.gcr.io/nginx:test-cmd
... skipping 95 lines ...
I1205 22:58:50.662]     Image:	k8s.gcr.io/nginx:test-cmd
I1205 22:58:50.755] apps.sh:296: Successful get deployment.extensions {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
I1205 22:58:50.860] deployment.extensions/nginx rolled back
I1205 22:58:51.947] apps.sh:300: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
I1205 22:58:52.120] apps.sh:303: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
I1205 22:58:52.213] deployment.extensions/nginx rolled back
W1205 22:58:52.313] error: unable to find specified revision 1000000 in history
I1205 22:58:53.303] apps.sh:307: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
I1205 22:58:53.385] deployment.extensions/nginx paused
W1205 22:58:53.486] error: you cannot rollback a paused deployment; resume it first with 'kubectl rollout resume deployment/nginx' and try again
I1205 22:58:53.586] deployment.extensions/nginx resumed
I1205 22:58:53.667] deployment.extensions/nginx rolled back
I1205 22:58:53.828]     deployment.kubernetes.io/revision-history: 1,3
W1205 22:58:54.003] error: desired revision (3) is different from the running revision (5)
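Note: the rollback errors above chain together: undo is refused while a deployment is paused, and the "desired revision (3) is different from the running revision (5)" message appears to come from checking rollout status pinned to an old revision after newer rollouts. A sketch of the sequence (deployment name from the log; revision numbers illustrative):

    kubectl rollout pause deployment/nginx
    kubectl rollout undo deployment/nginx                  # error: you cannot rollback a paused deployment ...
    kubectl rollout resume deployment/nginx
    kubectl rollout undo deployment/nginx                  # now succeeds
    kubectl rollout status deployment/nginx --revision=3   # fails once a newer revision is running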
I1205 22:58:54.139] deployment.extensions/nginx2 created
I1205 22:58:54.214] deployment.extensions "nginx2" deleted
I1205 22:58:54.288] deployment.extensions "nginx" deleted
I1205 22:58:54.385] apps.sh:329: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
I1205 22:58:54.521] deployment.extensions/nginx-deployment created
I1205 22:58:54.611] apps.sh:332: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx-deployment:
... skipping 27 lines ...
W1205 22:58:56.705] I1205 22:58:54.530063   55672 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544050725-16485", Name:"nginx-deployment-646d4f779d", UID:"56f8022e-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1970", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-646d4f779d-xjqfm
W1205 22:58:56.705] I1205 22:58:54.530192   55672 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544050725-16485", Name:"nginx-deployment-646d4f779d", UID:"56f8022e-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1970", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-646d4f779d-wcxxm
W1205 22:58:56.705] I1205 22:58:54.857506   55672 event.go:221] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1544050725-16485", Name:"nginx-deployment", UID:"56f772eb-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1983", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-85db47bbdb to 1
W1205 22:58:56.705] I1205 22:58:54.860679   55672 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544050725-16485", Name:"nginx-deployment-85db47bbdb", UID:"572ac3e0-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1984", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-85db47bbdb-nktkh
W1205 22:58:56.706] I1205 22:58:54.862772   55672 event.go:221] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1544050725-16485", Name:"nginx-deployment", UID:"56f772eb-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1983", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-646d4f779d to 2
W1205 22:58:56.706] I1205 22:58:54.867177   55672 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544050725-16485", Name:"nginx-deployment-646d4f779d", UID:"56f8022e-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1988", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-646d4f779d-dp7jd
W1205 22:58:56.706] E1205 22:58:54.868054   55672 replica_set.go:450] Sync "namespace-1544050725-16485/nginx-deployment-85db47bbdb" failed with Operation cannot be fulfilled on replicasets.apps "nginx-deployment-85db47bbdb": the object has been modified; please apply your changes to the latest version and try again
W1205 22:58:56.706] I1205 22:58:54.868620   55672 event.go:221] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1544050725-16485", Name:"nginx-deployment", UID:"56f772eb-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1986", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-85db47bbdb to 2
W1205 22:58:56.707] I1205 22:58:54.878432   55672 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544050725-16485", Name:"nginx-deployment-85db47bbdb", UID:"572ac3e0-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2001", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-85db47bbdb-wlzch
W1205 22:58:56.707] error: unable to find container named "redis"
W1205 22:58:56.707] I1205 22:58:55.912770   55672 event.go:221] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1544050725-16485", Name:"nginx-deployment", UID:"56f772eb-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2016", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-646d4f779d to 0
W1205 22:58:56.707] I1205 22:58:55.918506   55672 event.go:221] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1544050725-16485", Name:"nginx-deployment", UID:"56f772eb-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2018", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-dc756cc6 to 2
W1205 22:58:56.707] I1205 22:58:55.918595   55672 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544050725-16485", Name:"nginx-deployment-646d4f779d", UID:"56f8022e-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2020", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-646d4f779d-wcxxm
W1205 22:58:56.708] I1205 22:58:55.918816   55672 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544050725-16485", Name:"nginx-deployment-646d4f779d", UID:"56f8022e-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2020", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-646d4f779d-xjqfm
W1205 22:58:56.708] I1205 22:58:55.921278   55672 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544050725-16485", Name:"nginx-deployment-dc756cc6", UID:"57cb0ccb-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2024", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-dc756cc6-bjxlb
W1205 22:58:56.708] I1205 22:58:55.923794   55672 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544050725-16485", Name:"nginx-deployment-dc756cc6", UID:"57cb0ccb-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2024", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-dc756cc6-hspsk
... skipping 54 lines ...
I1205 22:58:59.969] Namespace:    namespace-1544050738-15166
I1205 22:58:59.969] Selector:     app=guestbook,tier=frontend
I1205 22:58:59.969] Labels:       app=guestbook
I1205 22:58:59.969]               tier=frontend
I1205 22:58:59.969] Annotations:  <none>
I1205 22:58:59.969] Replicas:     3 current / 3 desired
I1205 22:58:59.969] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I1205 22:58:59.969] Pod Template:
I1205 22:58:59.969]   Labels:  app=guestbook
I1205 22:58:59.970]            tier=frontend
I1205 22:58:59.970]   Containers:
I1205 22:58:59.970]    php-redis:
I1205 22:58:59.970]     Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 17 lines ...
I1205 22:59:00.069] Namespace:    namespace-1544050738-15166
I1205 22:59:00.069] Selector:     app=guestbook,tier=frontend
I1205 22:59:00.069] Labels:       app=guestbook
I1205 22:59:00.070]               tier=frontend
I1205 22:59:00.070] Annotations:  <none>
I1205 22:59:00.070] Replicas:     3 current / 3 desired
I1205 22:59:00.070] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I1205 22:59:00.070] Pod Template:
I1205 22:59:00.070]   Labels:  app=guestbook
I1205 22:59:00.070]            tier=frontend
I1205 22:59:00.070]   Containers:
I1205 22:59:00.070]    php-redis:
I1205 22:59:00.071]     Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 18 lines ...
I1205 22:59:00.167] Namespace:    namespace-1544050738-15166
I1205 22:59:00.167] Selector:     app=guestbook,tier=frontend
I1205 22:59:00.167] Labels:       app=guestbook
I1205 22:59:00.167]               tier=frontend
I1205 22:59:00.167] Annotations:  <none>
I1205 22:59:00.167] Replicas:     3 current / 3 desired
I1205 22:59:00.167] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I1205 22:59:00.167] Pod Template:
I1205 22:59:00.167]   Labels:  app=guestbook
I1205 22:59:00.168]            tier=frontend
I1205 22:59:00.168]   Containers:
I1205 22:59:00.168]    php-redis:
I1205 22:59:00.168]     Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 12 lines ...
I1205 22:59:00.267] Namespace:    namespace-1544050738-15166
I1205 22:59:00.267] Selector:     app=guestbook,tier=frontend
I1205 22:59:00.267] Labels:       app=guestbook
I1205 22:59:00.267]               tier=frontend
I1205 22:59:00.268] Annotations:  <none>
I1205 22:59:00.268] Replicas:     3 current / 3 desired
I1205 22:59:00.268] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I1205 22:59:00.268] Pod Template:
I1205 22:59:00.268]   Labels:  app=guestbook
I1205 22:59:00.268]            tier=frontend
I1205 22:59:00.268]   Containers:
I1205 22:59:00.268]    php-redis:
I1205 22:59:00.268]     Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 16 lines ...
W1205 22:59:00.370] I1205 22:58:57.076969   55672 horizontal.go:309] Horizontal Pod Autoscaler frontend has been deleted in namespace-1544050713-13687
W1205 22:59:00.370] I1205 22:58:57.211600   55672 event.go:221] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1544050725-16485", Name:"nginx-deployment", UID:"5834fec0-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2071", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-5b795689cd to 1
W1205 22:59:00.370] I1205 22:58:57.214162   55672 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544050725-16485", Name:"nginx-deployment-5b795689cd", UID:"58920549-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2072", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-5b795689cd-6wjhk
W1205 22:59:00.371] I1205 22:58:57.217686   55672 event.go:221] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1544050725-16485", Name:"nginx-deployment", UID:"5834fec0-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2071", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-646d4f779d to 2
W1205 22:59:00.371] I1205 22:58:57.222029   55672 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544050725-16485", Name:"nginx-deployment-646d4f779d", UID:"58359b92-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2078", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-646d4f779d-r5q8m
W1205 22:59:00.371] I1205 22:58:57.226904   55672 event.go:221] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1544050725-16485", Name:"nginx-deployment", UID:"5834fec0-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2074", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-5b795689cd to 2
W1205 22:59:00.372] E1205 22:58:57.229275   55672 replica_set.go:450] Sync "namespace-1544050725-16485/nginx-deployment-5b795689cd" failed with Operation cannot be fulfilled on replicasets.apps "nginx-deployment-5b795689cd": the object has been modified; please apply your changes to the latest version and try again
W1205 22:59:00.372] I1205 22:58:57.231220   55672 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544050725-16485", Name:"nginx-deployment-5b795689cd", UID:"58920549-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2085", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-5b795689cd-529nl
W1205 22:59:00.372] I1205 22:58:57.472613   55672 event.go:221] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1544050725-16485", Name:"nginx-deployment", UID:"5834fec0-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2096", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-646d4f779d to 0
W1205 22:59:00.373] I1205 22:58:57.477393   55672 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544050725-16485", Name:"nginx-deployment-646d4f779d", UID:"58359b92-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2100", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-646d4f779d-h8mpq
W1205 22:59:00.373] I1205 22:58:57.477904   55672 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544050725-16485", Name:"nginx-deployment-646d4f779d", UID:"58359b92-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2100", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-646d4f779d-6n4ht
W1205 22:59:00.373] I1205 22:58:57.478280   55672 event.go:221] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1544050725-16485", Name:"nginx-deployment", UID:"5834fec0-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2098", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-5766b7c95b to 2
W1205 22:59:00.373] I1205 22:58:57.480232   55672 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544050725-16485", Name:"nginx-deployment-5766b7c95b", UID:"58b8f540-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2106", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-5766b7c95b-7gb8g
W1205 22:59:00.374] I1205 22:58:57.544529   55672 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544050725-16485", Name:"nginx-deployment-5766b7c95b", UID:"58b8f540-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2106", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-5766b7c95b-x82c9
W1205 22:59:00.374] I1205 22:58:57.636852   55672 event.go:221] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1544050725-16485", Name:"nginx-deployment", UID:"5834fec0-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2115", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-5766b7c95b to 0
W1205 22:59:00.374] I1205 22:58:57.641804   55672 event.go:221] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1544050725-16485", Name:"nginx-deployment", UID:"5834fec0-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2117", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-794dcdf6bb to 2
W1205 22:59:00.375] I1205 22:58:57.721142   55672 event.go:221] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1544050725-16485", Name:"nginx-deployment", UID:"5834fec0-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2125", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-5b795689cd to 0
W1205 22:59:00.375] I1205 22:58:57.727062   55672 event.go:221] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1544050725-16485", Name:"nginx-deployment", UID:"5834fec0-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2127", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-65b869c68c to 2
W1205 22:59:00.375] I1205 22:58:57.997304   55672 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544050725-16485", Name:"nginx-deployment-5b795689cd", UID:"58920549-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2128", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-5b795689cd-6wjhk
W1205 22:59:00.375] I1205 22:58:58.046236   55672 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544050725-16485", Name:"nginx-deployment-5b795689cd", UID:"58920549-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2128", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-5b795689cd-529nl
W1205 22:59:00.376] E1205 22:58:58.094393   55672 replica_set.go:450] Sync "namespace-1544050725-16485/nginx-deployment-646d4f779d" failed with Operation cannot be fulfilled on replicasets.apps "nginx-deployment-646d4f779d": StorageError: invalid object, Code: 4, Key: /registry/replicasets/namespace-1544050725-16485/nginx-deployment-646d4f779d, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 58359b92-f8e1-11e8-8d22-0242ac110002, UID in object meta: 
W1205 22:59:00.376] E1205 22:58:58.143524   55672 replica_set.go:450] Sync "namespace-1544050725-16485/nginx-deployment-794dcdf6bb" failed with replicasets.apps "nginx-deployment-794dcdf6bb" not found
W1205 22:59:00.377] E1205 22:58:58.194290   55672 replica_set.go:450] Sync "namespace-1544050725-16485/nginx-deployment-5766b7c95b" failed with Operation cannot be fulfilled on replicasets.apps "nginx-deployment-5766b7c95b": StorageError: invalid object, Code: 4, Key: /registry/replicasets/namespace-1544050725-16485/nginx-deployment-5766b7c95b, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 58b8f540-f8e1-11e8-8d22-0242ac110002, UID in object meta: 
W1205 22:59:00.377] E1205 22:58:58.243979   55672 replica_set.go:450] Sync "namespace-1544050725-16485/nginx-deployment-65b869c68c" failed with replicasets.apps "nginx-deployment-65b869c68c" not found
W1205 22:59:00.377] E1205 22:58:58.444139   55672 replica_set.go:450] Sync "namespace-1544050725-16485/nginx-deployment-5b795689cd" failed with replicasets.apps "nginx-deployment-5b795689cd" not found
W1205 22:59:00.377] I1205 22:58:58.647888   55672 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544050738-15166", Name:"frontend", UID:"596c5719-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2163", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-c5r2q
W1205 22:59:00.378] I1205 22:58:58.650214   55672 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544050738-15166", Name:"frontend", UID:"596c5719-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2163", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-bhvvj
W1205 22:59:00.378] I1205 22:58:58.650423   55672 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544050738-15166", Name:"frontend", UID:"596c5719-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2163", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-6fmqg
W1205 22:59:00.378] E1205 22:58:58.843956   55672 replica_set.go:450] Sync "namespace-1544050738-15166/frontend" failed with replicasets.apps "frontend" not found
W1205 22:59:00.378] I1205 22:58:59.039138   55672 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544050738-15166", Name:"frontend-no-cascade", UID:"59a7ee1f-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2177", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-no-cascade-mcxjz
W1205 22:59:00.379] I1205 22:58:59.041155   55672 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544050738-15166", Name:"frontend-no-cascade", UID:"59a7ee1f-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2177", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-no-cascade-rjntc
W1205 22:59:00.379] I1205 22:58:59.043888   55672 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544050738-15166", Name:"frontend-no-cascade", UID:"59a7ee1f-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2177", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-no-cascade-gblwm
W1205 22:59:00.379] E1205 22:58:59.293779   55672 replica_set.go:450] Sync "namespace-1544050738-15166/frontend-no-cascade" failed with replicasets.apps "frontend-no-cascade" not found
W1205 22:59:00.379] I1205 22:58:59.760861   55672 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544050738-15166", Name:"frontend", UID:"5a1680f7-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2196", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-hl5ng
W1205 22:59:00.379] I1205 22:58:59.763218   55672 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544050738-15166", Name:"frontend", UID:"5a1680f7-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2196", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-sg8hk
W1205 22:59:00.380] I1205 22:58:59.763258   55672 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544050738-15166", Name:"frontend", UID:"5a1680f7-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2196", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-k67c4
I1205 22:59:00.480] Successful describe rs:
I1205 22:59:00.480] Name:         frontend
I1205 22:59:00.481] Namespace:    namespace-1544050738-15166
I1205 22:59:00.481] Selector:     app=guestbook,tier=frontend
I1205 22:59:00.481] Labels:       app=guestbook
I1205 22:59:00.481]               tier=frontend
I1205 22:59:00.481] Annotations:  <none>
I1205 22:59:00.481] Replicas:     3 current / 3 desired
I1205 22:59:00.481] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I1205 22:59:00.482] Pod Template:
I1205 22:59:00.482]   Labels:  app=guestbook
I1205 22:59:00.482]            tier=frontend
I1205 22:59:00.482]   Containers:
I1205 22:59:00.482]    php-redis:
I1205 22:59:00.482]     Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 17 lines ...
I1205 22:59:00.492] Namespace:    namespace-1544050738-15166
I1205 22:59:00.492] Selector:     app=guestbook,tier=frontend
I1205 22:59:00.492] Labels:       app=guestbook
I1205 22:59:00.492]               tier=frontend
I1205 22:59:00.492] Annotations:  <none>
I1205 22:59:00.492] Replicas:     3 current / 3 desired
I1205 22:59:00.492] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I1205 22:59:00.492] Pod Template:
I1205 22:59:00.492]   Labels:  app=guestbook
I1205 22:59:00.492]            tier=frontend
I1205 22:59:00.492]   Containers:
I1205 22:59:00.493]    php-redis:
I1205 22:59:00.493]     Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 17 lines ...
I1205 22:59:00.587] Namespace:    namespace-1544050738-15166
I1205 22:59:00.587] Selector:     app=guestbook,tier=frontend
I1205 22:59:00.587] Labels:       app=guestbook
I1205 22:59:00.587]               tier=frontend
I1205 22:59:00.587] Annotations:  <none>
I1205 22:59:00.587] Replicas:     3 current / 3 desired
I1205 22:59:00.587] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I1205 22:59:00.587] Pod Template:
I1205 22:59:00.587]   Labels:  app=guestbook
I1205 22:59:00.588]            tier=frontend
I1205 22:59:00.588]   Containers:
I1205 22:59:00.588]    php-redis:
I1205 22:59:00.588]     Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 11 lines ...
I1205 22:59:00.686] Namespace:    namespace-1544050738-15166
I1205 22:59:00.686] Selector:     app=guestbook,tier=frontend
I1205 22:59:00.686] Labels:       app=guestbook
I1205 22:59:00.686]               tier=frontend
I1205 22:59:00.686] Annotations:  <none>
I1205 22:59:00.686] Replicas:     3 current / 3 desired
I1205 22:59:00.687] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I1205 22:59:00.687] Pod Template:
I1205 22:59:00.687]   Labels:  app=guestbook
I1205 22:59:00.687]            tier=frontend
I1205 22:59:00.687]   Containers:
I1205 22:59:00.687]    php-redis:
I1205 22:59:00.687]     Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 137 lines ...
W1205 22:59:02.684] I1205 22:59:02.157390   55672 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544050738-15166", Name:"scale-1-9bdb56f49", UID:"5af04913-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2263", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-1-9bdb56f49-v2rj9
W1205 22:59:02.684] I1205 22:59:02.164451   55672 event.go:221] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1544050738-15166", Name:"scale-2", UID:"5b064dcb-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2270", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set scale-2-9bdb56f49 to 3
W1205 22:59:02.685] I1205 22:59:02.165656   55672 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544050738-15166", Name:"scale-2-9bdb56f49", UID:"5b06dc44-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2272", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-2-9bdb56f49-xjfww
W1205 22:59:02.685] I1205 22:59:02.178046   55672 event.go:221] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1544050738-15166", Name:"scale-3", UID:"5b1bf868-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2280", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set scale-3-9bdb56f49 to 3
W1205 22:59:02.685] I1205 22:59:02.344940   55672 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544050738-15166", Name:"scale-3-9bdb56f49", UID:"5b1c855b-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2281", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-3-9bdb56f49-qmx4l
W1205 22:59:02.685] I1205 22:59:02.446085   55672 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544050738-15166", Name:"scale-3-9bdb56f49", UID:"5b1c855b-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2281", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-3-9bdb56f49-rjm2d
W1205 22:59:02.695] E1205 22:59:02.694345   55672 replica_set.go:450] Sync "namespace-1544050738-15166/scale-3-9bdb56f49" failed with replicasets.apps "scale-3-9bdb56f49" not found
W1205 22:59:02.745] I1205 22:59:02.745255   55672 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544050738-15166", Name:"frontend", UID:"5bdce62f-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2322", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-vsd48
W1205 22:59:02.846] I1205 22:59:02.845426   55672 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544050738-15166", Name:"frontend", UID:"5bdce62f-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2322", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-tz4s8
W1205 22:59:02.896] I1205 22:59:02.895739   55672 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544050738-15166", Name:"frontend", UID:"5bdce62f-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2322", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-xf4dz
I1205 22:59:02.996] replicaset.extensions/frontend created
I1205 22:59:02.997] apps.sh:587: Successful get rs frontend {{.spec.replicas}}: 3
I1205 22:59:02.997] service/frontend exposed
... skipping 35 lines ...
I1205 22:59:05.438] horizontalpodautoscaler.autoscaling/frontend autoscaled
I1205 22:59:05.523] apps.sh:647: Successful get hpa frontend {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 2 3 80
I1205 22:59:05.597] horizontalpodautoscaler.autoscaling "frontend" deleted
W1205 22:59:05.698] I1205 22:59:05.030652   55672 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544050738-15166", Name:"frontend", UID:"5d3a7249-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2385", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-g67h5
W1205 22:59:05.698] I1205 22:59:05.032755   55672 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544050738-15166", Name:"frontend", UID:"5d3a7249-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2385", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-b6pqv
W1205 22:59:05.699] I1205 22:59:05.033373   55672 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544050738-15166", Name:"frontend", UID:"5d3a7249-f8e1-11e8-8d22-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2385", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-m2wj7
W1205 22:59:05.699] Error: required flag(s) "max" not set
W1205 22:59:05.699] 
W1205 22:59:05.699] 
W1205 22:59:05.699] Examples:
W1205 22:59:05.700]   # Auto scale a deployment "foo", with the number of pods between 2 and 10, no target CPU utilization specified so a default autoscaling policy will be used:
W1205 22:59:05.700]   kubectl autoscale deployment foo --min=2 --max=10
W1205 22:59:05.700]   
... skipping 85 lines ...
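The hpa assertion at apps.sh:647 (minReplicas 2, maxReplicas 3, target 80%) and the 'Error: required flag(s) "max" not set' failure above bracket the same command. A sketch of both paths, assuming a replica set named frontend:

  kubectl autoscale rs frontend --min=2 --max=3 --cpu-percent=80   # creates hpa/frontend with the asserted 2/3/80 spec
  kubectl autoscale rs frontend --min=2                            # rejected before doing anything: --max is mandatory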
I1205 22:59:08.405] apps.sh:431: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/pause:2.0:
I1205 22:59:08.491] apps.sh:432: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 2
I1205 22:59:08.589] statefulset.apps/nginx rolled back
I1205 22:59:08.678] apps.sh:435: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.7:
I1205 22:59:08.763] apps.sh:436: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
I1205 22:59:08.861] Successful
I1205 22:59:08.862] message:error: unable to find specified revision 1000000 in history
I1205 22:59:08.862] has:unable to find specified revision
I1205 22:59:08.945] apps.sh:440: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.7:
I1205 22:59:09.029] apps.sh:441: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
I1205 22:59:09.120] statefulset.apps/nginx rolled back
I1205 22:59:09.207] apps.sh:444: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.8:
I1205 22:59:09.290] apps.sh:445: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/pause:2.0:
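The statefulset half mirrors the deployment rollback checks: undo restores the one-container nginx-slim:0.7 template, an out-of-range revision is rejected, and a further undo lands on the two-container nginx-slim:0.8 plus pause:2.0 template, as apps.sh:435-445 assert. A sketch, assuming statefulset/nginx carries that history:

  kubectl rollout undo statefulset/nginx                         # back to nginx-slim:0.7, single container
  kubectl rollout undo statefulset/nginx --to-revision=1000000   # error: unable to find specified revision 1000000 in history
  kubectl rollout undo statefulset/nginx                         # forward again to nginx-slim:0.8 + pause:2.0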
... skipping 61 lines ...
I1205 22:59:10.945] Name:         mock
I1205 22:59:10.945] Namespace:    namespace-1544050750-31209
I1205 22:59:10.945] Selector:     app=mock
I1205 22:59:10.945] Labels:       app=mock
I1205 22:59:10.945] Annotations:  <none>
I1205 22:59:10.945] Replicas:     1 current / 1 desired
I1205 22:59:10.945] Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
I1205 22:59:10.945] Pod Template:
I1205 22:59:10.945]   Labels:  app=mock
I1205 22:59:10.945]   Containers:
I1205 22:59:10.945]    mock-container:
I1205 22:59:10.945]     Image:        k8s.gcr.io/pause:2.0
I1205 22:59:10.945]     Port:         9949/TCP
... skipping 56 lines ...
I1205 22:59:12.968] Name:         mock
I1205 22:59:12.968] Namespace:    namespace-1544050750-31209
I1205 22:59:12.968] Selector:     app=mock
I1205 22:59:12.968] Labels:       app=mock
I1205 22:59:12.968] Annotations:  <none>
I1205 22:59:12.969] Replicas:     1 current / 1 desired
I1205 22:59:12.969] Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
I1205 22:59:12.969] Pod Template:
I1205 22:59:12.969]   Labels:  app=mock
I1205 22:59:12.969]   Containers:
I1205 22:59:12.969]    mock-container:
I1205 22:59:12.969]     Image:        k8s.gcr.io/pause:2.0
I1205 22:59:12.969]     Port:         9949/TCP
... skipping 56 lines ...
I1205 22:59:14.931] Name:         mock
I1205 22:59:14.931] Namespace:    namespace-1544050750-31209
I1205 22:59:14.931] Selector:     app=mock
I1205 22:59:14.931] Labels:       app=mock
I1205 22:59:14.931] Annotations:  <none>
I1205 22:59:14.931] Replicas:     1 current / 1 desired
I1205 22:59:14.932] Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
I1205 22:59:14.932] Pod Template:
I1205 22:59:14.932]   Labels:  app=mock
I1205 22:59:14.932]   Containers:
I1205 22:59:14.932]    mock-container:
I1205 22:59:14.932]     Image:        k8s.gcr.io/pause:2.0
I1205 22:59:14.932]     Port:         9949/TCP
... skipping 42 lines ...
I1205 22:59:16.859] Namespace:    namespace-1544050750-31209
I1205 22:59:16.859] Selector:     app=mock
I1205 22:59:16.859] Labels:       app=mock
I1205 22:59:16.859]               status=replaced
I1205 22:59:16.859] Annotations:  <none>
I1205 22:59:16.859] Replicas:     1 current / 1 desired
I1205 22:59:16.859] Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
I1205 22:59:16.860] Pod Template:
I1205 22:59:16.860]   Labels:  app=mock
I1205 22:59:16.860]   Containers:
I1205 22:59:16.860]    mock-container:
I1205 22:59:16.860]     Image:        k8s.gcr.io/pause:2.0
I1205 22:59:16.860]     Port:         9949/TCP
... skipping 11 lines ...
I1205 22:59:16.861] Namespace:    namespace-1544050750-31209
I1205 22:59:16.861] Selector:     app=mock2
I1205 22:59:16.861] Labels:       app=mock2
I1205 22:59:16.861]               status=replaced
I1205 22:59:16.861] Annotations:  <none>
I1205 22:59:16.861] Replicas:     1 current / 1 desired
I1205 22:59:16.861] Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
I1205 22:59:16.861] Pod Template:
I1205 22:59:16.861]   Labels:  app=mock2
I1205 22:59:16.861]   Containers:
I1205 22:59:16.862]    mock-container:
I1205 22:59:16.862]     Image:        k8s.gcr.io/pause:2.0
I1205 22:59:16.862]     Port:         9949/TCP
... skipping 107 lines ...
I1205 22:59:21.504] storage.sh:30: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: 
I1205 22:59:21.649] persistentvolume/pv0001 created
I1205 22:59:21.740] storage.sh:33: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: pv0001:
I1205 22:59:21.814] persistentvolume "pv0001" deleted
W1205 22:59:21.915] I1205 22:59:20.191735   55672 horizontal.go:309] Horizontal Pod Autoscaler frontend has been deleted in namespace-1544050738-15166
W1205 22:59:21.915] I1205 22:59:20.617022   55672 event.go:221] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1544050750-31209", Name:"mock", UID:"6684f535-f8e1-11e8-8d22-0242ac110002", APIVersion:"v1", ResourceVersion:"2653", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-4c749
W1205 22:59:21.915] E1205 22:59:21.653989   55672 pv_protection_controller.go:116] PV pv0001 failed with : Operation cannot be fulfilled on persistentvolumes "pv0001": the object has been modified; please apply your changes to the latest version and try again
W1205 22:59:21.962] E1205 22:59:21.962297   55672 pv_protection_controller.go:116] PV pv0002 failed with : Operation cannot be fulfilled on persistentvolumes "pv0002": the object has been modified; please apply your changes to the latest version and try again
I1205 22:59:22.063] persistentvolume/pv0002 created
I1205 22:59:22.063] storage.sh:36: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: pv0002:
I1205 22:59:22.122] persistentvolume "pv0002" deleted
I1205 22:59:22.262] persistentvolume/pv0003 created
I1205 22:59:22.354] storage.sh:39: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: pv0003:
I1205 22:59:22.428] persistentvolume "pv0003" deleted
... skipping 10 lines ...
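Each pv000N above is created from a small manifest and deleted immediately; the pv_protection_controller conflicts in the warnings look like a benign race between that fast delete and the controller updating the volume's finalizer. A minimal sketch of one round trip (the capacity and hostPath values here are illustrative assumptions, not what storage.sh uses):

kubectl create -f - <<EOF
apiVersion: v1
kind: PersistentVolume
metadata:
  name: pv0001
spec:
  capacity:
    storage: 1Gi
  accessModes: ["ReadWriteOnce"]
  hostPath:
    path: /tmp/pv0001
EOF
kubectl delete pv pv0001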
I1205 22:59:22.739] Context "test" modified.
I1205 22:59:22.744] +++ [1205 22:59:22] Testing persistent volumes claims
I1205 22:59:22.825] storage.sh:57: Successful get pvc {{range.items}}{{.metadata.name}}:{{end}}: 
I1205 22:59:22.970] persistentvolumeclaim/myclaim-1 created
I1205 22:59:23.062] storage.sh:60: Successful get pvc {{range.items}}{{.metadata.name}}:{{end}}: myclaim-1:
I1205 22:59:23.135] persistentvolumeclaim "myclaim-1" deleted
W1205 22:59:23.236] E1205 22:59:22.264239   55672 pv_protection_controller.go:116] PV pv0003 failed with : Operation cannot be fulfilled on persistentvolumes "pv0003": the object has been modified; please apply your changes to the latest version and try again
W1205 22:59:23.236] I1205 22:59:22.970641   55672 event.go:221] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1544050762-11740", Name:"myclaim-1", UID:"67ec8929-f8e1-11e8-8d22-0242ac110002", APIVersion:"v1", ResourceVersion:"2684", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
W1205 22:59:23.237] I1205 22:59:22.973036   55672 event.go:221] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1544050762-11740", Name:"myclaim-1", UID:"67ec8929-f8e1-11e8-8d22-0242ac110002", APIVersion:"v1", ResourceVersion:"2686", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
W1205 22:59:23.237] I1205 22:59:23.135714   55672 event.go:221] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1544050762-11740", Name:"myclaim-1", UID:"67ec8929-f8e1-11e8-8d22-0242ac110002", APIVersion:"v1", ResourceVersion:"2688", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
W1205 22:59:23.281] I1205 22:59:23.280817   55672 event.go:221] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1544050762-11740", Name:"myclaim-2", UID:"681bcfee-f8e1-11e8-8d22-0242ac110002", APIVersion:"v1", ResourceVersion:"2691", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
W1205 22:59:23.284] I1205 22:59:23.283739   55672 event.go:221] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1544050762-11740", Name:"myclaim-2", UID:"681bcfee-f8e1-11e8-8d22-0242ac110002", APIVersion:"v1", ResourceVersion:"2693", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
I1205 22:59:23.384] persistentvolumeclaim/myclaim-2 created
... skipping 456 lines ...
I1205 22:59:26.676] yes
I1205 22:59:26.676] has:the server doesn't have a resource type
I1205 22:59:26.746] Successful
I1205 22:59:26.746] message:yes
I1205 22:59:26.746] has:yes
I1205 22:59:26.813] Successful
I1205 22:59:26.813] message:error: --subresource can not be used with NonResourceURL
I1205 22:59:26.813] has:subresource can not be used with NonResourceURL
I1205 22:59:26.886] Successful
I1205 22:59:26.960] Successful
I1205 22:59:26.960] message:yes
I1205 22:59:26.960] 0
I1205 22:59:26.960] has:0
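These checks exercise kubectl auth can-i: --subresource is only meaningful for resource requests, and the trailing 0 is the command's exit status, the usual way to consume can-i from scripts. A sketch (the resource names are assumptions):

  kubectl auth can-i get pods --subresource=log    # resource request: --subresource accepted
  kubectl auth can-i get /logs --subresource=log   # non-resource URL: fails with the error above
  kubectl auth can-i get pods; echo $?             # prints yes/no; exit status encodes the answer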
... skipping 6 lines ...
I1205 22:59:27.135] role.rbac.authorization.k8s.io/testing-R reconciled
I1205 22:59:27.222] legacy-script.sh:736: Successful get rolebindings -n some-other-random -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}: testing-RB:
I1205 22:59:27.305] legacy-script.sh:737: Successful get roles -n some-other-random -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}: testing-R:
I1205 22:59:27.390] legacy-script.sh:738: Successful get clusterrolebindings -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}: testing-CRB:
I1205 22:59:27.479] legacy-script.sh:739: Successful get clusterroles -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}: testing-CR:
I1205 22:59:27.554] Successful
I1205 22:59:27.555] message:error: only rbac.authorization.k8s.io/v1 is supported: not *v1beta1.ClusterRole
I1205 22:59:27.555] has:only rbac.authorization.k8s.io/v1 is supported
I1205 22:59:27.639] rolebinding.rbac.authorization.k8s.io "testing-RB" deleted
I1205 22:59:27.645] role.rbac.authorization.k8s.io "testing-R" deleted
I1205 22:59:27.653] clusterrole.rbac.authorization.k8s.io "testing-CR" deleted
I1205 22:59:27.659] clusterrolebinding.rbac.authorization.k8s.io "testing-CRB" deleted
I1205 22:59:27.669] Recording: run_retrieve_multiple_tests
... skipping 32 lines ...
I1205 22:59:28.695] +++ Running case: test-cmd.run_kubectl_explain_tests 
I1205 22:59:28.697] +++ working dir: /go/src/k8s.io/kubernetes
I1205 22:59:28.700] +++ command: run_kubectl_explain_tests
I1205 22:59:28.708] +++ [1205 22:59:28] Testing kubectl(v1:explain)
W1205 22:59:28.809] I1205 22:59:28.588414   55672 event.go:221] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1544050767-22402", Name:"cassandra", UID:"6b0be4c0-f8e1-11e8-8d22-0242ac110002", APIVersion:"v1", ResourceVersion:"2733", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: cassandra-gxqcq
W1205 22:59:28.809] I1205 22:59:28.594158   55672 event.go:221] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1544050767-22402", Name:"cassandra", UID:"6b0be4c0-f8e1-11e8-8d22-0242ac110002", APIVersion:"v1", ResourceVersion:"2741", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: cassandra-jhkn7
W1205 22:59:28.809] E1205 22:59:28.599558   55672 replica_set.go:450] Sync "namespace-1544050767-22402/cassandra" failed with replicationcontrollers "cassandra" not found
I1205 22:59:28.910] KIND:     Pod
I1205 22:59:28.910] VERSION:  v1
I1205 22:59:28.910] 
I1205 22:59:28.910] DESCRIPTION:
I1205 22:59:28.910]      Pod is a collection of containers that can run on a host. This resource is
I1205 22:59:28.910]      created by clients and scheduled onto hosts.
... skipping 849 lines ...
I1205 22:59:52.816] message:node/127.0.0.1 already uncordoned (dry run)
I1205 22:59:52.816] has:already uncordoned
I1205 22:59:52.893] node-management.sh:119: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
I1205 22:59:52.964] node/127.0.0.1 labeled
I1205 22:59:53.048] node-management.sh:124: Successful get nodes 127.0.0.1 {{.metadata.labels.test}}: label
I1205 22:59:53.108] Successful
I1205 22:59:53.109] message:error: cannot specify both a node name and a --selector option
I1205 22:59:53.109] See 'kubectl drain -h' for help and examples
I1205 22:59:53.109] has:cannot specify both a node name
I1205 22:59:53.169] Successful
I1205 22:59:53.170] message:error: USAGE: cordon NODE [flags]
I1205 22:59:53.170] See 'kubectl cordon -h' for help and examples
I1205 22:59:53.170] has:error\: USAGE\: cordon NODE
I1205 22:59:53.236] node/127.0.0.1 already uncordoned
I1205 22:59:53.302] Successful
I1205 22:59:53.302] message:error: You must provide one or more resources by argument or filename.
I1205 22:59:53.302] Example resource specifications include:
I1205 22:59:53.303]    '-f rsrc.yaml'
I1205 22:59:53.303]    '--filename=rsrc.json'
I1205 22:59:53.303]    '<resource> <name>'
I1205 22:59:53.303]    '<resource>'
I1205 22:59:53.303] has:must provide one or more resources
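The cordon/drain messages above are argument-validation checks; a sketch of the invocations that appear to produce them, assuming the single test node 127.0.0.1 (which command yields each message is inferred from the surrounding checks):

  kubectl uncordon 127.0.0.1 --dry-run            # "already uncordoned (dry run)" on a schedulable node
  kubectl drain 127.0.0.1 --selector=test=label   # rejected: node name and --selector are mutually exclusive
  kubectl cordon                                  # rejected: USAGE: cordon NODE [flags]
  kubectl uncordon                                # rejected: must provide one or more resources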
... skipping 15 lines ...
I1205 22:59:53.681] Successful
I1205 22:59:53.681] message:The following kubectl-compatible plugins are available:
I1205 22:59:53.681] 
I1205 22:59:53.681] test/fixtures/pkg/kubectl/plugins/version/kubectl-version
I1205 22:59:53.681]   - warning: kubectl-version overwrites existing command: "kubectl version"
I1205 22:59:53.681] 
I1205 22:59:53.681] error: one plugin warning was found
I1205 22:59:53.682] has:kubectl-version overwrites existing command: "kubectl version"
I1205 22:59:53.743] Successful
I1205 22:59:53.743] message:The following kubectl-compatible plugins are available:
I1205 22:59:53.743] 
I1205 22:59:53.743] test/fixtures/pkg/kubectl/plugins/kubectl-foo
I1205 22:59:53.743] test/fixtures/pkg/kubectl/plugins/foo/kubectl-foo
I1205 22:59:53.743]   - warning: test/fixtures/pkg/kubectl/plugins/foo/kubectl-foo is overshadowed by a similarly named plugin: test/fixtures/pkg/kubectl/plugins/kubectl-foo
I1205 22:59:53.743] 
I1205 22:59:53.744] error: one plugin warning was found
I1205 22:59:53.744] has:test/fixtures/pkg/kubectl/plugins/foo/kubectl-foo is overshadowed by a similarly named plugin
I1205 22:59:53.806] Successful
I1205 22:59:53.806] message:The following kubectl-compatible plugins are available:
I1205 22:59:53.806] 
I1205 22:59:53.806] test/fixtures/pkg/kubectl/plugins/kubectl-foo
I1205 22:59:53.806] has:plugins are available
I1205 22:59:53.869] Successful
I1205 22:59:53.870] message:
I1205 22:59:53.870] error: unable to read directory "test/fixtures/pkg/kubectl/plugins/empty" in your PATH: open test/fixtures/pkg/kubectl/plugins/empty: no such file or directory
I1205 22:59:53.870] error: unable to find any kubectl plugins in your PATH
I1205 22:59:53.870] has:unable to find any kubectl plugins in your PATH
I1205 22:59:53.930] Successful
I1205 22:59:53.931] message:I am plugin foo
I1205 22:59:53.931] has:plugin foo
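Plugin discovery here is purely PATH-based: kubectl plugin list scans $PATH for kubectl-* executables, warning when a plugin overwrites a builtin (kubectl-version) or is shadowed by an earlier PATH entry, and kubectl foo execs the first kubectl-foo it finds. A sketch using the fixture directories named above:

  PATH=test/fixtures/pkg/kubectl/plugins:test/fixtures/pkg/kubectl/plugins/foo:$PATH kubectl plugin list
  PATH=test/fixtures/pkg/kubectl/plugins:$PATH kubectl foo    # execs kubectl-foo, printing "I am plugin foo"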
I1205 22:59:53.995] Successful
I1205 22:59:53.996] message:Client Version: version.Info{Major:"1", Minor:"14+", GitVersion:"v1.14.0-alpha.0.854+809eaa70251197", GitCommit:"809eaa7025119712ca82c6f4dfa73a4a544ad7ec", GitTreeState:"clean", BuildDate:"2018-12-05T22:53:32Z", GoVersion:"go1.11.1", Compiler:"gc", Platform:"linux/amd64"}
... skipping 9 lines ...
I1205 22:59:54.059] 
I1205 22:59:54.060] +++ Running case: test-cmd.run_impersonation_tests 
I1205 22:59:54.063] +++ working dir: /go/src/k8s.io/kubernetes
I1205 22:59:54.065] +++ command: run_impersonation_tests
I1205 22:59:54.073] +++ [1205 22:59:54] Testing impersonation
I1205 22:59:54.135] Successful
I1205 22:59:54.135] message:error: requesting groups or user-extra for  without impersonating a user
I1205 22:59:54.135] has:without impersonating a user
I1205 22:59:54.267] certificatesigningrequest.certificates.k8s.io/foo created
I1205 22:59:54.349] authorization.sh:68: Successful get csr/foo {{.spec.username}}: user1
I1205 22:59:54.427] authorization.sh:69: Successful get csr/foo {{range .spec.groups}}{{.}}{{end}}: system:authenticated
I1205 22:59:54.499] certificatesigningrequest.certificates.k8s.io "foo" deleted
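The impersonation pair above: group or user-extra impersonation is only valid alongside --as, and a request made --as=user1 is attributed to that user, which is why the CSR's spec.username comes back as user1. A sketch; csr.yaml is a stand-in for whatever manifest the test applies:

  kubectl get pods --as-group=system:masters   # rejected: groups requested without impersonating a user
  kubectl create -f csr.yaml --as=user1        # created object records spec.username: user1
  kubectl get csr foo -o jsonpath='{.spec.username}'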
I1205 22:59:54.639] certificatesigningrequest.certificates.k8s.io/foo created
... skipping 33 lines ...
W1205 22:59:55.212] I1205 22:59:55.074724   52321 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W1205 22:59:55.212] I1205 22:59:55.074739   52321 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W1205 22:59:55.212] I1205 22:59:55.074843   52321 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W1205 22:59:55.212] I1205 22:59:55.074852   52321 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W1205 22:59:55.212] I1205 22:59:55.074898   52321 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W1205 22:59:55.212] I1205 22:59:55.074910   52321 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W1205 22:59:55.212] W1205 22:59:55.074984   52321 clientconn.go:1304] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1205 22:59:55.213] W1205 22:59:55.075187   52321 clientconn.go:1304] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1205 22:59:55.213] W1205 22:59:55.075229   52321 clientconn.go:1304] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1205 22:59:55.213] I1205 22:59:55.075581   52321 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W1205 22:59:55.213] I1205 22:59:55.075600   52321 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W1205 22:59:55.213] I1205 22:59:55.075625   52321 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W1205 22:59:55.213] I1205 22:59:55.075634   52321 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W1205 22:59:55.214] I1205 22:59:55.075658   52321 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W1205 22:59:55.214] I1205 22:59:55.075664   52321 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
... skipping 8 lines ...
W1205 22:59:55.215] I1205 22:59:55.075906   52321 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W1205 22:59:55.215] I1205 22:59:55.075915   52321 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W1205 22:59:55.215] I1205 22:59:55.075946   52321 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W1205 22:59:55.215] I1205 22:59:55.075981   52321 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W1205 22:59:55.215] I1205 22:59:55.075990   52321 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W1205 22:59:55.215] I1205 22:59:55.075917   52321 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W1205 22:59:55.216] W1205 22:59:55.074333   52321 clientconn.go:1304] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1205 22:59:55.216] I1205 22:59:55.076385   52321 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W1205 22:59:55.216] I1205 22:59:55.076410   52321 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W1205 22:59:55.216] I1205 22:59:55.076446   52321 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W1205 22:59:55.216] I1205 22:59:55.076455   52321 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W1205 22:59:55.216] I1205 22:59:55.076490   52321 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W1205 22:59:55.216] I1205 22:59:55.076500   52321 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
... skipping 10 lines ...
W1205 22:59:55.218] I1205 22:59:55.076633   52321 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W1205 22:59:55.218] I1205 22:59:55.076643   52321 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W1205 22:59:55.218] I1205 22:59:55.076714   52321 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W1205 22:59:55.218] I1205 22:59:55.076722   52321 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W1205 22:59:55.218] I1205 22:59:55.076726   52321 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W1205 22:59:55.218] I1205 22:59:55.076746   52321 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W1205 22:59:55.218] W1205 22:59:55.076761   52321 clientconn.go:1304] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1205 22:59:55.219] I1205 22:59:55.076781   52321 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W1205 22:59:55.219] I1205 22:59:55.076792   52321 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W1205 22:59:55.219] I1205 22:59:55.076825   52321 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W1205 22:59:55.219] I1205 22:59:55.076837   52321 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W1205 22:59:55.219] W1205 22:59:55.076762   52321 clientconn.go:1304] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1205 22:59:55.219] I1205 22:59:55.076868   52321 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W1205 22:59:55.220] I1205 22:59:55.076877   52321 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W1205 22:59:55.220] I1205 22:59:55.076921   52321 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W1205 22:59:55.220] I1205 22:59:55.076928   52321 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W1205 22:59:55.220] I1205 22:59:55.077062   52321 picker_wrapper.go:218] blockingPicker: the picked transport is not ready, loop back to repick
W1205 22:59:55.220] W1205 22:59:55.077066   52321 clientconn.go:1304] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1205 22:59:55.220] I1205 22:59:55.077093   52321 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W1205 22:59:55.220] I1205 22:59:55.077124   52321 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W1205 22:59:55.220] I1205 22:59:55.077139   52321 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W1205 22:59:55.221] I1205 22:59:55.077150   52321 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W1205 22:59:55.221] I1205 22:59:55.077170   52321 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W1205 22:59:55.221] I1205 22:59:55.077184   52321 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
... skipping 10 lines ...
W1205 22:59:55.222] I1205 22:59:55.077386   52321 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W1205 22:59:55.222] I1205 22:59:55.077398   52321 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W1205 22:59:55.222] I1205 22:59:55.077478   52321 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W1205 22:59:55.223] I1205 22:59:55.077491   52321 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W1205 22:59:55.223] I1205 22:59:55.077504   52321 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W1205 22:59:55.223] I1205 22:59:55.077516   52321 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W1205 22:59:55.223] W1205 22:59:55.077530   52321 clientconn.go:1304] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1205 22:59:55.223] W1205 22:59:55.077543   52321 clientconn.go:1304] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1205 22:59:55.223] I1205 22:59:55.074315   52321 secure_serving.go:156] Stopped listening on 127.0.0.1:6443
W1205 22:59:55.223] I1205 22:59:55.077678   52321 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
... skipping 6 lines ...
W1205 22:59:55.224] I1205 22:59:55.077758   52321 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W1205 22:59:55.225] W1205 22:59:55.077784   52321 clientconn.go:1304] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
... skipping 33 lines ...
W1205 22:59:55.231] W1205 22:59:55.078402   52321 clientconn.go:1304] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1205 22:59:55.231] I1205 22:59:55.079399   52321 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W1205 22:59:55.231] I1205 22:59:55.078439   52321 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W1205 22:59:55.232] I1205 22:59:55.079425   52321 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W1205 22:59:55.232] I1205 22:59:55.078036   52321 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W1205 22:59:55.232] W1205 22:59:55.078525   52321 clientconn.go:1304] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1205 22:59:55.232] I1205 22:59:55.078581   52321 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W1205 22:59:55.232] I1205 22:59:55.079515   52321 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W1205 22:59:55.232] I1205 22:59:55.078606   52321 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W1205 22:59:55.232] I1205 22:59:55.079549   52321 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W1205 22:59:55.233] W1205 22:59:55.078663   52321 clientconn.go:1304] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1205 22:59:55.233] I1205 22:59:55.078806   52321 picker_wrapper.go:218] blockingPicker: the picked transport is not ready, loop back to repick
W1205 22:59:55.233] I1205 22:59:55.078853   52321 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
... skipping 26 lines ...
W1205 22:59:55.236] I1205 22:59:55.079872   52321 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W1205 22:59:55.236] W1205 22:59:55.079171   52321 clientconn.go:1304] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
... skipping 10 lines ...
W1205 22:59:55.239] W1205 22:59:55.079283   52321 clientconn.go:1304] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1205 22:59:55.239] E1205 22:59:55.079290   52321 controller.go:172] Get https://127.0.0.1:6443/api/v1/namespaces/default/endpoints/kubernetes: dial tcp 127.0.0.1:6443: connect: connection refused
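The clientconn.go and balancer_v1_wrapper.go storm above is teardown noise, not the failure itself: once the per-test etcd at 127.0.0.1:2379 is stopped, every open grpc-go connection inside the apiserver under test keeps retrying until its ClientConn is closed, and the final controller.go error is the same story for the apiserver's own 127.0.0.1:6443 endpoint. A minimal sketch of that retry behavior, assuming nothing beyond stock grpc-go (this is standalone illustration, not the test's own code):

    package main

    import (
        "context"
        "log"
        "time"

        "google.golang.org/grpc"
    )

    func main() {
        // Dial an address with no listener, as the apiserver's etcd client
        // effectively does after etcd is stopped. grpc-go retries in the
        // background, emitting "transport: Error while dialing ... connection
        // refused" warnings (when its logger is wired up) until the
        // ClientConn is closed or the dial context expires.
        ctx, cancel := context.WithTimeout(context.Background(), 3*time.Second)
        defer cancel()
        conn, err := grpc.DialContext(ctx, "127.0.0.1:2379",
            grpc.WithInsecure(), // the test etcd speaks plaintext
            grpc.WithBlock(),    // surface the failure instead of retrying forever
        )
        if err != nil {
            log.Printf("gave up dialing: %v", err) // context deadline exceeded
            return
        }
        defer conn.Close()
    }

None of these lines is fatal on its own; they mostly serve as a timestamp marker for when the previous test's apiserver shut down.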
W1205 22:59:55.239] + make test-integration
I1205 22:59:58.988] +++ [1205 22:59:58] Checking etcd is on PATH
I1205 22:59:58.989] /workspace/kubernetes/third_party/etcd/etcd
I1205 22:59:58.992] +++ [1205 22:59:58] Starting etcd instance
I1205 22:59:59.030] etcd --advertise-client-urls http://127.0.0.1:2379 --data-dir /tmp/tmp.kGuRbbtjD0 --listen-client-urls http://127.0.0.1:2379 --debug > "/workspace/artifacts/etcd.c21072b242f3.root.log.DEBUG.20181205-225959.94212" 2>/dev/null
I1205 22:59:59.031] Waiting for etcd to come up.
I1205 22:59:59.310] +++ [1205 22:59:59] On try 2, etcd: : http://127.0.0.1:2379
I1205 22:59:59.319] {"action":"set","node":{"key":"/_test","value":"","modifiedIndex":4,"createdIndex":4}}
I1205 22:59:59.322] +++ [1205 22:59:59] Running integration test cases
I1205 23:00:03.274] Running tests for APIVersion: v1,admissionregistration.k8s.io/v1alpha1,admissionregistration.k8s.io/v1beta1,admission.k8s.io/v1beta1,apps/v1beta1,apps/v1beta2,apps/v1,auditregistration.k8s.io/v1alpha1,authentication.k8s.io/v1,authentication.k8s.io/v1beta1,authorization.k8s.io/v1,authorization.k8s.io/v1beta1,autoscaling/v1,autoscaling/v2beta1,autoscaling/v2beta2,batch/v1,batch/v1beta1,batch/v2alpha1,certificates.k8s.io/v1beta1,coordination.k8s.io/v1beta1,extensions/v1beta1,events.k8s.io/v1beta1,imagepolicy.k8s.io/v1alpha1,networking.k8s.io/v1,policy/v1beta1,rbac.authorization.k8s.io/v1,rbac.authorization.k8s.io/v1beta1,rbac.authorization.k8s.io/v1alpha1,scheduling.k8s.io/v1alpha1,scheduling.k8s.io/v1beta1,settings.k8s.io/v1alpha1,storage.k8s.io/v1beta1,storage.k8s.io/v1,storage.k8s.io/v1alpha1,
I1205 23:00:03.307] +++ [1205 23:00:03] Running tests without code coverage
I1205 23:03:30.957] ok  	k8s.io/kubernetes/test/integration/apimachinery	168.612s
I1205 23:03:30.958] FAIL	k8s.io/kubernetes/test/integration/apiserver	37.574s
I1205 23:03:30.958] [restful] 2018/12/05 23:02:22 log.go:33: [restful/swagger] listing is available at https://127.0.0.1:38285/swaggerapi
I1205 23:03:30.959] [restful] 2018/12/05 23:02:22 log.go:33: [restful/swagger] https://127.0.0.1:38285/swaggerui/ is mapped to folder /swagger-ui/
I1205 23:03:30.959] [restful] 2018/12/05 23:02:24 log.go:33: [restful/swagger] listing is available at https://127.0.0.1:38285/swaggerapi
I1205 23:03:30.959] [restful] 2018/12/05 23:02:24 log.go:33: [restful/swagger] https://127.0.0.1:38285/swaggerui/ is mapped to folder /swagger-ui/
I1205 23:03:30.960] ok  	k8s.io/kubernetes/test/integration/auth	94.344s
I1205 23:03:30.960] [restful] 2018/12/05 23:01:17 log.go:33: [restful/swagger] listing is available at https://127.0.0.1:42781/swaggerapi
... skipping 229 lines ...
I1205 23:12:22.679] [restful] 2018/12/05 23:05:35 log.go:33: [restful/swagger] https://127.0.0.1:36603/swaggerui/ is mapped to folder /swagger-ui/
I1205 23:12:22.679] ok  	k8s.io/kubernetes/test/integration/tls	12.478s
I1205 23:12:22.679] ok  	k8s.io/kubernetes/test/integration/ttlcontroller	10.898s
I1205 23:12:22.679] ok  	k8s.io/kubernetes/test/integration/volume	91.697s
I1205 23:12:22.679] ok  	k8s.io/kubernetes/vendor/k8s.io/apiextensions-apiserver/test/integration	143.398s
I1205 23:12:24.071] +++ [1205 23:12:24] Saved JUnit XML test report to /workspace/artifacts/junit_f5a444384056ebac4f2929ce7b7920ea9733ca19_20181205-230003.xml
I1205 23:12:24.074] Makefile:184: recipe for target 'test' failed
I1205 23:12:24.083] +++ [1205 23:12:24] Cleaning up etcd
W1205 23:12:24.183] make[1]: *** [test] Error 1
W1205 23:12:24.184] !!! [1205 23:12:24] Call tree:
W1205 23:12:24.184] !!! [1205 23:12:24]  1: hack/make-rules/test-integration.sh:105 runTests(...)
W1205 23:12:24.263] make: *** [test-integration] Error 1
I1205 23:12:24.363] +++ [1205 23:12:24] Integration test cleanup complete
I1205 23:12:24.364] Makefile:203: recipe for target 'test-integration' failed
W1205 23:12:25.312] Traceback (most recent call last):
W1205 23:12:25.312]   File "/workspace/./test-infra/jenkins/../scenarios/kubernetes_verify.py", line 167, in <module>
W1205 23:12:25.313]     main(ARGS.branch, ARGS.script, ARGS.force, ARGS.prow)
W1205 23:12:25.313]   File "/workspace/./test-infra/jenkins/../scenarios/kubernetes_verify.py", line 136, in main
W1205 23:12:25.313]     check(*cmd)
W1205 23:12:25.313]   File "/workspace/./test-infra/jenkins/../scenarios/kubernetes_verify.py", line 48, in check
W1205 23:12:25.313]     subprocess.check_call(cmd)
W1205 23:12:25.313]   File "/usr/lib/python2.7/subprocess.py", line 540, in check_call
W1205 23:12:25.338]     raise CalledProcessError(retcode, cmd)
W1205 23:12:25.340] subprocess.CalledProcessError: Command '('docker', 'run', '--rm=true', '--privileged=true', '-v', '/var/run/docker.sock:/var/run/docker.sock', '-v', '/etc/localtime:/etc/localtime:ro', '-v', '/workspace/k8s.io/kubernetes:/go/src/k8s.io/kubernetes', '-v', '/workspace/k8s.io/:/workspace/k8s.io/', '-v', '/workspace/_artifacts:/workspace/artifacts', '-e', 'KUBE_FORCE_VERIFY_CHECKS=y', '-e', 'KUBE_VERIFY_GIT_BRANCH=master', '-e', 'REPO_DIR=/workspace/k8s.io/kubernetes', '--tmpfs', '/tmp:exec,mode=1777', 'gcr.io/k8s-testimages/kubekins-test:1.13-v20181105-ceed87206', 'bash', '-c', 'cd kubernetes && ./hack/jenkins/test-dockerized.sh')' returned non-zero exit status 2
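The traceback is the expected shape of a scenario failure: check() in kubernetes_verify.py is a thin wrapper over subprocess.check_call, so the docker run above exiting 2 (make's exit status) surfaces as CalledProcessError, which in turn makes the outer runner exit 1. A hedged Go analogue of that wrapper, for illustration only (the real code is the Python shown in the traceback):

    package main

    import (
        "log"
        "os/exec"
    )

    // check mirrors the scenario's check(): run a command and fail hard on
    // any non-zero exit, the way subprocess.check_call raises
    // CalledProcessError.
    func check(name string, args ...string) {
        cmd := exec.Command(name, args...)
        if err := cmd.Run(); err != nil {
            if exitErr, ok := err.(*exec.ExitError); ok {
                log.Fatalf("%s exited with status %d", name, exitErr.ExitCode())
            }
            log.Fatalf("%s failed to start: %v", name, err)
        }
    }

    func main() {
        check("bash", "-c", "exit 2") // stand-in for the docker run invocation above
    }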
E1205 23:12:25.345] Command failed
I1205 23:12:25.345] process 537 exited with code 1 after 24.2m
E1205 23:12:25.345] FAIL: ci-kubernetes-integration-master
I1205 23:12:25.346] Call:  gcloud auth activate-service-account --key-file=/etc/service-account/service-account.json
W1205 23:12:25.788] Activated service account credentials for: [pr-kubekins@kubernetes-jenkins-pull.iam.gserviceaccount.com]
I1205 23:12:25.839] process 123825 exited with code 0 after 0.0m
I1205 23:12:25.839] Call:  gcloud config get-value account
I1205 23:12:26.089] process 123838 exited with code 0 after 0.0m
I1205 23:12:26.089] Will upload results to gs://kubernetes-jenkins/logs using pr-kubekins@kubernetes-jenkins-pull.iam.gserviceaccount.com
I1205 23:12:26.089] Upload result and artifacts...
I1205 23:12:26.090] Gubernator results at https://gubernator.k8s.io/build/kubernetes-jenkins/logs/ci-kubernetes-integration-master/7142
I1205 23:12:26.090] Call:  gsutil ls gs://kubernetes-jenkins/logs/ci-kubernetes-integration-master/7142/artifacts
W1205 23:12:27.848] CommandException: One or more URLs matched no objects.
E1205 23:12:28.031] Command failed
I1205 23:12:28.031] process 123851 exited with code 1 after 0.0m
W1205 23:12:28.031] Remote dir gs://kubernetes-jenkins/logs/ci-kubernetes-integration-master/7142/artifacts not exist yet
I1205 23:12:28.031] Call:  gsutil -m -q -o GSUtil:use_magicfile=True cp -r -c -z log,txt,xml /workspace/_artifacts gs://kubernetes-jenkins/logs/ci-kubernetes-integration-master/7142/artifacts
I1205 23:12:31.892] process 123996 exited with code 0 after 0.1m
W1205 23:12:31.892] metadata path /workspace/_artifacts/metadata.json does not exist
W1205 23:12:31.893] metadata not found or invalid, init with empty metadata
... skipping 15 lines ...